Flammarion wrote:
>
>
> On 28 Aug, 02:27, Brent Meeker <meeke....domain.name.hidden> wrote:
>> Flammarion wrote:
>>
>>> On 21 Aug, 21:01, Brent Meeker <meeke....domain.name.hidden> wrote:
>>>> Flammarion wrote:
>>>>> Do you think that if you scanned my brain right down to the atomic
>>>>> level, you still wouldn't have captured all the information?
>>>> That's an interesting question and one that I think relates to the
>>>> importance of context. A scan of your brain would capture all the
>>>> information in the Shannon/Boltzmann sense, i.e. it would determine which
>>>> of the possible configurations and processes were realized. However,
>>>> those concerned about the "hard problem", will point out that this
>>>> misses the fact that the information represents or "means" something.
>>>> To know the meaning of the information would require knowledge of the
>>>> world in which the brain acts and perceives, including a lot of
>>>> evolutionary history. Imagine scanning the brain of an alien found in a
>>>> crash at Roswell. Without knowledge of how he acts and the evolutionary
>>>> history of his species it would be essentially impossible to guess the
>>>> meaning of the patterns in his brain. My point is that consciousness or
>>>> cognition is not just computation, but computation with meaning, i.e.
>>>> computation within a certain context of action.
>>> But figuring out stored sensory information should be about the
>>> easiest part of the task. If you can trace a pathway from a red
>>> sensor to a storage unit, the information in the unit has to mean
>>> "this is red".
>>> What is hard about the Hard Problem is *not* interpretation or
>>> context.
>> I'm not so sure about that - maybe "more is different" applies. "This
>> is red" is really a summary, an abstraction, of what the red sensor
>> firing means to the alien. To a human it's the color of blood and has
>> connotations of violence, excitement, danger. To an alien with green
>> blood... from a planet with red seas...? If you knew all the
>> associations built up over a lifetime of memories and many lifetimes
>> of evolution, maybe the 'hard problem' would dissolve.
>
> Not at all. That theory predicts that some entirely novel sensation --
> one which has not built up any associations -- should be easy to
> describe. But it isn't. And in fact describing associations is a lot
> easier than describing the core phenomenal feel.
Does "that theory" refer to more-is-different? ISTM that
more-is-different implies exactly what you point out. It's easier to
describe a sensation that has lots of associations because you can describe it
in terms of those associations, whereas a completely novel sensation is
impossible to describe.
Brent