Stathis Papaioannou wrote:
> Brent Meeker writes:
>
>
>>>>Self-awareness is awareness of some specific aspect of a construct called "myself".
>>>>It is not strictly reflexive awareness of being aware of being aware... So in
>>>>the abstract computation it is just one part of a computation standing in some relation
>>>>we identify as "awareness" to some other part of the computation. I think
>>>>it is a matter of constructing a narrative for memory in which "I" is just another
>>>>player.
>>>
>>>
>>>I don't think "self-awareness" captures the essence of consciousness.
>>
>>Neither do I; I was just responding to you noting that self-awareness is
>>"observer-relative". The "observer" is really just a construct forced on us by
>>grammar which demands that an action be done by someone or something. We could more
>>accurately say there is observation.
>
>
> OK.
>
>
>>>We commonly think
>>>that consciousness is associated with intelligence, which is perhaps why it is often stated
>>>that a recording cannot be conscious, since a recording will not adapt to its environment in
>>>the manner we normally expect of intelligent agents. However, consider the experience of
>>>pain when you put your hand over a flame. There is certainly intelligent behaviour associated
>>>with this experience - learning to avoid it - but there is nothing "intelligent" about the raw
>>>experience of pain itself. It simply seems that when certain neurons in the brain fire, you
>>>experience a pain, as reliably and as stupidly as flicking a switch turns on a light. When an
>>>infant or an animal screams in agony it is not engaging in self-reflection, and for that matter
>>>neither is a philosopher: acute pain usually displaces every other concurrent conscious
>>>experience. A being that has a recording of a painful experience played over and over into the
>>>relevant neural pathways may not be able to interact meaningfully with its environment, but it
>>>will be hellishly conscious nonetheless.
>>>
>>>Stathis Papaioannou
>>
>>I could make a robot that, having suitable thermocouples, would quickly withdraw its
>>hand from a fire but not be conscious of it. Even if I provide the robot with
>>"feelings", i.e. judgements about good/bad/pain/pleasure, I'm not sure it would be
>>conscious. But if I provide it with "attention" and memory, so that it noted the
>>painful event as important and necessary to remember because of its strong negative
>>affect, then I think it would be conscious.
>
>
> It's interesting that people actually withdraw their hand from the fire *before* they experience
> the pain. The withdrawal is a reflex, presumably evolved in organisms with the most primitive
> central nervous systems, while the pain seems to be there as an afterthought to teach us a
> lesson so we won't do it again. Thus, from considerations of evolutionary utility, consciousness
> does indeed seem to be a side-effect of memory and learning.
Even more curiously, volitional action also occurs before one is aware of it. Are you
familiar with the experiments of Benjamin Libet and Grey Walter?
>
> I also think that this is an argument against zombies. If it were possible for an organism to
> behave just like a conscious being, but actually be unconscious, then why would consciousness
> have evolved?
An interesting point, but it's hard to give any answer before pinning down what we mean
by consciousness. Bruno, Julian Jaynes, and Daniel Dennett, for example, all have
explanations, but they explain somewhat different consciousnesses, or at least
different aspects.
Brent Meeker