Stathis Papaioannou wrote:
> Peter Jones writes:
>
> > > > With physical supervenience, it is possible for the same person to
> > > > supervene on multiple physical objects. What is disallowed is multiple
> > > > persons to supervene on the same physical object.
> > >
> > > That is what is usually understood, but there is no logical reason why
> > > the relationship between the physical and the mental cannot be
> > > one-to-many, in much the same way as a written message can have
> > > several meanings depending on its interpretation.
> >
> > There is a reason: multiple meanings depend on external observers
> > and interpretations. But who observes the multiverse?
>
> I'm not sure how the multiverse comes into the discussion, but you have
> made the point several times that a computation depends on an observer
No, I haven't! I have tried to follow through the consequences of
assuming it must.
It seems to me that some sort of absurdity or contradiction ensues.
> for its meaning. I agree, but *if* computations can be conscious (remember,
> this is an assumption) then in that special case an external observer is not
> needed.
Why not? (Well, I would be quite happy if a conscious
computation had some inherent structural property --
I want to find out why *you* would think it doesn't).
> In fact, that is as good a definition of consciousness as any: it is
> that aspect of an entity that cannot be captured by an external observer,
> but only experienced by the entity itself.
> Once we learn every observable
> fact about stars we know all about stars, but if we learn every observable
> fact about bats, we still don't know what it is like to be a bat.
It is quite possible that "what it is like to be a bat" is subjective,
whilst "bats are conscious" is objective.
> To put it
> differently, it would not add anything to our knowledge of stars if we could
> become a star (assuming stars are not conscious), but it would add something
> to our knowledge of bats if we could become or perhaps mind-meld with a
> bat.
>
> While it is possible, as Brent Meeker has argued, that environmental input is
> necessary to maintain consciousness in a person, that would just be a technical
> detail about brains, like the requirement for oxygen. The neurons in a brain could
> fire in the same pattern as they do naturally but as a result of stimulation by
> electrodes, or stimulation by self-exciting neurons grafted onto the brain. You
> don't even need to accept the validity of computationalism or functionalism to
> make that work.
>
> Stathis Papaioannou