Re: Bruno's argument

From: Brent Meeker <meekerdb.domain.name.hidden>
Date: Wed, 02 Aug 2006 10:53:32 -0700

Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>
>
>>Consider a computer which is doing something (whether it is dreaming or
>>musing or just running is the point in question). If there is no
>>interaction between what it's running and the rest of the world I'd say
>>it's not conscious. It doesn't necessarily need an external observer
>>though. To invoke an external observer would require that we already
>>knew how to distinguish an observer from a non-observer. This just
>>pushes the problem away a step. One could as well claim that the walls
>>of the room which are struck by the photons from the screen constitute
>>an observer - under a suitable mapping of wall states. The computer
>>could, like a Mars rover, act directly on the rest of the world.
>
>
> The idea that we can only be conscious when interacting with the environment
> is certainly worth considering. After all, consciousness evolved in order to help
> the organism deal with its environment, and it may be wrong to just assume
> without further evidence that consciousness continues if all interaction with the
> environment ceases. Maybe even those activities which at first glance seem to
> involve consciousness in the absence of environmental interaction actually rely
> on a trickle of sensory input: for example, maybe dreaming is dependent on
> proprioceptive feedback from eye movements, which is why we only dream
> during REM sleep, and maybe general anaesthetics actually work by eliminating
> all sensory input rather than by a direct effect on the cortex. But even if all this
> is true, we could still imagine stimulating a brain which has all its sensory inputs
> removed so that the pattern of neural activity is exactly the same as it would
> have been had it arisen in the usual way. Would you say that the artificially
> stimulated brain is not conscious, even though everything up to and including
> the peripheral nerves is physically identical to and goes through the same
> physical processes as the normal brain?

No. I already noted that we can't insist that interaction with the
environment is continuous. Maybe "potential interaction" would be
appropriate. But I note that even in your example you contemplate
"stimulating" the brain. I'm just trying to take what I consider an
operational definition and abstract it to the kind of
mathematical/philosophical definition that can be applied to questions
about rocks thinking.
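
To make the "suitable mapping" worry concrete, here is a toy sketch in
Python (purely illustrative; the wall labels and the countdown example
are hypothetical, not anything anyone in this thread has proposed).
Given any sequence of distinct physical states and any computation
trace of the same length, a bare lookup table makes the one "implement"
the other. The mapping does all the work, not the physics, which is
why I want an operational criterion.

# Toy illustration of the "suitable mapping" problem.
def trivial_mapping(physical_states, computation_trace):
    """Pair each successive physical state with a step of the trace."""
    return dict(zip(physical_states, computation_trace))

# The wall passes through five distinct states; the labels stand in
# for whatever microphysical description you like.
wall_states = ["w0", "w1", "w2", "w3", "w4"]

# A desired "computation": counting down from 4 to 0.
countdown = [4, 3, 2, 1, 0]

mapping = trivial_mapping(wall_states, countdown)
print(mapping)   # {'w0': 4, 'w1': 3, 'w2': 2, 'w3': 1, 'w4': 0}
# Under this mapping the wall "computes" the countdown; under a
# different mapping the same wall states "compute" something else.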

At the experimental level, I recall that in the late '60s, when sensory
deprivation experiments were all the rage, there was a report that after
an hour or so in a sensory deprivation tank a person's mind would end up
in a loop.

Incidentally, the attribute in question seems to shift among
"conscious", "intelligent", and "computing something". I don't think
those are all exactly the same. Certainly computing and intelligence
don't necessarily entail consciousness. And consciousness itself admits
of categories.

Brent Meeker

Received on Wed Aug 02 2006 - 13:57:01 PDT
