Re: Implementation

From: Russell Standish <R.Standish.domain.name.hidden>
Date: Mon, 26 Jul 1999 11:58:30 +1000 (EST)

>
> Let me offer an alternative approach to the Maudlin/Marchal implementation
> question which is consistent with computationalism but avoids some of
> the difficulties.
>
> Computationalism says that implementation of a certain class of algorithms
> is necessary and sufficient for consciousness. Maudlin presents an
> experiment where a seemingly minor and (he argues) irrelevant change makes
> the difference between whether a conscious computation is implemented
> or not.
>
> However, to conclude that Maudlin's change (adding an inert block) makes
> consciousness go away (even assuming computationalism) is a fallacy.
> We agree that the initial computation (sans block) is conscious.
> We agree that adding the block changes the computation (by changing
> counterfactual behavior). But we cannot conclude from this that the
> resulting computation is not conscious.
>
> The new configuration, with the block, implements a different computation
> than without (if you take counterfactuals into consideration). But have
> we proven that this new computation is not conscious? No! All we have
> proven is that this is different from the computation we started with.
>
> But more than one computation can be conscious, obviously. It is
> conceivable that the new computation, although different, is conscious
> as well. This is a possible escape from Maudlin's argument.

I disagree with this. It is clear that Olympia without the Klaras is not conscious.
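
To make that concrete: stripped of its Klaras, Olympia merely replays a
pre-recorded trace, so the computation it implements is no richer than a
table lookup. A minimal sketch of the contrast (Python, all names and toy
dynamics purely illustrative, not anything from Maudlin's paper):

    # Toy contrast between a genuine state machine and an Olympia-style
    # replay of one pre-recorded run. Illustrative only.

    def genuine_machine(state, inp):
        """Next state depends on the input, so counterfactual inputs
        would produce counterfactual behaviour."""
        return (2 * state + inp) % 97   # arbitrary toy dynamics

    # Record the trace of one particular run of the genuine machine.
    actual_inputs = [1, 0, 1, 1, 0]
    trace, s = [], 0
    for i in actual_inputs:
        s = genuine_machine(s, i)
        trace.append(s)

    def olympia_replay(step, inp=None):
        """Olympia without the Klaras: just replay the recorded trace.
        The input is ignored, so every counterfactual input yields
        exactly the behaviour of the actual run."""
        return trace[step]

    # Step by step the two are indistinguishable on the actual run...
    assert all(olympia_replay(t) == trace[t] for t in range(len(trace)))
    # ...but only the genuine machine responds to a different input.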

>
> Let me further offer a rationale for why this might be plausible.
>
> Consider a human brain which is listening to music with its eyes closed.
> It will not be having very much activity in the visual cortex. Now we,
> acting as a demon, come along and shut off much of the visual cortex.
> We disturb the neurons so that they will not behave properly. In fact,
> we set it up so that the whole brain will shut down if a certain level
> of activity is reached in the visual cortex.
>
> Now as long as the brain keeps its eyes closed and keeps listening to
> music, it will not notice these changes. But if it opens its eyes, it
> will experience a drastic change, including loss of consciousness.
>
> The question is whether, as long as the brain keeps its eyes closed, the
> brain will continue to be conscious even though we have made this change.
> I think the answer is clearly, or at least very plausibly, yes, it will
> remain conscious. All the neural activity will stay the same as it
> would have if we had not intervened, and there is no reason to expect
> consciousness to evaporate just because we made a change which *could*
> terminate consciousness if certain events occur.
>
> If you agree with this, I would argue that getting to Maudlin's
> situation (with the block) is now just a matter of degree. In the
> listening-to-music case, the change we made was relatively minor, and we
> left the brain with a considerable amount of freedom of action before
> it would stumble onto a counterfactual state which would terminate it.
> In Maudlin's case, the brain is much more restricted. It has only a
> minutely narrow path to follow; straying from it drops the brain into a
> counterfactual state which will make it stop. As long as it stays on
> that path, doing
> exactly what we predicted, it will behave normally and will not be "aware"
> that danger lurks at every step.
>
> We can imagine a range of situations where we provide greater or
> lesser amounts of restrictions in terms of counterfactuals. If minor
> restrictions leave the brain conscious, it seems plausible that major
> restrictions would do so as well.
>
> If we agree with this argument, we can have both supervenience and
> computationalism, it seems to me. We agree that Maudlin's machine changes
> the program which is instantiated, but we claim that the new program
> is also conscious.
>
> Hal
>
>

Nice try, but I think a brain in a resting state listening to music is
so much more complex in its processing of "counterfactuals" than the
Olympia example. There must be a dividing line somewhere between the
two examples - a point where the nonconscious entity crosses the
threshold into consciousness.
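
One way to picture where that dividing line might sit is to parameterise
how much counterfactual structure is actually supported. In the sketch
below (Python, names and toy dynamics purely illustrative), a machine
computes properly only while its input stays inside a permitted set;
shrinking that set slides it from Hal's music-listening brain (most
counterfactuals still handled) towards Olympia (only the recorded run
handled):

    # Toy machine whose counterfactual support can be dialled down, from
    # handling (almost) all inputs to handling only the recorded run.

    def transition(state, inp):
        """Stand-in for the 'real' computation's transition function."""
        return (3 * state + inp) % 101

    def restricted_machine(allowed_inputs):
        """Return a step function that computes properly only while the
        input stays inside allowed_inputs; otherwise it 'shuts down'
        (returns None), like the doctored visual cortex or the path
        blocked by Maudlin's inert block."""
        def step(state, inp):
            if inp not in allowed_inputs:
                return None          # counterfactual path: machine halts
            return transition(state, inp)
        return step

    # Hal's case: a mild restriction, most counterfactuals still handled.
    music_brain = restricted_machine({0, 1, 2, 3})
    # Maudlin's case: only the single recorded input is handled.
    olympia_like = restricted_machine({1})

    s = 5
    print(music_brain(s, 1), olympia_like(s, 1))   # identical on the actual run
    print(music_brain(s, 2), olympia_like(s, 2))   # 17 vs. None on a counterfactual

The question is whether anything along that dial marks a principled
boundary, or whether the restriction remains purely a matter of degree.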

                                Cheers

----------------------------------------------------------------------------
Dr. Russell Standish                     Director
High Performance Computing Support Unit,
University of NSW                        Phone 9385 6967
Sydney 2052                              Fax   9385 6965
Australia                                R.Standish.domain.name.hidden
Room 2075, Red Centre                    http://parallel.hpc.unsw.edu.au/rks
----------------------------------------------------------------------------
Received on Sun Jul 25 1999 - 19:09:47 PDT
