Re: The implementation problem

From: Gilles HENRI <Gilles.Henri.domain.name.hidden>
Date: Thu, 21 Jan 1999 09:01:02 +0100

>On Tue, 19 Jan 1999, Wei Dai wrote:
>> A definition of when a physical system contains some information may not
>> be necessary. If the measure of a conscious experience is related to the
>> measure of the associated state information, then all we need is a measure
>> on the set of all possible states. We can simply say that the universe is
>> this measure, and any perceived physical systems are just illusions
>> produced by our own minds. Similarly, if the measure of a conscious
>> experience is related to the measure of the associated computation, then
>> it would suffice to have a measure on the set of all possible
>> computations.
>
> I don't think that avoids the problem. Suppose you start off with
>some kind of uniform measure on the space of computations. You then have
>to consider that computation A can implement computation B. To find the
>real measure you would have to take such secondary implementations into
>account, perhaps along the line I suggested in one of my first posts to
>this list. It is essentially the same problem to determine when one
>computation implements another as it is to determine when a physical
>system (which is like the first computation) implements one.
> You could arbitrarily rule out secondary implementations, but
>then you'd be stuck with a trivial uniform measure, with no mechanism for
>Darwinian natural selection.
>

I am personally very reluctant to interpret consciousness in terms of
"intrinsic" information (information that could be objectively defined and
measured). Take a typical state of the brain, or a succession of states
realizing an implementation of some computation, for example the one
corresponding to your seeing an elephant and saying "Oh, an elephant!". Now
imagine a random permutation of the neural cells that keeps their
electrical and chemical states but puts them in different cortical areas.
Most probably, in this new state you would not feel or say anything
sensible, although the new implementation is logically perfectly equivalent
to the first one. The meaning of your thoughts must come, in the final
analysis, from the way your neural cells are actually connected to your I/O
devices (eyes, ears, muscles..), not only from their formal arrangement and
state.
In other words, consciousness is not an objective property of a given
state, but a functionality that allows you to reactivate (by "thinking")
the sensations you receive from the outer world. Although a conscious
computer is not impossible, a conscious computer without I/O devices IS.
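[A toy sketch of the permutation argument, added in editing and not part of the original post; the three-unit network, its weights, and the eye/mouth assignments are all invented for illustration. If the units are permuted and the connections are relabelled consistently, the computation is isomorphic and the output is unchanged; but if the I/O wiring stays attached to the same physical locations, the "mouth" says nothing.]

```python
# A 3-unit toy "brain": unit 0 reads the eye, unit 2 drives the mouth.
# weights[(i, j)] = connection strength from unit i to unit j.
weights = {(0, 1): 2.0, (1, 2): -1.0}

EYE_UNIT, MOUTH_UNIT = 0, 2  # fixed wiring to the outside world

def run(weights, eye_unit, mouth_unit, stimulus):
    """Propagate a stimulus through a feedforward graph by recursion."""
    def activation(u):
        if u == eye_unit:
            return stimulus
        # Sum over all incoming connections to unit u.
        return sum(w * activation(i)
                   for (i, j), w in weights.items() if j == u)
    return activation(mouth_unit)

# Randomly "move" the cells to other cortical areas: 0->2, 1->0, 2->1.
# The connections are relabelled consistently, so as a formal computation
# the permuted network is logically equivalent to the original.
perm = {0: 2, 1: 0, 2: 1}
permuted = {(perm[i], perm[j]): w for (i, j), w in weights.items()}

original = run(weights, EYE_UNIT, MOUTH_UNIT, 1.0)                  # -2.0
# If the eye and mouth are re-wired along with the permutation,
# behaviour is identical: same computation, same output.
consistent = run(permuted, perm[EYE_UNIT], perm[MOUTH_UNIT], 1.0)   # -2.0
# Gilles's scenario: cells moved, but the eye and mouth still attach
# to the same physical locations -- the mouth now says nothing.
scrambled = run(permuted, EYE_UNIT, MOUTH_UNIT, 1.0)                # 0.0
```

The point of the sketch: "logical equivalence" of the internal state is preserved only under a relabelling that includes the I/O devices, which is exactly the ingredient the purely intrinsic view leaves out.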

Gilles

(sorry for the double sending, Jacques!)
Received on Thu Jan 21 1999 - 00:02:14 PST
