Re: The implementation problem

From: Jacques M Mallah <jqm1584.domain.name.hidden>
Date: Sat, 23 Jan 1999 15:33:41 -0500

On Thu, 21 Jan 1999, Gilles HENRI wrote:
> I am personally very reluctant to interpret consciousness in terms of
> "intrinsic" information (information that could be objectively defined
> and measured). Take a typical state of the brain, or a succession of
> states realizing an implementation of some computation, for example one
> corresponding to you seeing an elephant and saying "Oh, an elephant!".
> Now imagine you perform a random permutation of the neural cells that
> keeps their electrical and chemical states but puts them in different
> cortical areas. Most probably, in this new state you won't feel or say
> anything sensible, although the new implementation is logically
> perfectly equivalent to the first one. The meaning of your thoughts
> must, in the final analysis, come from the way your neural cells are
> actually connected to your I/O devices (eyes, ears, muscles...), not
> only from their formal arrangement and state.

        No. If you rearrange the order, you change the information, just
as you would change the information in a string by rearranging its
characters. Moreover, if you rearrange the connections, you change the
causal relationships that also help define a computation. Your 'new
implementation' would be a totally different computation from the old
one.
        Also, there is no such thing as input or output. These concepts
apply only once you predesignate one part of the universe as 'the
system' and the rest as 'the environment'. That is an artificial
distinction, not one known to nature.
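The string analogy can be made concrete with a minimal sketch (the example string and the particular permutation chosen here are mine, not from the thread): a permutation preserves the multiset of symbols but not the string itself, i.e. not the information carried by the arrangement.

```python
# Sketch: permuting the parts preserves what they are made of,
# but changes what they jointly encode.
original = "Oh an elephant!"

# One particular permutation of the characters: reversal.
permuted = original[::-1]

# Same symbols, same counts -- the "electric and chemical states"
# of the parts are untouched...
print(sorted(permuted) == sorted(original))   # True

# ...but the ordered structure, and hence the information, differs.
print(permuted == original)                   # False
```

Any non-identity permutation would do in place of reversal; the point is only that identity of parts does not imply identity of arrangement.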

                         - - - - - - -
              Jacques Mallah (jqm1584.domain.name.hidden)
       Graduate Student / Many Worlder / Devil's Advocate
"I know what no one else knows" - 'Runaway Train', Soul Asylum
            My URL: http://pages.nyu.edu/~jqm1584/
Received on Sat Jan 23 1999 - 12:35:18 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:06 PST