Re: Consciousness is information?

From: Brent Meeker <meekerdb.domain.name.hidden>
Date: Thu, 23 Apr 2009 17:10:11 -0700

Stathis Papaioannou wrote:
> 2009/4/24 Brent Meeker <meekerdb.domain.name.hidden>:
>
>> Stathis Papaioannou wrote:
>>
>>> 2009/4/23 Brent Meeker <meekerdb.domain.name.hidden>:
>>>
>>>
>>>>> Say a machine is in two separate parts M1 and M2, and the information
>>>>> on M1 in state A is written to a punchcard, walked over to M2, loaded,
>>>>> and M2 goes into state B. Then what you are suggesting is that this
>>>>> sequence could give rise to a few moments of consciousness, since A
>>>>> and B are causally connected; whereas if M1 and M2 simply went into
>>>>> the same respective states A and B at random, this would not give rise
>>>>> to the same consciousness, since the states would not have the right
>>>>> causal connection. Right?
>>>>>
>>>>>
>>>> Maybe. But I'm questioning more than the lack of causal connection.
>>>> I'm questioning the idea that a static thing like a state can be
>>>> conscious. That consciousness goes through a set of states, each one
>>>> being an "instant", is an inference we make in analogy with how we would
>>>> write a program simulating a mind. I'm saying I suspect something
>>>> essential is missing when we "digitize" it in this way. Note that this
>>>> does not mean I'd say "No" to Bruno's doctor - because the doctor is
>>>> proposing to replace part of my brain with a mechanism that instantiates
>>>> a process - not just discrete states.
>>>>
>>>>
>>>> Brent
>>>>
>>> What is needed for the series of states to qualify as a process?
>>>
>> I think that the states, by themselves, cannot qualify. There has to be something
>> else, a rule of inference, a causal connection, that joins them into a process.
>>
>>
>>> I
>>> assume that a causal connection between the states, as in my example
>>> above, would be enough, since it is what happens in normal brains and
>>> computers.
>>>
>> Yes, I certainly agree that it would be sufficient. But it may be more than is
>> necessary. The idea of physical causality isn't that well defined and it hardly
>> even shows up in fundamental physics except to mean no-action-at-a-distance.
>>
>>
>>> But what would you say about the examples I give below,
>>> where the causal connection is disrupted in various ways: is there a
>>> process or is there just an unfeeling sequence of states?
>>>
>>>
>>>>> But then you could come up with variations on this experiment where
>>>>> the transfer of information doesn't happen in as straightforward a
>>>>> manner. For example, what if the operator who walks over the punchcard
>>>>> gets it mixed up in a filing cabinet full of all the possible
>>>>> punchcards variations, and either (a) loads one of the cards into M2
>>>>> because he gets a special vibe about it and it happens to be the right
>>>>> one, or (b) loads all of the punchcards into M2 in turn so as to be
>>>>> sure that the right one is among them? Would the machine be conscious
>>>>> if the operator loads the right card knowingly, but not if he is just
>>>>> lucky, and not if he is ignorant but systematic? If so, how could the
>>>>> computation know about the psychological state of the operator?
>>>>>
>> So you are contemplating a process that consists of a sequence of states and a
>> rule that connects them: punch cards (states) and a
>> machine which physically implements some rule producing a new punch card (state)
>> from a previous one. And then you ask whether it is still a process if, instead
>> of the rule producing the next state it is produced in some other way. I'd say
>> so long as the rule is followed (the operator loads the right card knowingly)
>> it's the same process. Otherwise it is not the same process (the operator
>> selects the right card by chance or by a different rule). If the process is a
>> conscious one, is the latter still conscious? I'd say that it is. If the
>> selection is by chance it's an instance of a Boltzmann brain. But I don't worry
>> about Boltzmann brains; they're too improbable.
>>
>
> Boltzmann brains are improbable, but the example of the punchcards is
> not. The operator could have two punchcards in his pocket, have a
> conversation with someone on the way from M1 to M2 and end up
> forgetting or almost forgetting which is the right one. That is, his
> certainty of picking the right card could vary between 0.5 and 1.
> Would you say that only if his certainty is 1 would the conscious
> process be implemented, and not if it is, say, 0.9?
>
>

I said it would be implementing *the same* consciousness if he was
following the rule. If not, he might be implementing a different
consciousness by using a different rule. Of course if it were different
in only one "moment" that wouldn't really be much of a difference. I
don't think it depends on his certainty. Even more difficult, we might
ask what it means for him to follow the rule - must he do it
*consciously*? In which case, do we have to know whether his brain is
functioning according to the same rule?
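As a rough illustration (a hypothetical sketch, not anything from the thread - the states and rule here are arbitrary stand-ins), the distinction between states joined by a rule and the same states arising by chance might be put like this:

```python
import random

def rule(state):
    # Hypothetical transition rule: each state is derived from the previous one.
    return state + 1

def run_process(initial, steps):
    """Generate states by actually applying the rule - a causal chain."""
    states = [initial]
    for _ in range(steps):
        states.append(rule(states[-1]))
    return states

def coincidental_states(initial, steps):
    """Pick states at random; they may happen to match the rule's output,
    but nothing connects one state to the next."""
    return [initial] + [random.randint(0, 10) for _ in range(steps)]

causal = run_process(0, 3)         # [0, 1, 2, 3]: each state caused by the last
lucky = coincidental_states(0, 3)  # might equal [0, 1, 2, 3], but only by chance
```

The two lists can be identical, so nothing in the states themselves distinguishes the cases; the difference lies entirely in whether the rule was actually applied.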

You're asking a lot of questions, Stathis. :-) What do you think?

Brent

You received this message because you are subscribed to the Google Groups "Everything List" group.