2009/8/27 Quentin Anciaux <allcolor.domain.name.hidden>:
> This is because if consciousness is a computational process then it is
> independent of the (physical or ... virtual) implementation. If I
> perform the computation on an abacus or within my head or with stones
> on the ground... it is the same (from the computation pov).
>
> And that's my problem with physicalism. How can it account for the
> independence of implementation if computations are not real?
Yes, precisely. All my arguments have been towards the conclusion
that CTM+PM=false. IOW, you can have one or the other, but not both.
David
>
> 2009/8/26 David Nyman <david.nyman.domain.name.hidden>:
>>
>> 2009/8/26 Stathis Papaioannou <stathisp.domain.name.hidden>:
>>
>>> With the example of the light, you alter the photoreceptors in the
>>> retina so that they respond to a blue light the same way they would
>>> have when exposed to a red light.
>>
>> Ah, so the alien has photoreceptors and retinas? That's an assumption
>> worth knowing! This is why I said "a successful theory wouldn't be
>> very distant from the belief that the alien was, in effect, human, or
>> alternatively that you were, in effect, alien".
>>
>>> I think what I have proposed is consistent with functionalism, which
>>> may or may not be true. A functionally identical system produces the
>>> same outputs for the same inputs, and functionalism says that
>>> therefore it will also have the same experiences, such as they may be.
>>> But what those experiences are like cannot be known unless you are the
>>> system, or perhaps understand it so well that you can effectively run
>>> it in your head.
>>
>> Well, it's precisely the conjunction of functionalism with a
>> primitively material assumption that prompted this part of the thread.
>> Peter asked me if I thought a brain scan at some putatively
>> fundamental physical level would be an exhaustive account of all the
>> information that was available experientially, and I was attempting to
>> respond specifically to that. Given what you say above, I would again
>> say - for all the reasons I've argued up to this point - that a purely
>> functional account on the assumption of PM gives me no reason to
>> attribute experience of any kind to the system in question.
>>
>> The way you phrase it rightly emphasises the focus on invariance of
>> inputs and outputs as definitive of invariance of experience, rather
>> than on the variability of the actual PM process that performs the
>> transformation. As Brent has commented, this seems a somewhat
>> arbitrary assumption, with the implied rider of "what else could it
>> be?" Well, whatever else could provide an account of experience, this
>> particular conjecture happens to fly directly in the face of the
>> simultaneous assumption of primitively physical causation.
>
> This is because if consciousness is a computational process then it is
> independent of the (physical or ... virtual) implementation. If I
> perform the computation on an abacus or within my head or with stones
> on the ground... it is the same (from the computation pov).
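>
> To make the point concrete, here's a toy sketch in Python (purely
> illustrative; the function names and the choice of example are mine):
> the same abstract computation realised on two very different
> "substrates", with the same result.
>
>   # Substrate 1: native machine arithmetic.
>   def add_native(a, b):
>       return a + b
>
>   # Substrate 2: "stones on the ground" -- add by counting tokens.
>   def add_stones(a, b):
>       pile = ["stone"] * a + ["stone"] * b
>       total = 0
>       for _ in pile:  # count the pile one stone at a time
>           total += 1
>       return total
>
>   # Same computation, same answer, different physical story:
>   assert add_native(3, 4) == add_stones(3, 4) == 7
>
> From the computation's point of view the two are indistinguishable.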
>
> And that's my problem with physicalism. How can it account for the
> independence of implementation if computations are not real?
>
> Regards,
> Quentin
>
>>
>> There's something trickier here, too. When you say "unless you are
>> the system", this masks an implicit - and dualistic - assumption in
>> addition to PM monism. It is axiomatic that any properly monistic
>> materialist account must hold all properties of a system to be
>> extrinsic, and hence capable of *exhaustive* extrinsic formulation.
>> IOW if it's not extrinsically describable, it doesn't exist in terms
>> of PM. So what possible difference could it make, under this
>> restriction, to 'be' the system? If the reply is that it makes just
>> the somewhat epoch-making difference of conjuring up an otherwise
>> unknowable world of qualitative experience, can we still lay claim to
>> a monistic ontology, in any sense that doesn't beggar the term?
>>
>> David
>>
>>>
>>> 2009/8/26 David Nyman <david.nyman.domain.name.hidden>:
>>>>
>>>> On 25 Aug, 14:32, Stathis Papaioannou <stath....domain.name.hidden> wrote:
>>>>
>>>>> Let's say the alien brain in its initial environment produced a
>>>>> certain output when it was presented with a certain input, such as a
>>>>> red light. The reconstructed brain is in a different environment and
>>>>> is presented with a blue light instead of a red light. To deal with
>>>>> this, you alter the brain's configuration so that it produces the same
>>>>> output with the blue light that it would have produced with the red
>>>>> light.
>>>>
>>>> In terms of our discussion on the indispensability of an
>>>> interpretative context for assigning meaning to 'raw data', I'm not
>>>> sure exactly how much you're presupposing when you say that "you alter
>>>> the brain's configuration". You have a bunch of relational data
>>>> purporting to correspond to the existing configuration of the alien's
>>>> brain and its relation to its environment. This is available to you
>>>> solely in terms of your interpretation, on the basis of which you
>>>> attempt to come up with a theory that correlates the observed 'inputs'
>>>> and 'outputs' (assuming these can be unambiguously isolated). But how
>>>> would you know that you had arrived at a successful theory of the
>>>> alien's experience? Even if you somehow succeeded in observing
>>>> consistent correlations between inputs and outputs, how could you ever
>>>> be sure what this 'means' for the alien brain?
>>>
>>> With the example of the light, you alter the photoreceptors in the
>>> retina so that they respond to a blue light the same way they would
>>> have when exposed to a red light. Photoreceptors are
>>> neurons and synapse with other neurons, further up the pathway of
>>> visual perception. The alien will compare his perception of the blue
>>> sky of Earth with his memory of the red sky of his home planet and
>>> declare it looks the same. Now it is possible that it doesn't look the
>>> same and he only thinks it looks the same, but the same could be said
>>> of ordinary life: perhaps yesterday the sky looked green, and today,
>>> when it looks blue, we only think it looks the same because we are
>>> deluded.
>>>
>>>> I would say that in effect what you have posed here is 'the problem of
>>>> other minds', and that consequently a 'successful' theory wouldn't be
>>>> very distant from the belief that the alien was, in effect, human, or
>>>> alternatively that you were, in effect, alien. And, mutatis
>>>> mutandis, I guess this would apply to rocks too.
>>>
>>> I think what I have proposed is consistent with functionalism, which
>>> may or may not be true. A functionally identical system produces the
>>> same outputs for the same inputs, and functionalism says that
>>> therefore it will also have the same experiences, such as they may be.
>>> But what those experiences are like cannot be known unless you are the
>>> system, or perhaps understand it so well that you can effectively run
>>> it in your head.
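>>>
>>> As a minimal sketch of "same outputs for the same inputs" (the
>>> Python framing and both toy systems are invented for illustration):
>>>
>>>   # System A computes its response directly.
>>>   def system_a(stimulus):
>>>       return stimulus.upper()
>>>
>>>   # System B produces the same mapping with a lookup table.
>>>   TABLE = {s: s.upper() for s in ("red", "green", "blue")}
>>>   def system_b(stimulus):
>>>       return TABLE[stimulus]
>>>
>>>   # Functionally identical over this domain:
>>>   assert all(system_a(s) == system_b(s) for s in TABLE)
>>>
>>> The internals differ, but the input-output mapping is the same;
>>> functionalism then says the experiences, if any, are the same too.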
>>>
>>>
>>> --
>>> Stathis Papaioannou
>>>
>>
>
>
>
> --
> All those moments will be lost in time, like tears in rain.
>