Re: Dreaming On

From: David Nyman <david.nyman.domain.name.hidden>
Date: Thu, 10 Sep 2009 19:58:09 +0100

2009/9/10 Brent Meeker <meekerdb.domain.name.hidden>:

>
>> Yes, I agree. But if we're after a physical theory, we also want to
>> be able to give in either case a clear physical account of their
>> apprehensiveness, which would include a physical justification of why
>> the fine-grained differences make no difference at the level of
>> experience.
>
> Consider what a clear physical account of apprehensiveness might be:
> There's an increased level of brain activity which is similar to that
> caused by a strange sound when alone in the dark, a slight rise in
> adrenaline, a tensing of muscles that would be used to flee, brain
> patterns, formed as memories while watching slasher movies, become more
> excited. Fine-grained differences below these levels, as might differ
> in others, are irrelevant to the experience. For comparison consider a
> Mars rover experiencing apprehension: Sensor signals indicate lack of
> traction which implies likely inability to reach its next sampling
> point. Extra battery power is put on line and various changes in paths
> and backtracking are calculated. Mission control is apprised. The
> soil appearance related to poor traction is entered into a database with
> a warning note.
> Notice how the meaning, the content of 'apprehension' comes from the
> context of action and purpose and interaction with an external world.
> We summarize these things as a single word 'apprehension' which we then
> take to describe a strictly internal state. But that is because we have
> abstracted away the circumstances that give the meaning. There are
> different circumstances that would give the same heightened states.
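
(A purely illustrative aside: the rover scenario above can be written out
as a toy program, which makes the functional-role point easier to see -
the 'apprehension' ascribed to the rover is nothing over and above a
pattern of responses tied to its goals and actions. All the names below -
Rover, on_traction_report, notify_mission_control and so on - are invented
for the sketch and don't refer to any real rover software.)

    class Rover:
        """Toy sketch of the 'apprehensive' rover described above."""

        def __init__(self):
            self.extra_power = False   # extra battery power not yet on line
            self.soil_log = []         # database of soil observations

        def on_traction_report(self, slipping, goal_reachable):
            # Functionally construed, 'apprehension' just is this pattern of
            # responses to a predicted failure to reach the sampling point.
            if slipping and not goal_reachable:
                self.extra_power = True                # put extra battery power on line
                plans = self.replan()                  # calculate changed paths / backtracking
                self.notify_mission_control(plans)     # apprise mission control
                self.soil_log.append(("low-traction soil appearance", "warning"))
            return self.extra_power

        def replan(self):
            # Placeholder for the path and backtracking calculations.
            return ["backtrack", "detour"]

        def notify_mission_control(self, plans):
            print("Traction problem; candidate plans:", plans)

Strip away the goals and the surrounding environment and the remaining
internal variables carry no such meaning on their own, which is the point
about abstracting away the circumstances.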

Whilst I am of course in sympathy with the larger import of what you're
saying, Brent, I'm not sure how it's relevant to the intentionally
more restricted focus of the current discussion. It is by definition
true that "fine-grained differences below these levels, as might
differ in others, are irrelevant to the experience". My point still
is that a complete physical theory of consciousness would be capable
of explicating - both in general physical principles and in detail -
the relation between coarse and fine-grained physical accounts of an
experiential state, whatever the wider context in which it might be
embedded. Or IOW, of explaining what physical principles and
processes are responsible for the fineness of fine graining and the
coarseness of coarse graining. CTM doesn't appear to offer any
physically explicit route to this goal.

David


>
> David Nyman wrote:
>> 2009/9/9 Flammarion <peterdjones.domain.name.hidden>:
>>
>>
>>>> What you say above seems pretty much in sympathy with the reductio
>>>> arguments based on arbitrariness of implementation.
>>>>
>>> It is strictly an argument against the claim that
>>> computation causes consciousness , as opposed
>>> to the claim that mental states are identical to computational
>>> states.
>>>
>>
>> I'm not sure I see what distinction you're making.  If as you say the
>> realisation of computation in a physical system doesn't cause
>> consciousness, that would entail that no physically-realised
>> computation could be identical to any mental state. This is what
>> follows if one accepts the argument from MGA or Olympia that
>> consciousness does not attach to physical states qua computation.
>>
>>
>>>> But CTM is not engaged on such a project; in fact it entails
>>>> the opposite conclusion: i.e. by stipulating its type-token identities
>>>> purely functionally it requires that a homogeneous phenomenal state
>>>> must somehow be associated with a teeming plurality of heterogeneous
>>>> physical states.
>>>>
>>> It doesn't suggest that any mental state can be associated with any
>>> physical
>>> state.
>>>
>>
>> It doesn't need to say that to be obscure as a physical theory.  The
>> point is that it can ex hypothesi say nothing remotely physically
>> illuminating about what causes a mental state.  To say that it results
>> whenever a physical system implements a specific computation is to say
>> nothing physical about that system other than to insist that it is
>> 'physical'.
>>
>>
>>> It has been accused of overdoing  Multiple Realisability, but MR
>>> can be underdone as well.
>>>
>>
>> I agree.  Nonetheless, when two states are functionally equivalent one
>> can still say what it is about them that is physically relevant.  For
>> example, in driving from A to B it is functionally irrelevant to my
>> experience whether my car is fuelled by petrol or diesel.  But there
>> is no ambiguity about the physical details of my car trip or precisely
>> how either fuel contributes to this effect.
>>
>>
>>>> Various arguments - Olympia, MGA, the Chinese Room etc. - seek to
>>>> expose the myriad physical implausibilities consequential on such
>>>> implementation independence.  But the root of all this is that CTM
>>>> makes impossible at the outset any possibility of linking a phenomenal
>>>> state to any unique, fully-explicated physical reduction.
>>>>
>>> That's probably a good thing. We want to be able to say that
>>> two people with fine-grained differences in their brain structure
>>> can both be (for instance) apprehensive.
>>>
>>
>> Yes, I agree.  But if we're after a physical theory, we also want to
>> be able to give in either case a clear physical account of their
>> apprehensiveness, which would include a physical justification of why
>> the fine-grained differences make no difference at the level of
>> experience.
>>
>
> Consider what a clear physical account of apprehensiveness might be:
> There's an increased level of brain activity which is similar to that
> caused by a strange sound when alone in the dark, a slight rise in
> adrenaline, a tensing of muscles that would be used to flee, brain
> patterns, formed as memories while watching slasher movies, become more
> excited.  Fine-grained differences below these levels, as might differ
> in others, are irrelevant to the experience.  For comparison consider a
> Mars rover experiencing apprehension: Sensor signals indicate lack of
> traction which implies likely inability to reach its next sampling
> point.  Extra battery power is put on line and various changes in paths
> and backtracking are calculated.  Mission control is apprised.   The
> soil appearance related to poor traction is entered into a database with
> a warning note.
>
> Notice how the meaning, the content of 'apprehension' comes from the
> context of action and purpose and interaction with an external world.
> We summarize these things as a single word 'apprehension' which we then
> take to describe a strictly internal state. But that is because we have
> abstracted away the circumstances that give the meaning.  There are
> different circumstances that would give the same heightened states.
>
> Brent
>
> >
>
