RE: computationalism and supervenience

From: Colin Hales <C.Hales.domain.name.hidden>
Date: Wed, 13 Sep 2006 13:26:43 +1000

Brent Meeker:
>
> Colin Hales wrote:
> >
> > Stathis Papaioannou
> > <snip>
> >
> >>Maybe this is a copout, but I just don't think it is even logically
> >>possible to explain what consciousness
> >>*is* unless you have it. It's like the problem of explaining vision to a
> >>blind man: he might be the world's
> >>greatest scientific expert on it but still have zero idea of what it is
> >>like to see - and that's even though
> >>he shares most of the rest of his cognitive structure with other humans,
> >>and can understand analogies
> >>using other sensations. Knowing what sort of program a conscious computer
> >>would have to run to be conscious, what the purpose of consciousness is,
> >>and so on, does not help me to understand what the computer would be
> >>experiencing, except by analogy with what I myself experience.
> >>
> >>Stathis Papaioannou
> >>
> >
> >
> > Please consider the plight of the zombie scientist with a huge set of
> > sensory feeds and a similar set of effectors. All carry similar signal
> > encoding and all, in themselves, bestow no experiential qualities on the
> > zombie.
> >
> > Add a capacity to detect regularity in the sensory feeds.
> > Add a scientific goal-seeking behaviour.
> >
> > Note that this zombie...
> > a) has the internal life of a dreamless sleep
> > b) has no concept or percept of body or periphery
> > c) has no concept that it is embedded in a universe.
> >
> > I put it to you that science (the extraction of regularity) is the science
> > of zombie sensory fields, not the science of the natural world outside the
> > zombie scientist. No amount of creativity (except maybe random choices)
> > would ever lead to any abstraction of the outside world that gave it the
> > ability to handle novelty in the natural world outside the zombie scientist.
> >
> > No matter how sophisticated the sensory feeds and any guesswork as to a
> > model (abstraction) of the universe, the zombie would eventually find
> > novelty invisible because the sensory feeds fail to depict the novelty,
> > i.e. the same sensory feeds for different behaviour of the natural world.
> >
> > Technology built by a zombie scientist would replicate zombie sensory
> > feeds, not deliver an independently operating novel chunk of hardware with
> > a defined function (if the idea of function even has meaning in this
> > instance).
> >
> > The purpose of consciousness is, IMO, to endow the cognitive agent with at
> > least a repeatable (not accurate!) simile of the universe outside the
> > cognitive agent so that novelty can be handled. Only then can the zombie
> > scientist detect arbitrary levels of novelty and do open-ended science (or
> > survive in the wild world of novel environmental circumstance).
>

> Almost all organisms have become extinct. Handling *arbitrary* levels of
> novelty is probably too much to ask of any species; and it's certainly
> more than is necessary to survive for millennia.

I am talking purely about scientific behaviour, not general behaviour. A
creature with limited learning capacity and phenomenal scenes could quite
happily live in an ecological niche until the niche changed. I am not asking
any creature other than a scientist to be able to appreciate arbitrary
levels of novelty.

>
> >
> > In the absence of the functionality of phenomenal consciousness and with
> > finite sensory feeds you cannot construct any world-model (abstraction) in
> > the form of an innate (a priori) belief system that will deliver an
> > endless ability to discriminate novelty. In a very Gödelian way,
> > eventually a limit would be reached where the abstracted model could not
> > make any prediction that can be detected.
>
> So that's how we got string theory!
>
> >The zombie is, in a very real way, faced with 'truths' that
> > exist but can't be accessed/perceived. As such its behaviour will be
> > fundamentally fragile in the face of novelty (just like all computer
> > programs are).
>

> How do you know we are so robust? Planck said, "A new idea prevails, not by
> the conversion of adherents, but by the retirement and demise of opponents."
> In other words, only the young have the flexibility to adopt new ideas.
> Ironically, Planck never really believed quantum mechanics was more than a
> calculational trick.

The robustness probably comes from the fact that science, at the level of
critical argument (like this, now), is actually a super-organism.

In retrospect I think QM will be regarded as a side effect of the desperate
attempt to mathematically abstract appearances rather than deal with the
structure that is behaving quantum-mechanically. After the event they'll all
be going... "what were we thinking!"... It won't be wrong... just not useful,
in the sense that none of its considerations are about underlying structure.


>
> > -----------------------------------
> > Just to make the zombie a little more real... consider the industrial
> > control system computer. I have designed and installed hundreds of them,
> > and wired up tens (hundreds?) of thousands of sensors and an unthinkable
> > number of kilometers of cables. (NEVER again!) In all cases I put it to
> > you that the phenomenal content of sensory connections may, at best, be
> > characterised as whatever it is like to have electrons crash through
> > wires, for that is what is actually going on. As far as the internal life
> > of the CPU is concerned... whatever it is like to be an electrically noisy
> > hot rock, regardless of the program... although the character of the noise
> > may alter with different programs!
>

> That's like saying that whatever it is like to be you is, at best, some
> waves of chemical potential. You don't *know* that the control system is
> not conscious - unless you know what structure or function makes a system
> conscious.
>

There is nothing there except wires and electrically noisy hot rocks, plastic
and other materials = <stuff>. Whatever its consciousness is... it is the
consciousness of the <stuff>. The function is an epiphenomenon at the scale
of a human user, and it has nothing to do with the experiential qualities of
being the computer.

Colin Hales


