Re: computationalism and supervenience

From: Brent Meeker <meekerdb.domain.name.hidden>
Date: Tue, 12 Sep 2006 10:41:15 -0700

Stathis Papaioannou wrote:
> Colin Hales writes:
>
>
>>Please consider the plight of the zombie scientist with a huge set of
>>sensory feeds and similar set of effectors. All carry similar signal
>>encoding and all, in themselves, bestow no experiential qualities on the
>>zombie.
>>
>>Add a capacity to detect regularity in the sensory feeds.
>>Add a scientific goal-seeking behaviour.
>>
>>Note that this zombie...
>>a) has the internal life of a dreamless sleep
>>b) has no concept or percept of body or periphery
>>c) has no concept that it is embedded in a universe.
>>
>>I put it to you that science (the extraction of regularity) is the science
>>of zombie sensory fields, not the science of the natural world outside the
>>zombie scientist. No amount of creativity (except maybe random choices)
>>would ever lead to any abstraction of the outside world that gave it the
>>ability to handle novelty in the natural world outside the zombie scientist.
>>
>>No matter how sophisticated the sensory feeds and any guesswork as to a
>>model (abstraction) of the universe, the zombie would eventually find
>>novelty invisible because the sensory feeds fail to depict the novelty, i.e.
>>the same sensory feeds for different behaviour of the natural world.
>>
>>Technology built by a zombie scientist would replicate zombie sensory feeds,
>>not deliver an independently operating novel chunk of hardware with a
>>defined function (if the idea of function even has meaning in this instance).
>>
>>The purpose of consciousness is, IMO, to endow the cognitive agent with at
>>least a repeatable (not accurate!) simile of the universe outside the
>>cognitive agent so that novelty can be handled. Only then can the zombie
>>scientist detect arbitrary levels of novelty and do open ended science (or
>>survive in the wild world of novel environmental circumstance).
>>
>>In the absence of the functionality of phenomenal consciousness and with
>>finite sensory feeds you cannot construct any world-model (abstraction) in
>>the form of an innate (a priori) belief system that will deliver an endless
>>ability to discriminate novelty. In a very Gödelian way, a limit would
>>eventually be reached where the abstracted model could not make any
>>prediction that can be detected. The zombie is, in a very real way, faced
>>with 'truths' that exist but can't be accessed/perceived. As such its
>>behaviour will be fundamentally fragile in the face of novelty (just like
>>all computer programs are).
>>-----------------------------------
>>Just to make the zombie a little more real... consider the industrial
>>control system computer. I have designed and installed hundreds of them, and
>>wired up tens (hundreds?) of thousands of sensors and an unthinkable number
>>of kilometers of cable. (NEVER again!) In all cases I put it to you that the
>>phenomenal content of sensory connections may, at best, be characterised as
>>whatever it is like to have electrons crash through wires, for that is what
>>is actually going on. As far as the internal life of the CPU is concerned...
>>whatever it is like to be an electrically noisy hot rock, regardless of the
>>program....although the character of the noise may alter with different
>>programs!
>>
>>I am a zombie expert! No, that didn't come out right... erm....
>>perhaps... "I think I might be a world expert in zombies"... yes, that's
>>better.
>>:-)
>>Colin Hales
>
>
> I'm not sure I understand why the zombie would be unable to respond to any
> situation it was likely to encounter. Doing science and philosophy is just a happy
> side-effect of a brain designed to help its owner survive and reproduce. Do you
> think it would be impossible to program a computer to behave like an insect, or a
> newborn infant, for example? You could add a random number generator to make
> its behaviour less predictable (so predators can't catch it and parents don't get
> complacent) or to help it decide what to do in a truly novel situation.
>
> Stathis Papaioannou

And after you had given it all these capabilities how would you know it was not
conscious?

Brent Meeker

Received on Tue Sep 12 2006 - 13:42:30 PDT
