Jacques Mallah (jqm1584.domain.name.hidden):
> The Turing test is a rule of thumb, not a necessary or sufficient
> condition. [for consciousness]
Balderdash!!
> The standard example is a huge look-up table (HLUT). This is a
> database that contains an answer that will be printed out (or voice
I certainly consider the lookup table implementation of a
Turing-test-passer as conscious as any other implementation.
An HLUT could be produced by a compiler expanding a "conventionally"
written AI program, unrolling loops and turning branches
into parallel code tracks, resulting in a unidirectional
code tree, each branchpoint switched by input or by time events.
The tree ends at every point where the consciousness is
deemed to "die", and is presumably finite on all branches.
Optimizing compilers do that sort of thing all the time (rarely to the
total extent of the HLUT, but that's just a compile-time parameter
setting!). I would find it outrageous to classify two identically
behaving robots as one conscious and the other a zombie simply because
one had been compiled with a higher loop-unrolling constant than the
other (from the same source code).
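Purely as a toy illustration of that compile-time trade-off (not a real
compiler, and far smaller than any actual HLUT), here is a sketch in
Python. The names `conventional`, `table_driven`, `INPUTS`, and
`MAX_TURNS` are all invented for this example: a "conventionally
written" responder with a loop and branches is exhaustively expanded,
over a tiny input alphabet and a fixed depth, into a lookup table keyed
by the entire input history.

```python
# Toy sketch: "unrolling" a loop-and-branch program into a lookup
# table. INPUTS and MAX_TURNS are hypothetical; MAX_TURNS plays the
# role of the compile-time unrolling parameter.

from itertools import product

INPUTS = ["hi", "bye"]   # finite input alphabet (assumed)
MAX_TURNS = 3            # unroll depth: the compile-time parameter

def conventional(history):
    """Loop-and-branch implementation with internal state."""
    mood = 0
    reply = ""
    for word in history:
        mood += 1 if word == "hi" else -1
        reply = "glad" if mood > 0 else "sad"
    return reply

# "Compile": enumerate every input sequence up to MAX_TURNS and
# record the output, yielding a (here, tiny) lookup table.
hlut = {
    seq: conventional(seq)
    for n in range(1, MAX_TURNS + 1)
    for seq in product(INPUTS, repeat=n)
}

def table_driven(history):
    """Identically behaving lookup-table implementation."""
    return hlut[tuple(history)]
```

Both implementations produce the same replies on every input sequence
up to the unroll depth; the only difference is the space/time trade
chosen at "compile time".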
The confusion results from people trying to define consciousness
as an objective property, when it's not.
Let me repeat my firm position: consciousness is an attribution we
place on behavior, a subjective matter, not an objective fact. Things
with the right behavior make the attribution natural. ANY Turing test
passer can naturally be attributed consciousness, by virtue of its
conversation, irrelevant of how it achieves that internally. We
project onto the behavior a "psychological" explanation involving
non-physical quantities like belief, feelings, motives, dislikes,
passions etc.
Our own personal consciousness consists of the psychological states we
attribute to ourselves. Since "attribution" and "ourselves" are
themselves psychological properties, the whole concept of
consciousness is circular, and exists only in its own fantasy.
An alien not versed in "consciousness lore" could represent you or me
in purely mechanical terms, like wind-up-toys. We do what we do and
make the noises we make simply because of the physical interactions of
our various parts, connected as they are. No ridiculous "mental
properties" there at all.
As long as people try to treat consciousness as an objective property,
they will remain confused, looking for some mysterious property
possessed by some machinery, and not by others.
There's no such objective property.
The attribution of consciousness is a matter of opinion, even though
we have a strong disposition to say certain things are conscious, and
certain things are not. When it comes to new things like robots, there is no
consensus, and need never be. Engineers, analyzing the technical
diagrams, can look at any robot as a pure insensate machine, while
those who interact with the robot socially will worry about its
feelings, commiserating with its setbacks and rejoicing in its
successes. When the latter ask the robot about its internal life,
they will interpret its answers ("I have a pain in all the diodes on
my left side") as a consequence of its consciousness. The engineers
will take the same sounds being produced as evidence of a certain
byte-stream being sent to the voice unit, those bytes coming from
line 7,588,678,201 of a phrase table, triggered by input condition
99,654,777,922. No consciousness there, only mindless machinery.
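The engineers' reading can be rendered as a few lines of code (purely
illustrative; `phrase_table`, `respond`, and the voice-unit callback are
invented names, and the table indices are the ones quoted above used as
dictionary keys rather than literal line numbers):

```python
# Toy rendering of the engineers' view: an input condition selects a
# line of a phrase table, whose bytes go to the voice unit. Nothing
# but lookups and byte-streams appears anywhere in the mechanism.

phrase_table = {
    99_654_777_922: "I have a pain in all the diodes on my left side",
}

def respond(input_condition, voice_unit):
    phrase = phrase_table[input_condition]  # a table lookup, nothing more
    voice_unit(phrase.encode("utf-8"))      # byte-stream to the speaker

spoken = []
respond(99_654_777_922, spoken.append)
```

The social interlocutor hears a complaint; the engineer sees a key, a
string, and a buffer of bytes.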
Neither opinion is right or wrong, they're just different ways of
interpreting the same situation.
Received on Tue Jul 27 1999 - 17:15:05 PDT