Re: Implementation

From: Russell Standish <R.Standish.domain.name.hidden>
Date: Wed, 28 Jul 1999 10:51:50 +1000 (EST)

>
> Jacques Mallah (jqm1584.domain.name.hidden):
> > The Turing test is a rule of thumb, not a necessary or sufficient
> > condition. [for consciousness]
>
> Balderdash!!
>
> > The standard example is a huge look-up table (HLUT). This is a
> > database that contains an answer that will be printed out (or voice
>
> I certainly consider the lookup table implementation of a
> Turing-test-passer as conscious as any other implementation.
>

With any HLUT algorithm, the machine will eventually fail the Turing
test if we quiz it long enough. The situation is probably even better
than that - the HLUT is most likely of far higher computational
complexity than the Turing test needed to falsify it: the table must
store a reply for every possible conversation history, so it grows
exponentially with the length of the interrogation, while the
interrogation itself grows only linearly.
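
A back-of-envelope sketch of that asymmetry (Python; the vocabulary
size and quiz length are illustrative assumptions, not figures from
either of our arguments):

    # Rough size of an HLUT holding a reply for every possible
    # conversation history.  V and n are illustrative assumptions.
    V = 10_000        # distinct tokens the interrogator might use
    n = 20            # exchanges needed to falsify the machine

    entries = V ** n  # one stored reply per possible history
    print(f"HLUT entries: ~{entries:.1e}")   # ~1.0e+80, roughly the
                                             # atom count of the
                                             # visible universe
    print(f"Quiz length:  {n} exchanges")    # linear in n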

However, in the context of our Olympia story, we know the questions of
the Turing test beforehand. I think you will agree that it is trivial
to construct a nonconscious entity that will pass that particular
Turing test. In retrospect, I was being a little vague in my argument.
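
For concreteness, a minimal sketch (Python; the script below is a
hypothetical stand-in, not the actual Olympia interrogation): when the
questions are fixed in advance, a plain dictionary lookup suffices,
with nothing resembling comprehension anywhere in the machinery.

    # A "Turing-test passer" for a test whose questions are known
    # beforehand.  The questions and answers are invented placeholders.
    KNOWN_SCRIPT = {
        "Are you conscious?": "Of course. Aren't you?",
        "What is 7 times 8?": "56, though arithmetic bores me.",
        "Describe your childhood.": "Happy, if somewhat mechanical.",
    }

    def reply(question: str) -> str:
        # A single table lookup; no understanding anywhere.
        return KNOWN_SCRIPT.get(question, "Could you rephrase that?")

    for q in KNOWN_SCRIPT:
        print(f"Q: {q}")
        print(f"A: {reply(q)}")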


> An HLUT could be produced by a compiler expanding a "conventionally"
> written AI program, unrolling loops and turning branches
> into parallel code tracks, resulting in a unidirectional
> code tree, each branchpoint switched by input or by time events.
> The tree ends at every point where the consciousness is
> deemed to "die", presumably finite on all branches.

Untrue. A tree that is finite on all branches would imply that the
underlying computation halts on every input, and not all computations
do.

>
> Optimizing compilers do that sort of thing all the time (rarely to the
> total extent of the HLUT, but that's just a compile-time parameter
> setting!). I would find it outrageous to classify two identically
> behaving robots as one conscious and the other a zombie simply because
> one had been compiled with a higher loop-unrolling constant than the
> other (from the same source code).
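
For concreteness, a toy illustration of the transformation being
described (Python; the example is mine, not from the quoted post).
The two functions behave identically and differ only in whether the
loop survived compilation:

    def dot_looped(xs, ys):
        # "Conventionally written" version: a runtime loop.
        total = 0
        for x, y in zip(xs, ys):
            total += x * y
        return total

    def dot_unrolled(xs, ys):
        # The same computation with the loop fully unrolled for
        # length-4 inputs, as a compiler with a high unrolling
        # constant might emit.  Taken to the limit over every
        # possible input, this kind of expansion yields the HLUT.
        return (xs[0] * ys[0] + xs[1] * ys[1]
                + xs[2] * ys[2] + xs[3] * ys[3])

    assert dot_looped([1, 2, 3, 4], [5, 6, 7, 8]) \
        == dot_unrolled([1, 2, 3, 4], [5, 6, 7, 8]) == 70

The quoted point is that it would be strange to call one of these
conscious and the other a zombie.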
>
> The confusion results from people trying to define consciousness
> as an objective property, when it's not.
>
> Let me repeat my firm position: consciousness is an attribution we
> place on behavior, a subjective matter, not an objective fact. Things
> with the right behavior make the attribution natural. ANY Turing test
> passer can naturally be attributed consciousness, by virtue of its
> conversation, regardless of how it achieves that internally. We
> project onto the behavior a "psychological" explanation involving
> non-physical quantities like belief, feelings, motives, dislikes,
> passions etc.
>
> Our own personal consciousnesses are the psychological states we
> attribute to ourselves. Since "attribution" and "ourselves" are
> themselves psychological properties, the whole concept of
> consciousness is circular, and exists only in its own fantasy.
>
> An alien not versed in "consciousness lore" could represent you or me
> in purely mechanical terms, like wind-up-toys. We do what we do and
> make the noises we make simply because of the physical interactions of
> our various parts, connected as they are. No ridiculous "mental
> properties" there at all.
>
> As long as people try to treat consciousness as an objective property,
> they will remain confused, looking for some mysterious property
> possessed by some machinery, and not by others.
>
> There's no such objective property.
>
> The attribution of consciousness is a matter of opinion, even though
> we have a strong disposition to say certain things are, and certain
> things are not. When it comes to new things like robots, there is no
> consensus, and need never be. Engineers, analyzing the technical
> diagrams, can look at any robot as a pure insensate machine, while
> those that interact with the robot socially will worry about its
> feelings, commiserating with its setbacks and rejoicing in its
> successes. When the latter ask the robot about its internal life,
> they will interpret its answers ("I have a pain in all the diodes on
> my left side") as a consequence of its consciousness. The engineers
> will take the same sounds being produced as evidence of a certain
> byte-stream being sent to the voice unit, those bytes coming from
> line 7,588,678,201 of a phrase table, triggered by input condition
> 99,654,777,922. No consciousness there, only mindless machinery.
>
> Neither opinion is right or wrong; they're just different ways of
> interpreting the same situation.

----------------------------------------------------------------------------
Dr. Russell Standish Director
High Performance Computing Support Unit,
University of NSW Phone 9385 6967
Sydney 2052 Fax 9385 6965
Australia R.Standish.domain.name.hidden
Room 2075, Red Centre http://parallel.hpc.unsw.edu.au/rks
----------------------------------------------------------------------------