RE: Implementation

From: <>
Date: Tue, 3 Aug 1999 09:51:18 -0700

To follow up on Bruno's comment, can we use an HLUT-type structure to
implement something equivalent to a universal Turing machine?

The TM can be thought of as implementing an algorithm, mapping from an
input tape to an output tape. We can do the same thing with an HLUT,
using the contents of the input tape as the index to look up an entry
in the "humongous" table. The data at that entry in the table is then
put out as the contents of the output tape.
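As a minimal sketch of this lookup idea (the "machine" here is a hypothetical stand-in that just reverses its tape, and the table only covers binary tapes up to length 3; a real HLUT would of course be astronomically larger):

```python
from itertools import product

def machine(tape: str) -> str:
    """Stand-in for a Turing machine: here, it just reverses the tape."""
    return tape[::-1]

# Build the "humongous" table: every input tape is an index into it,
# and the entry stored there is the machine's corresponding output tape.
hlut = {
    "".join(bits): machine("".join(bits))
    for n in range(4)
    for bits in product("01", repeat=n)
}

def hlut_tm(tape: str) -> str:
    """Same I/O behavior as machine(), but by pure table lookup."""
    return hlut[tape]
```

Viewed as black boxes, `machine` and `hlut_tm` are indistinguishable on any tape the table covers.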

The resulting system, call it HLUT-TM, has the same I/O behavior as the
TM when considered as a "black box". We could even include a time-delay
value in each HLUT entry if we wanted the HLUT machine to take the same
amount of time as the TM.

If the HLUT-TM is considered an implementation of the computation
executed by the TM, then if the TM were executing a conscious program,
perhaps the HLUT equivalent of the TM executing that same program should
also be considered conscious.

As is common in philosophical paradoxes like this, we can then try
to find a range of cases between the two extremes (TM vs HLUT-TM).
Hans Moravec suggests something along these lines in Mind Children.

Consider a variant on the TM: a cellular automaton performing some
calculation. Many such CAs are known to be universal. Hans notes that
implementations of CAs are occasionally optimized for speed by
incorporating local lookup tables. The local configuration of cell
states is checked against the table, and if an entry is found, the entire
local set of cells is updated immediately, potentially skipping many
steps of computation. Otherwise the calculation is done laboriously
(and perhaps the result is added to the lookup table for future use).
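A rough sketch of that optimization, assuming for concreteness a one-dimensional elementary CA (rule 110, which is known to be universal); the block size and caching scheme here are illustrative, not Hans's specific construction:

```python
RULE = 110
STEP = {  # the CA's own rule table: neighborhood triple -> next state
    (a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
    for a in (0, 1) for b in (0, 1) for c in (0, 1)
}

block_cache: dict = {}  # local configuration -> its one-step update

def step_block(block: tuple) -> tuple:
    """Advance the interior of a block one step, memoizing the result."""
    if block in block_cache:
        return block_cache[block]          # fast path: table lookup
    result = tuple(                        # slow path: do the work
        STEP[block[i - 1], block[i], block[i + 1]]
        for i in range(1, len(block) - 1)
    )
    block_cache[block] = result            # remember for future use
    return result

def step(row: tuple, block_size: int = 8) -> tuple:
    """One CA step over a whole row (with wraparound), block by block."""
    n = len(row)
    out = []
    for start in range(0, n, block_size):
        # include one cell of overlap on each side for the neighborhoods
        idx = [(start + k) % n for k in range(-1, block_size + 1)]
        out.extend(step_block(tuple(row[i] for i in idx)))
    return tuple(out)
```

The first time a local configuration appears it is computed laboriously and cached; every later occurrence is updated by lookup alone.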

You could imagine doing this on larger and larger scales. At the
smallest level you have a simple optimization that wouldn't seem to
have any significant effects, but at the highest level you essentially
have an HLUT. If you want to say that the original system is conscious
(say, the CA is running a TM which is running a conscious program),
but you don't want to say that the HLUT is conscious, you have to say
at what point consciousness would go away. You also have to say whether
consciousness would go away gradually or suddenly as larger volumes of
the CA are swept into the local lookup tables.
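At the far end of that spectrum, the "local" table is keyed on the entire configuration. A hedged sketch (again using elementary rule 110 as a stand-in for a universal CA) is just a memoized global step function:

```python
from functools import lru_cache

RULE = 110  # illustrative universal elementary CA rule

@lru_cache(maxsize=None)  # the cache IS the table: whole state -> next state
def global_step(row: tuple) -> tuple:
    """Advance an entire (wraparound) row one step, remembering the result.

    Once every configuration the system ever visits has been seen once,
    all further "computation" is pure lookup: an HLUT."""
    n = len(row)
    return tuple(
        (RULE >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    )
```

Nothing about the trajectory of states changes as the cache fills; only how much of each step is computed versus looked up.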

Received on Tue Aug 03 1999 - 09:52:42 PDT
