It is true that there are some physical systems for which we can
predict the future state without calculating all intermediate states.
Periodic systems will fall into this category if we can figure out
analytically what the period is. But there are other systems where
this is thought to be impossible; for example, chaotic systems.
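To make the periodic case concrete, here is a quick sketch in Python
(the oscillator and its two-second period are made up purely for
illustration): because the state is a known function of the phase, we
can evaluate it at an arbitrarily distant time in one step, with no
simulation of the intervening cycles.

    import math

    T = 2.0              # known period, in seconds (illustrative value)
    amplitude = 1.0

    def state_at(t):
        # The state depends only on the phase (t mod T), so jumping to
        # t = 1e9 seconds costs one evaluation, not a billion steps.
        phase = 2 * math.pi * (t % T) / T
        return amplitude * math.cos(phase)

    print(state_at(1e9))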
Chaotic systems are ones whose future behavior is sensitively dependent
on the current state. Making even an infinitesimal change to the current
state will cause massive changes in the future. I don't think it would be
possible with any computational model to predict the state of a chaotic
system far in the future without computing intermediate states.
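As a rough illustration of that sensitivity (just a toy, using the
standard logistic map rather than any particular physical system),
consider two trajectories started a hair apart:

    x, y = 0.4, 0.4 + 1e-12     # initial conditions differing by 1e-12
    for step in range(60):
        x = 3.9 * x * (1 - x)   # logistic map in its chaotic regime
        y = 3.9 * y * (1 - y)
    print(abs(x - y))           # the gap has grown to order one

After a few dozen iterations the difference is as large as the values
themselves, and the only way to know where either trajectory ends up is
to grind through every intermediate step.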
My guess is that consciousness as we know it is inherently chaotic.
It seems like small changes to our beliefs and knowledge can lead to
large changes in behavior. So often we experience being torn between
alternate courses of action, where the tiniest change could tip us from
one choice to the other.
Neural behavior is inherently chaotic as well. Neurons are believed to
sum the recent activity levels on their synapses and when this exceeds a
threshold, the neuron suddenly and catastrophically fires a nerve impulse.
It then goes through a refractory period (about 1 millisecond) in which it
is unable to fire again until it has "rested" and regathered its strength,
at which point it goes back to summing its inputs. If we plotted the net
input strength to the neuron, it would be an irregular line with lots of
little jags and bumps, and whenever it manages to exceed a certain level,
there is a sudden firing. Probably we would often see the stimulation
level approach that threshold and fall back, not quite reaching it,
until finally it just crosses and another nerve impulse is fired.
This kind of sensitive dependence on initial conditions is a recipe for
mathematical chaos.
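A crude integrate-and-fire caricature of this (the decay, noise,
threshold and refractory numbers below are arbitrary, not
physiological) shows how threshold crossings turn tiny input
differences into a rearranged spike train:

    import random

    random.seed(1)
    potential, threshold, refractory = 0.0, 1.0, 0
    spikes = []
    for t in range(1000):            # each step is a small slice of time
        if refractory > 0:           # resting after a spike
            refractory -= 1
            continue
        # leaky sum of noisy synaptic input
        potential = 0.9 * potential + random.gauss(0.06, 0.09)
        if potential >= threshold:   # sudden, all-or-nothing firing
            spikes.append(t)
            potential = 0.0
            refractory = 10          # brief refractory period
    print(spikes)                    # an irregular list of firing times

Nudge the noise by the tiniest amount and inputs that just barely
crossed the threshold now fall just short (and vice versa), so the
whole downstream pattern of firings gets rearranged.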
Of course, this is not a rigorous proof, and it is conceivable that
consciousness is not in fact chaotic even though it subjectively
seems so, and even though its substrate (the brain's neural net) is.
Nevertheless it would be almost unbelievably bizarre to imagine that
you could calculate the mental state of an 80-year-old man, with all
the memories of a lifetime, without actually calculating the experiences
that led to those memories.
In Egan's story, the computer is supposed to calculate his conscious
experience of the 10th second first, then the 9th second, and so on.
Suppose in the first (subjective) second he stutters on saying the number
"one", out of nervousness. Then the memory of that stutter will be
present as he recites all the other numbers. Perhaps he will enunciate
them more carefully in order to compensate. So when the system calculates
that 10th second, it has to know what happened during the first second.
Those events will be latent in his memories during the 10th second, and
may influence his behavior. His conscious reactions to earlier events
are in his memory at later times. So I don't see how it could possibly
work to calculate the 10th second first.
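A trivial sketch of that dependence (a caricature of "memory", nothing
about real brains): if each second's state folds in the memory of the
one before, the tenth state simply cannot be produced until the first
nine have been.

    def next_state(memory, t):
        # e.g., a stutter on "one" during second 1 stays in memory forever
        stuttered = (t == 1)
        return memory + ((t, stuttered),)

    memory = ()
    for t in range(1, 11):   # seconds 1 through 10, in causal order
        memory = next_state(memory, t)
    print(memory)            # the state during second 10 carries seconds 1-9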
Two other minor points: in Egan's story, this experiment was not being
done on "dust"; it was done on an ordinary computer. It was the result
of the experiment, namely that there was no subjective awareness of the
time scrambling, that was supposed to lend credence to the dust
hypothesis.
Second, quantum computers cannot efficiently solve NP-complete problems,
or at least they are not known to be able to. It's possible that
ordinary computers can solve NP-complete problems efficiently; no one
has ever proven that they can't (this is the famous P = NP problem of
computer science). And if it turns out that ordinary computers can
handle them efficiently, then of course quantum computers will be able
to as well, since a quantum computer can efficiently simulate an
ordinary one. But if it turns out that P != NP and ordinary computers
can't solve NP-complete problems efficiently, there is no evidence that
the situation will be different for quantum computers.
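For a sense of what "not efficiently" means here, this is the
brute-force picture on an ordinary computer (the clause encoding and
the tiny formula are just my illustration): with n variables there are
2**n assignments to search.

    from itertools import product

    def satisfiable(clauses, n):
        # try every one of the 2**n assignments -- exponential in n
        for assignment in product([False, True], repeat=n):
            if all(any(assignment[abs(v) - 1] == (v > 0) for v in clause)
                   for clause in clauses):
                return True
        return False

    # (x1 or not x2) and (x2 or x3): a toy 3-variable instance
    print(satisfiable([(1, -2), (2, 3)], 3))

Known quantum algorithms only speed up this kind of unstructured search
quadratically, which still leaves it exponential.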
Hal Finney
Received on Thu Jan 27 2005 - 15:06:13 PST