Re: Measure, madness, and Max
>
>It's not clear that branching is an "event" as such. It is more of a
>process, a constant, continual process of splitting.
That's true. I mean that the splitting is almost complete (any interference
term has become vanishingly small) before you can do anything macroscopically.
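To make "vanishingly small" concrete, here is the standard decoherence
sketch (my own notation, just an illustration): for a system entangled with
its environment,

\[
|\Psi\rangle = \alpha\,|0\rangle|E_0\rangle + \beta\,|1\rangle|E_1\rangle ,
\qquad
\rho_S = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi|
= |\alpha|^2 |0\rangle\langle 0| + |\beta|^2 |1\rangle\langle 1|
+ \alpha\beta^{*}\langle E_1|E_0\rangle\,|0\rangle\langle 1|
+ \alpha^{*}\beta\langle E_0|E_1\rangle\,|1\rangle\langle 0| .
\]

The interference terms are exactly the pieces multiplied by the overlap
\langle E_1|E_0\rangle, and that overlap decays on decoherence timescales,
far shorter than anything you can do macroscopically.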
>I see identity in a larger sense. Consider the example proposed
>by Jacques, where the universe is infinitely large, and there are an
>infinite number of exact, perfect, indistinguishable copies of me spread
>throughout the universe. (Each of those copies exists within a region
>of space, billions of light years across, which is observationally
>indistinguishable from our own.)
I dislike this analogy. If there exists some part of the Universe "perfectly
indistinguishable" from another one, it means that this part has the same
surroundings, and by continuity that the Universe is in fact periodic, each
part having the same copy at the same distance. This is equivalent to saying
that the topology of the Universe is closed (torus-like), and you can treat
it as a finite one. All the "copies" of yourself will have exactly the same
history, in the past and in the future as well, and you are not allowed to
consider different possible evolutions. If you commit suicide, all of your
copies will commit it too, and you really disappear.
>
>Or consider Wei's experiment, where we have an intelligent computer program
>and it is running synchronized on two computers. I would say that the
>identity of the computer program is instantiated equally on both
>computers.
>It is not a matter of consciousness "transporting" from one computer to
>the other. Rather, with both computers in identical states, there is
>nothing to distinguish one computer's instantiation of consciousness
>from the other's. The consciousness is not aware of how many computers
>it is instantiated on. So when one stops, it makes no difference.
For me a deterministic program is NOT intelligent. Intelligence (in the
human sense, not what we improperly call AI) requires a dynamical
interactivity with the environment. If the two computers are equivalent,
they are not intelligent, and if they are intelligent, they cannot be
equivalent, because they cannot have the same environment. Consciousness as
we live it implies a representation of oneself in a physical world (this is
exactly what distinguishes human beings from other animals), which is not
realized by computers, which are not intelligent enough to know that they
are different.
>At any given time, what you will experience is among the least unlikely
>events which allow you to stay alive, and which are consistent with your
>conscious state. If you don't know what the results are of the lottery,
>and if your suicide machine is sufficiently reliable, then the least
>unlikely event is that you won the lottery.
This is exactly the problem. Even if you don't know the results, the
physical world around you knows them and has very quickly been disconnected
from the world where you won. So the least unlikely world is certainly not
this latter one.
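Spelled out with illustrative notation (my own, not tied to any particular
setup), the quoted reasoning amounts to conditioning on survival: with prior
probability p of winning and a suicide machine that fails only with
probability \varepsilon,

\[
P(\text{win} \mid \text{alive})
= \frac{p}{p + (1-p)\,\varepsilon}
\;\longrightarrow\; 1
\quad \text{as } \varepsilon \to 0 .
\]

My point is that this conditioning is done over branches whose measure was
already fixed when the lottery result decohered: almost all of that measure
sits in the branches where you lost, whatever the rare survivor observes
afterwards.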
Cheers
Gilles
Received on Fri Jan 22 1999 - 00:19:53 PST