At 11:09 14/01/05 +1100, Stathis Papaioannou wrote:
>Bruno Marchal wrote:
>
>>At 10:24 13/01/05 +1100, Stathis Papaioannou wrote:
>>
>>>1. Every possible world can be simulated by a computer program.
>>
>>
>>With the most usual (Aristotelian) sense of the term "world", this
>>assumption would entail the falsity of comp,
>>which is that I can be simulated by a computer program.
>>(I, or any of the class of observers I belong(s) to).
>
>Huh? I thought I was saying the opposite. I certainly believe in comp.
Still confusing Schmidhuberian comp and... comp?
Schmidhuber comp: the universe is computable/Turing emulable.
Comp: I (you) am (are) computable/Turing emulable.
It can be argued that Schmidhuber comp entails comp.
But alas, comp entails (a priori) the negation of Schmidhuber comp.
Because if I am Turing emulable then I cannot know
which computation currently supports me, and my expectations
should in principle be related to the most common computational
histories going through a description of my current state, made at the right
substitution level (which exists by the comp hyp) or below.
That is, you just cannot dismiss the "induction failure problem".
So you have a measure problem. Remember that from the first person
(subjective) point of view you cannot be aware of the number of steps
"really done", so that the measure (fortunately, in a sense) bears on the
infinite (maximal) consistent computational histories/extensions.
This introduces at least a non-computable randomness into our most
probable neighborhood.
Obviously, taking into account some theorems in theoretical
computer science makes it possible to get much more information and a
more precise account.
The Schmidhuberian type of comp is comp + sweeping
the "induction failure problem" (and the whole "mind/body" problem)
under the rug.
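To make the multiplicity of computational histories concrete, here is a
minimal Python sketch of a dovetailer: one loop that interleaves the steps
of all programs, so that every program gets unboundedly many steps and no
emulated state can be tied to one single "real" run. The program numbering
and the step rule below are arbitrary toy placeholders, not the actual
universal machine.

def toy_program(n):
    # Illustrative stand-in for "program number n": a generator
    # yielding successive machine states forever.
    state = n
    while True:
        yield state
        state = 3 * state + 1 if state % 2 else state // 2  # arbitrary rule

def dovetail(rounds):
    # Each round introduces one new program and advances every program
    # started so far by one more step, so every program eventually
    # receives arbitrarily many steps, all inside a single computation.
    machines = []
    for k in range(rounds):
        machines.append(toy_program(k))
        for i, m in enumerate(machines):
            print("program", i, "-> state", next(m))

dovetail(4)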
>>In the spirit of your thought experiment, let me ask
>>you a "personal" question. Assume you have big motivation
>>for going to Mars. You can now choose between a 100$ and a
>>10000$ teletransporter machine (TTM). Let us assume you are not so rich
>>that this difference count (or adjust the number relatively to your
>>situation).
>>The $100 TTM has no security, and it is known that billions of copies of
>>yourself will be sold elsewhere, for example to the kind of "hell" you
>>were pointing to.
>>The $10,000 TTM has quantum-coded protection, so that the probability
>>is very near one that no pirate will be able to copy you.
>>Are you telling us that you will take the insecure, low-cost TTM?
>
>It's a good question, and this is where the rational comes up against the
>emotional. If it were my first trip, I think I'd be very nervous about the
>cheap alternative, and I would pay the extra or avoid going if I couldn't
>afford it. However, if I had used the $100 service many times in the past
>(through choice or necessity), I don't think I would worry about using it
>again.
>
>Here is another irrational belief I hold, while I'm confessing. I am
>absolutely convinced that continuity of personal identity is a kind of
>illusion. If I were to be painlessly killed every second and immediately
>replaced by an exact copy, with all my memories, beliefs about being me,
>etc., I would have no way of knowing that this was happening, and indeed I
>believe that in a sense this IS happening, every moment of my life. Now,
>suppose I am offered the following deal. In exchange for $1 million
>deposited in my bank account, tonight I will be killed with a sharp axe in
>my sleep, and in the morning a stranger will wake up in my bed who has
>been brainwashed and implanted with all my memories at my last conscious
>moment. This stranger will also have had plastic surgery so that he looks
>like me, and he will then live life as me, among other things spending the
>$1 million which is now in my bank account.
>
>If I were rational, I should probably accept the above deal, on the
>grounds that my apparent continuity of personal identity will be the same
>as it always has been. If such a proposal were put to me, however, I would
>be horrified; and I am sure my friends and family would be too, even if
>they shared my philosophical beliefs about personal identity. I would also
>be horrified if offered the role of the stranger who takes someone else's
>place. I can't decide which would be worse.
>
>On the other hand, if I had been forced to go through the above
>transformation several times, I might get used to the idea and not be so
>worried. Rationally, it shouldn't make any difference.
Obviously! But it is so only because you dismiss the "induction failure
problem". Also: third person identity is arguably an illusion. But I very
much doubt first person identity can ever be an illusion, or that it could
even be useful to consider it as one. What is painful in pain, for the
suffering first person, is mainly that the pain can last, and this is so
independently of any precise idea the first person could have about who
she is.
Let us iterate indefinitely the W-M (Washington-Moscow) duplication of a
machine capable of some inductive inference.
Obviously the one who is always reconstituted at W will believe (here in
the sense of: inductively infer) that she will always be reconstituted at
W, but she is wrong, as her quasi-infinity of doppelgangers will
(unsuccessfully) try to explain to her.
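The point can even be checked mechanically. Here is a minimal Python
sketch, assuming a deliberately naive inductor that simply predicts
whatever it has always observed: after n duplications exactly one history
among 2^n is all-W, and that observer's next step still branches half to
W and half to M.

from itertools import product

def inductive_prediction(history):
    # Naive induction: if every outcome so far was identical, expect more of it.
    return history[0] if len(set(history)) == 1 else None

n = 10
histories = list(product("WM", repeat=n))   # all 2**n duplication histories
all_w = ("W",) * n
print(len(histories), "histories; exactly", histories.count(all_w), "is all-W")
print("the all-W observer predicts:", inductive_prediction(all_w))

# Yet her state branches exactly like every other one: among the one-step
# continuations of the all-W history, only half stay at W.
continuations = [all_w + (x,) for x in "WM"]
print("fraction staying at W:", sum(c[-1] == "W" for c in continuations) / 2)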
Bruno
http://iridia.ulb.ac.be/~marchal/