
From: Jesse Mazer <lasermazer.domain.name.hidden>

Date: Tue, 30 Jan 2001 15:52:12 -0500

Marchal <marchal.domain.name.hidden> wrote:

> >> Bruno:
> >> I agree that I say something shocking. At each instant
> >> I am not multiplied by 10^100, as in deWitt's view of Everett's
> >> formulation of QM; I show that with comp we are multiplied
> >> a priori by 2^aleph_0, at each instant ...
>
> > I don't understand how you get this. If at time T1 I am duplicated, and
> > then at time T2 I am duplicated again, then at T2 there will be 4 of me,
> > but that doesn't mean that at time T1 I was being "multiplied a priori"
> > by 4, does it? At time T1 I was only multiplied by 2.
>
> You are right. But the universal dovetailer (UD) really multiplies you
> by aleph_0, each copy belonging to 2^aleph_0 infinite computations.
> The key point is that the UD, which emulates all programs, builds you
> again and again, and again ... and you cannot be aware of
> the (rather big) delays between your (virtual) reconstitutions.

Well, if your program is kept frozen between duplications, I agree that you
won't notice the delays (although, again, we're assuming that the flow of
consciousness is real, which doesn't necessarily have to be the case). But
if you endlessly duplicate frozen programs without restarting them, you
won't be able to restart them after aleph-0 iterations... you must restart
them after a finite number of iterations, or not restart them at all.

I suppose you could do an experiment where you create 2 copies of a frozen
program, then restart 1... then create 4 copies of the frozen program, and
restart 2... then create 8 copies of the frozen program, and restart 4...
etc. In that case, if continuity of consciousness is real and identity is
"splittable", you could say that the odds favor your "next moment" after
freezing being one of the ones that took place after the largest number of
iterations.
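To make the copy-counting concrete, here is a small Python sketch (my illustration, not from the original post) of the experiment just described: at step n you create 2^n frozen copies and restart half of them. If every restarted copy is an equally likely "next moment", just over half of the probability mass always sits on the final step.

```python
def restart_distribution(steps):
    """P(next moment comes from step n) for n = 1..steps, assuming
    every restarted copy is equally likely to be 'me'."""
    # At step n we restart 2**(n-1) copies: 1, 2, 4, 8, ...
    restarted = [2 ** (n - 1) for n in range(1, steps + 1)]
    total = sum(restarted)  # 2**steps - 1 restarted copies in all
    return [r / total for r in restarted]

probs = restart_distribution(10)
print(probs[-1])  # ~0.5005: just over half the weight is on the last step
```

Note that for any finite number of steps the distribution is still over finitely many iterations, which is the point being made: no copy ever wakes up on an "aleph-0th" step.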

However, you have to assume that restarted programs would have a way to
tell which iteration they are part of--maybe each would see a number in
front of itself, for example. But even if this experiment could run
indefinitely, no restarted copy would find that it had woken up on the
"aleph-0th iteration"... the vast majority of restarted copies would simply
find that the number in front of them was too large for their minds to
fully comprehend, and thus their subjective experience would only diverge
into a finite number of possibilities.

My assumption here is that if you have two simulated universes containing
identical simulated brains, and if the universes differ from each other but
the states of the two brains do not diverge, then if I have the
first-person experience associated with that brain I can't really say I'm
"in" one universe or the other. "I" am simply a pattern that has instances
in each universe, but there is not really a true answer to which instance I
am, at least not until after the two brain-simulations diverge.
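A minimal sketch of this "pattern with instances" idea (my framing, not from the original post): two simulated universes differ, but while the embedded brain states are identical there is only one distinct first-person state to be had.

```python
# Two hypothetical simulated universes with identical brain states.
universe_a = {"environment": "blue sky", "brain": ("memory", "percept")}
universe_b = {"environment": "red sky", "brain": ("memory", "percept")}

# Before divergence: one distinct experience, so no fact about
# "which instance" I am.
distinct = {universe_a["brain"], universe_b["brain"]}
print(len(distinct))  # 1

# After the brain-simulations diverge, "which copy am I" gets an answer.
universe_b["brain"] = ("memory", "different percept")
distinct = {universe_a["brain"], universe_b["brain"]}
print(len(distinct))  # 2
```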

So even if the UD creates an infinite number of different simulated
universes, it may still only create a finite number of possible "successor
moments" to any given moment of experience... unless you believe that a
successor moment can be arbitrarily more complex than the previous one,
like a mouse-mind suddenly becoming a human mind. This is not actually so
implausible, but you have to go into details about what form you think a
theory of consciousness/identity should be expected to take.

> Have you read my post http://www.escribe.com/science/theory/m1726.html ?

I read it, but I'm still not clear on what you're saying--again, I think
you need to make your assumptions about consciousness/identity more
explicit.

> > Likewise, if it is possible to experience an infinite number of
> > distinct instants, then if I am duplicated at each instant, you could
> > say that there are 2^aleph_0 of me after I have experienced aleph_0
> > instants... but as long as I have only experienced a finite number of
> > instants, there will be only a finite number of me. And I don't think
> > it's possible to actually experience an infinite number of distinct
> > instants, at least not if your mind is finite.

> Like you, I don't think it is possible to experience an infinite number
> of distinct instants. But I believe that the probability of your *next*
> first-person state depends on those infinite third-person rebuildings
> made by the UD.

In the archive there was some discussion about what would happen in an
experiment where a coin is flipped: if it lands heads you will later be
duplicated, but if it lands tails you will not. The question there was,
would your subjective probability of seeing heads be 1/2 or 2/3? (Some
people also suggested that this experiment is a good argument against
continuity of consciousness over time... maybe all that exists are various
moments of experience, so there is a 1/2 chance of observing the coin
landing heads but a 2/3 chance of remembering it landing heads in the
past.)
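The two answers correspond to counting observer-moments at different times. A toy count in Python (my illustration of the archive discussion, not code from it):

```python
from fractions import Fraction

# Observer-moments at the instant the coin result is seen: one per branch.
seeing = {"heads": 1, "tails": 1}

# Observer-moments afterwards: the heads-branch person was duplicated,
# so two moments remember heads, one remembers tails.
remembering = {"heads": 2, "tails": 1}

p_see_heads = Fraction(seeing["heads"], sum(seeing.values()))
p_remember_heads = Fraction(remembering["heads"], sum(remembering.values()))

print(p_see_heads)       # 1/2
print(p_remember_heads)  # 2/3
```

The "1/2 view" weights the branches at observation time; the "2/3 view" weights all later continuations uniformly, which is why the choice between them tracks whether one takes continuity of consciousness seriously.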

Are you saying that you support the 2/3 view, meaning that the probability
of my "next moment" depends on a kind of integral over all possible future
histories?

Jesse Mazer


Received on Tue Jan 30 2001 - 13:11:00 PST


This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:07 PST