Re: predictions

From: Christopher Maloney <dude.domain.name.hidden>
Date: Wed, 04 Aug 1999 22:32:55 -0400

Excellent responses, I must say!

hal.domain.name.hidden wrote:
>
> Christopher Maloney, <dude.domain.name.hidden>, writes:
> > I'd like to revive an old thread that has been bothering me a lot
> > lately. I hope you'll all agree that it's a fascinating puzzle.
> > Wei Dai posed this way back, in February of last year:
> > http://www.escribe.com/science/theory/index.html?mID=38.
> > I've read the entire thread, and I don't think the question was
> > ever resolved. Wei, Nick Bostrom, and Hal Finney were the main
> > contributors.
> >[...]
> > But there's an alternative way of computing this probability. She
> > believes that at time t1, the odds are 1/2 that she'll have seen
> > heads. She also knows that if, at time t1, she saw heads, then the
> > odds will be 1 that at all later times, she'll continue to remember
> > seeing heads. Likewise, if she saw tails at time t1, she'll
> > continue to remember seeing tails. So
> >
> > P(H,t3) = P(H,t1) = 1/2.
> >
> > So which is correct? I know which solution I prefer, but I'd like
> > to get some feedback first.
>
> I don't think the reasoning for this example is very solid. You have
> to introduce this notion of "the probability that I will remember X at
> time B, given that I remember it at earlier time A." Then you have to
> assume that it is 1.
>
> But there is no sound basis for this assumption or this methodology.
> It is based on our common experience, but we have no experience with
> copy machines of this type. I think we have to use mathematics rather
> than experience to guide us in new situations. So this reasoning does
> not seem convincing to me.

But I can't imagine reasoning about this in terms of anything other
than subjective probabilities. That doesn't mean some other paradigm
couldn't be at work here, but I'm wondering if you have any
suggestions. This gets into some pretty deep metaphysics.

Here are some ruminations on the metaphysical questions that might apply:

1. My conscious self is the result of a computational process.
    Physical supervenience is not strictly necessary here, but since
    it makes no difference to the argument, let's assume it for
    convenience.

2. I experience my memories as consistent. I know that doesn't
    mean they necessarily are. It's conceivable to me that someone
    could jump into my mind and change a memory of the color of a
    horse I just saw from black to blue, and I would never have any
    way to detect it (barring other evidence). But if I'm a
    computational process, then I would expect my memories to be
    consistent from one moment to the next.

3. I experience consciousness as a "stream" which "travels through
    time". This is the shakiest of these three so far. Many
    philosophers have made a strong case that the entire phenomenon of
    time is an illusion.

So let me ask you this: are you suggesting that the probability that
I'll see heads at t1 might not be the same as the probability that
I'll remember seeing heads at t2? In other words, P(H,t1) != P(H,t2)?

This would be the case if the self-sampling assumption were at work
at every instant. Then P(H,t1) = 1/2 and P(H,t2) = 2/3.
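
To make that concrete, here is a little counting sketch, using the
same numbers as the 100-person ensemble quoted further down (the
Python is just my own illustration of the bookkeeping):

    # Counting observers in the ensemble version of the puzzle.
    # Assumptions: a fair coin, and everyone who sees heads at t1 is
    # duplicated before t2; tails-observers are left alone.
    heads_t1, tails_t1 = 50, 50            # 100 people flip at t1
    p_h_t1 = heads_t1 / (heads_t1 + tails_t1)

    heads_t2 = 2 * heads_t1                # each heads-seer is copied
    tails_t2 = tails_t1
    p_h_t2 = heads_t2 / (heads_t2 + tails_t2)

    print(p_h_t1, p_h_t2)                  # 0.5 and 0.666...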

But I do still have a hard time reconciling this with my feeling that
there is such a thing as a subjective stream of consciousness. If this
is what you're saying, can you be more specific about what you would
give up with regard to the stream of consciousness idea?

[more below]

> One thing you did not say was why the other answer, P(H,t3) = 2/3, was
> problematical. I believe it had to do with decision theory. In classical
> decision theory, a choice is evaluated by taking all possible outcomes,
> multiplying the probability of each outcome by its utility, and
> summing the results. You then make the choice which maximizes this sum.
>
> An example where this might seem to be a problem is the following.
> You are offered a bet where you will pay $15 if the coin is tails, but
> win $10 if it is heads. If P(H) is 2/3, you might want to take the bet.
>
> The resolution is to look at when the payment is made. If it is after
> you are duplicated, and each instance of your duplicates receives the
> $10, then it is a good bet.

I don't see that this is necessarily true. At time t0, I believe that at
time t3, I will *be* only one person. If the coin lands heads, I will
still only get $10. Why should I care that a copy of me is walking
around with $10? Again, it comes down to subjective probabilities. I
will only bet if I believe that my probability of "becoming" one of the
mes that has seen heads is greater than that of me becoming the one that
has seen tails.
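
Just to spell out the arithmetic that hangs on those subjective
probabilities (my own sketch, using Hal's stakes of +$10 on heads
and -$15 on tails):

    With P(H) = 2/3:  (2/3)(+$10) + (1/3)(-$15) = +$5/3   -> take it
    With P(H) = 1/2:  (1/2)(+$10) + (1/2)(-$15) = -$5/2   -> decline

Everything turns on which of those probabilities I should assign to
"becoming" one of the heads-rememberers.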


> If the payment is made before duplication
> (but after the coin flip) then your head-observing duplicates don't
> actually end up with $10 each, but rather $5. The duplicating machine
> does not duplicate money, or any of your material possessions. In fact,
> being duplicated in effect cuts your material possessions in half.
> So the real choice is between $5 with probability 2/3 and -$15 with
> probability 1/3, not a good bet.
>
> An interesting variant though is to introduce nonmaterial rewards.
> They used to have kissing booths at the fair. For a few dollars you
> would get to kiss a beautiful girl. Suppose that is worth $10 to you.
> Now the bet is offered; you pay $15 on tails but get a kiss on heads.
> The payment will be made before duplication.
>
> In this case I would say that it is a good bet. The memory of the kiss
> gets duplicated with you, and each of your duplicates ends up with a
> memory that is worth $10. So in this case you have P=2/3 of gaining
> something worth $10 and P=1/3 of losing something worth $15, making
> it favorable.

But here again, I believe that at time t3, I'll either have a memory of
being kissed, or I won't. And I need to know the relative probabilities
in order to make the choice.
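
For what it's worth, here is how I read the decision-theory
accounting in Hal's variants, with his numbers taken at face value
(the 2/3, the $15 stake, and the $10 value of a kiss are all his;
the little Python sketch is just my own bookkeeping):

    from fractions import Fraction

    p_heads, p_tails = Fraction(2, 3), Fraction(1, 3)

    # Cash paid before duplication: the machine doesn't copy money,
    # so on Hal's accounting each heads-copy keeps only $5 of the $10.
    ev_cash_before = p_heads * 5 + p_tails * (-15)    # -5/3, bad bet

    # Cash paid after duplication: each heads-copy gets a full $10.
    ev_cash_after = p_heads * 10 + p_tails * (-15)    # +5/3, good bet

    # The kiss: the memory is copied with you, worth $10 to each copy.
    ev_kiss = p_heads * 10 + p_tails * (-15)          # +5/3, good bet

    print(ev_cash_before, ev_cash_after, ev_kiss)

My complaint isn't with that arithmetic; it's with where the 2/3
comes from, since I still think I need a subjective probability of
*being* the kissed copy before any of those sums mean anything to me.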

 
> > Note that one need not bring MWI into this at all. The only big
> > assumption is the existence of a copy machine. Instead of MWI, one
> > can think of the identical experiment being carried out on an
> > ensemble of, say, 100 hapless souls: Albert, Bernard, Caroline, etc.
> > At time t1, some number close to 50 will have seen heads. At time
> > t2, there will be 150 people, 100 of whom remember seeing heads,
> > and 50 of whom remember seeing tails. From a bird perspective,
> > if you picked any person at random from this group, the chance that
> > they'll have seen heads is 2/3.
>
> One difference though is that the MWI duplicates all of your material
> possessions, unlike the copy machine. That changes the answer I would
> give, as in the kissing example above. So it is not clear to me that
> copying presents issues involving subjective probabilities in the same
> way that an MWI model does.
>
> Hal
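
On that last point, the way I read the difference is just in what
happens to the stake (my paraphrase of Hal's accounting, not his
words):

    Copy machine:  $10 received before duplication -> in effect $5
                   per copy, since money isn't duplicated
    MWI:           $10 received before the split   -> $10 in each
                   branch, since possessions branch along with you

So the copying case and the MWI case really can come apart as bets,
whatever one decides about the subjective probabilities themselves.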

-- 
Chris Maloney
http://www.chrismaloney.com
"Donuts are so sweet and tasty."
-- Homer Simpson