
From: Wei Dai <weidai.domain.name.hidden>

Date: Fri, 3 Apr 1998 12:18:15 -0800

On Tue, Mar 03, 1998 at 10:28:27PM +0000, Nick Bostrom wrote:

> Yes. We haven't defined conditional probabilities; my statement above
> would mean that conditional probabilities default to the probability
> of the conditional event. It would be nicer if the theory could be
> extended to deal with conditional probabilities also.
>
> In order to do this, I think we have to be careful to avoid
> ambiguity when using the term "I" at different times. Perhaps that
> can be done by operationalizing the statements leading to Dai's
> paradox roughly as follows:
>
> A1. At time 1 I will observe heads with probability 1/2.
>
> "If I have my eyes closed, and somebody tells me it's time 1, then I
> should expect with probability 1/2 that when I open my eyes I will
> see heads."
>
> A2. If I observe heads at time 1, at time 2 I will observe heads with
> probability 1.
>
> Can similarly be interpreted as meaning roughly: "If I have my eyes
> closed, and somebody tells me it is time 2, and I remember having
> observed heads at time 1, then I should believe with probability 1
> that when I open my eyes I will see heads."
>
> A3. At time 2 I will observe heads with probability 2/3.
>
> "If I have my eyes closed, and I'm told it's time 2, then I should
> expect with probability 2/3 that when I open my eyes I will see
> heads."
>
> This would seem to block the contradiction that follows if A1-3 are
> taken at face value. Each translation captures the gist
> of the corresponding A-statement. And I think it should be possible
> to come up with similar translations for other probabilistic
> sentences we might want to consider. Is that a way out of this
> paradox?
It's a way out, but doesn't it also render the AUH useless? If it can
only generate predictions of the form "If I had my eyes closed and I'm
told it's time x, ..." what good would it be?

Now I think the problem is not really with the AUH but with decision
theory. As I understand it now, the relationship between decision
theory, probability, and physics is like this: the likelihood of
various physical theories is determined by applying probability
theory, especially Bayes' rule, and the justification for applying
probability theory comes from the current normative decision theory.
Now the problem is that this decision theory presupposes a
metaphysical structure of the world in which discrete events occur in
a single universe, and so it breaks down when faced with a theory
that doesn't fit into this framework. In such theories the whole
notion of the probability of events would not make any sense. I'm
surprised that this problem was not discovered earlier, since in
addition to the AUH, the MWI does not fit the metaphysical framework
of decision theory either. People do not seem to have realized that
under the MWI the traditional justifications for probability theory
do not apply.

So I think the right way to go about solving this problem is to try
to come up with a new normative decision theory that takes the AUH
into account from the beginning.
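[The claimed relationship — likelihoods of physical theories weighed by Bayes' rule — can be sketched numerically. The theories, priors, and likelihood values below are invented for illustration; nothing in the message specifies them.]

```python
# Sketch: weighing physical theories with Bayes' rule,
#   P(theory | data) = P(data | theory) * P(theory) / P(data).
# The theories, priors, and likelihoods below are hypothetical.

priors = {"theory_A": 0.5, "theory_B": 0.5}        # P(theory)
likelihoods = {"theory_A": 0.8, "theory_B": 0.2}   # P(data | theory)

unnormalized = {t: priors[t] * likelihoods[t] for t in priors}
evidence = sum(unnormalized.values())              # P(data)
posteriors = {t: w / evidence for t, w in unnormalized.items()}

# Posterior for theory_A: (0.5 * 0.8) / 0.5 = 0.8
assert abs(posteriors["theory_A"] - 0.8) < 1e-9
assert abs(sum(posteriors.values()) - 1.0) < 1e-9
```

The point of the sketch is only that this machinery presupposes a well-defined probability of observed events — exactly the notion the paragraph above says breaks down under the AUH or the MWI.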

Received on Fri Apr 03 1998 - 12:19:33 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:06 PST