Re: being inside a universe

From: Wei Dai <weidai.domain.name.hidden>
Date: Fri, 12 Jul 2002 15:02:37 -0700

On Fri, Jul 12, 2002 at 10:20:16AM -0700, Hal Finney wrote:
> Another very surprising aspect relates to the word "causal"
> in the title. Apparently the world of decision theorists has
> undergone a schism based on some seemingly obscure issues relating
> to Newcomb's Paradox. A sample page describing this paradox is at
> http://members.aol.com/kiekeben/newcomb.html.

I'm sure you've heard the phrase "correlation is not causation". Yet
before the formulation of causal decision theory, there was no way to
express, within the framework of decision theory, that you do not believe
some correlation indicates a causal relationship. In Newcomb's Paradox,
causal decision theory still allows you to take only one box (if you
believe that taking only one box causes more money to be in the box);
however, you're no longer forced to. You can now take both boxes without
being considered irrational. So this is yet another instance of decision
theory being generalized to be usable by more people.
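
To put rough numbers on the difference (the payoffs and the predictor's
accuracy below are my own illustration, nothing from Joyce's book):

# Illustrative Newcomb setup: box A always holds 1000; box B holds
# 1000000 iff the predictor foresaw one-boxing; the predictor is
# assumed right 99% of the time. All numbers are made up.
ACCURACY = 0.99
SMALL, BIG = 1000, 1000000

# Evidential expected utility: your choice is treated as evidence
# about what the predictor put in box B.
eu_ev_one_box = ACCURACY * BIG
eu_ev_two_box = ACCURACY * SMALL + (1 - ACCURACY) * (SMALL + BIG)

# Causal expected utility: the contents of box B are already fixed,
# so both acts are evaluated with the same probability p that the
# million is there.
def eu_causal(p, take_both):
    return p * BIG + (SMALL if take_both else 0)

print(eu_ev_one_box, eu_ev_two_box)                  # ~990000 vs ~11000
print(eu_causal(0.5, False), eu_causal(0.5, True))   # 500000 vs 501000

Evidentially, one-boxing looks much better; causally, two-boxing comes
out ahead by exactly SMALL no matter what value of p you plug in, which
is why the causal decision theorist is permitted to take both boxes.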

If you look at the history of decision theory, you see that it keeps
getting generalized so that more choices can be considered rational.
(See http://www.escribe.com/science/theory/m3675.html.) That
makes more sense if the purpose of decision theory is to justify the use
of probabilities in science, rather than to constrain people's actions by
labeling certain choices irrational. It basically says you can use
probability theory even if your utility function is not linear in
wealth, even if you don't believe in objective probabilities,
even if you don't believe correlation equals causation (and I argue even
if the value of an experience to you depends on what your copies
experience, although I still haven't had time to write down why in
light of the thought experiment I posted earlier in the thread
"self-sampling assumption is incorrect").

> I have looked ahead a little bit at this part of the book, and
> unfortunately it appears that the discussion of causality is rather
> abbreviated and mostly brings in some rather complex results from the
> literature. The philosophical literature on causality is quite large
> in itself and probably some familiarity with that would help to see how
> it relates to decision theory.

Perhaps Pearl's book Causality, which Tim mentioned, talks more about
causality itself (although I haven't read it myself yet). I noticed that
Joyce references Pearl's book in one of his papers, so their ideas of
causality are probably quite similar. But I think Joyce's book is
sufficiently self-contained on this topic. I had to read the relevant
sections several times, but I did eventually understand what he means by
causality.

> Overall I am not sure how to relate the ideas in this book to the issues
> we deal with. One connection is that the causal aspect sheds light on
> some of the paradoxes we have discussed, such as the paradox of Adam
> and Eve which Nick Bostrom covers in his thesis. In this situation
> Adam wants a deer to come along so he can eat it, and to arrange this
> he plans never to mate with Eve if that happens. If the deer doesn't
> come along then he will mate with Eve and create the whole human race,
> in which case it is highly unlikely that his particular observer-moments
> would be chosen. So he reasons that from his perspective, since he is
> in fact experiencing his observer-moments, it is likely that no large
> human race will be produced and hence that the deer will come along.
>
> There are various resolutions to this seeming paradox, but based on this
> book I can see it as a Newcomb problem. Adam is giving himself evidence
> that the deer will come, he is not causing the deer to come. So from
> the perspective of causal decision theory, he is wrong to try this plan.

That's a very good point. You should forward that observation to Nick
directly. I think he is no longer following this mailing list very
closely.
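
Just to put some made-up numbers behind your observation (the prior and
the population sizes are arbitrary):

# Toy version of the Adam and Eve case. q_deer is the prior chance a
# deer wanders by; the populations are the total number of observers
# in each outcome. All values are invented for illustration.
q_deer = 0.01
pop_if_deer = 2            # Adam never mates: just Adam and Eve
pop_if_no_deer = 10**10    # Adam mates and the human race exists

# SSA-style evidential update: finding yourself among the earliest
# observers is much more likely if the population stays tiny, so it
# counts as strong evidence for the deer outcome.
likelihood_deer = 1.0 / pop_if_deer
likelihood_no_deer = 1.0 / pop_if_no_deer
p_deer_evidential = (q_deer * likelihood_deer) / (
    q_deer * likelihood_deer + (1 - q_deer) * likelihood_no_deer)

# But Adam's plan has no causal influence on the deer, so the causal
# decision theorist evaluates it with the unchanged prior.
p_deer_causal = q_deer

print(p_deer_evidential)   # very close to 1
print(p_deer_causal)       # 0.01

Exactly as in Newcomb's Paradox, the plan manufactures evidence without
manufacturing a cause, so causal decision theory says it should not move
Adam's decision.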

> However this reasoning does require you to accept the causal prescription
> in all Newcomb problems, which may be a harder nut to swallow than other
> solutions to the Adam and Eve paradox.

If you're not completely convinced that causality needs to be taken into
account in making decisions, then by all means read the rest of the book.

> More directly, the book provides various proposed axioms for rationality
> and choice. I have tried to consider how they might relate to the
> existence of multiple universes, but I don't see much connection.
> It seems that one is still faced with the simple prescription of choosing
> the outcome that maximizes the expected utility, with considerable
> freedom for the utility functions.
>
> One question is whether we can restrict or limit the decisions people make
> in some of the paradoxes of copying. I am going to have some duplicates
> made, and then they will have some experiences, and then some of those
> duplicates get further duplicated, etc. Can we constrain the decisions
> a rational person will make in these experiments? For example, if he
> is given a chance to be duplicated to two people, and one will have
> something good happen and one will have something bad, can we argue that
> he should agree to this based on a 50% mixture of the utilities of the
> two experiences? If we can make these kinds of statements, or even some
> weaker ones, then I think this theory may be helpful in shedding light
> on the issues we have dealt with.

As I argued earlier, I don't think we can make these kinds of statements.
But again, it makes more sense if you think of the purpose of decision
theory as providing meaning for probabilities, rather than restricting
people's decisions. For example, in my recent exchanges with Bruno, I've
been arguing that probabilities related to his concept of first person
indeterminacy are not really meaningful, because they can't be used in
decision theory (at least I haven't figured out how) and do not seem
necessary for it.
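
To illustrate why I don't think the 50% mixture can be forced on a
rational agent (a toy example of my own, not something from the book):

# Two agents facing Hal's duplication gamble: one copy has a good
# experience, the other a bad one. Utilities are made-up numbers.
u_good, u_bad = 10.0, 0.0

# Agent 1 treats the duplication as a fair lottery over which
# experience "he" will have: the 50% mixture Hal describes.
value_mixture = 0.5 * u_good + 0.5 * u_bad

# Agent 2's valuation depends on what all of his copies experience;
# say he cares only about the worst experience any copy has.
value_worst_copy = min(u_good, u_bad)

print(value_mixture, value_worst_copy)   # 5.0 vs 0.0

Both valuations can be part of a consistent preference ordering over
outcomes, so decision theory by itself doesn't seem to single out the
50% mixture as the uniquely rational one.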