Re: Time as a Lattice of Partially-Ordered Causal Events or Moments

From: Hal Finney <hal.domain.name.hidden>
Date: Wed, 4 Sep 2002 10:08:24 -0700

I think on this list we should be willing to seriously consider the
many-worlds interpretation (MWI) of quantum mechanics as the ontology for
our universe. In particular, we should not assume that wave function
collapse is anything more than an illusion caused by decoherence
of formerly interacting components of the universal wave function.
Almost all of the supposed paradoxes of QM go away if you eliminate wave
function collapse.

There are a few objections I am aware of that have been raised
against the MWI. The first is its lack of parsimony in terms of
creating a vast number of universes. We gain some simplification in
the QM formalism but at this seemingly huge expense. The second is its
untestability, although some people have claimed otherwise. And the
third is that it retains what we might call the problem of measure,
that is, explaining why we seem to occupy branches with a high measure
or amplitude, without just adding that as an extra assumption.

The point is, all of these objections apply equally to the more
ambitious multiverse models we consider here. Our multiverse is even
more profligate than the MWI; it is if anything less observable; and
the problem of measure is at least as acute.

If we're willing to seriously consider the prospect that "everything
exists" and specifically "all universes exist", I think it makes most
sense to take the MWI as the model for our own universe, when working
within the multiverse framework. By the metrics we typically use for
universe complexity, basically the number of axioms or the size of a
program to specify the universe, the MWI is in fact simpler and therefore
more probable than the traditional interpretation.
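The program-size comparison can be made concrete with a toy sketch. This is an illustration, not physics: the constant RULES_BITS and the one-bit-per-collapse accounting are hypothetical, chosen only to show why a deterministic universal wave function needs a shorter description than one whose history must also record every collapse outcome.

```python
# Toy illustration of the description-length argument. All sizes are
# hypothetical; the point is only the growth behavior of each model.

RULES_BITS = 1000  # assumed size of the physical laws, in bits


def description_length(num_measurements: int, collapse: bool) -> int:
    """Minimal bits needed to pin down one complete universe history."""
    if collapse:
        # Laws, plus one genuinely random bit per collapse event:
        # the description grows with the number of measurements.
        return RULES_BITS + num_measurements
    # Deterministic MWI-style evolution: the laws alone suffice.
    return RULES_BITS


print(description_length(10**6, collapse=True))   # 1001000
print(description_length(10**6, collapse=False))  # 1000
```

Under a measure that weights shorter programs more heavily, the deterministic description wins, which is the sense in which the MWI comes out "simpler and therefore more probable" above.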

This is one of the things that bothered me in Jurgen Schmidhuber's paper,
Algorithmic Theories of Everything,
http://www.idsia.ch/~juergen/toesv2/toesv2.html.

His "Example 1.1" at http://www.idsia.ch/~juergen/toesv2/node2.html
describes a universe like ours that has wave-function collapse, which
requires the continual creation of a large number of random bits.
He proposes the use of a pseudo-random number generator as the source
of this randomness, and later explores the possibility that this would
imply that we could detect patterns in quantum randomness, and (if I
understood this part right) that quantum computers would not work.
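The "detectable patterns" worry can be illustrated with a deliberately tiny generator. If quantum outcomes were drawn from a pseudo-random generator with a small internal state, the output stream would eventually cycle, and the cycle could be found by simple bookkeeping. The linear congruential parameters below are hypothetical, picked only to keep the state space small enough to exhaust quickly.

```python
# Sketch of why PRNG-sourced "quantum randomness" would carry a
# detectable pattern: a bounded internal state forces a finite period.

def lcg(state: int) -> int:
    """A toy linear congruential generator with a 6-bit state."""
    return (5 * state + 3) % 64


def find_cycle_length(seed: int) -> int:
    """Iterate until a state repeats; return the length of the cycle."""
    seen = {}
    state, step = seed, 0
    while state not in seen:
        seen[state] = step
        state = lcg(state)
        step += 1
    return step - seen[state]


print(find_cycle_length(1))  # 64: the generator visits every state once
```

A real generator hiding behind physics would have a vastly larger state, but the principle is the same: finite state implies a pattern that is in principle observable, which is exactly the kind of consequence Schmidhuber explores.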

IMO this line of analysis is misguided (note, I'm not saying that the rest
of the paper is wrong). None of these perhaps-unlikely observations
is a true prediction of assuming his Speed Prior as the foundation
for a universal computation engine. It would make much more sense,
in terms of his overall approach, to assume that we ourselves live in
a universe whose physics is based on the MWI. Such a universe has a
simpler physical description (so it seems), therefore a smaller program
and higher measure. And it requires no information creation, hence no
need for a random number generator, neither true nor pseudo.

Quantum randomness does not exist in the MWI. It is an illusion caused by
the same effect which Bruno Marchal describes in his thought experiments,
where an observer who is about to enter a duplication device has multiple
possible futures, which he treats as random. If Schmidhuber adopted
this model for the physics of our universe, it would improve the quality
of his predictions.
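The duplication picture can itself be made concrete with a fully deterministic sketch (the framing is mine, not Marchal's exact protocol): each round, every observer splits into two copies, one whose record gains a 0 and one whose record gains a 1. No randomness appears anywhere in the program, yet a typical copy's record is statistically indistinguishable from a fair coin sequence.

```python
# Deterministic duplication: enumerate every observer copy's record
# after a fixed number of splits and check what a typical copy sees.

from itertools import product


def all_histories(rounds: int):
    """Every copy's recorded bit string after `rounds` duplications."""
    return list(product((0, 1), repeat=rounds))


histories = all_histories(10)          # 2**10 = 1024 copies
balanced = sum(1 for h in histories if 3 <= sum(h) <= 7)

print(len(histories))                  # 1024
print(balanced / len(histories))       # ~0.89: most copies see near-50/50
```

Nearly 90% of the copies record between 3 and 7 ones out of 10, so from the inside almost every observer concludes the bits are random, which is the sense in which quantum randomness can be an illusion in a deterministic MWI universe.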

Hal Finney
Received on Wed Sep 04 2002 - 10:10:02 PDT
