Re: why not high complexity?

From: <juergen.domain.name.hidden>
Date: Wed, 30 May 2001 17:05:31 +0200

> From: Karl Stiefvater <qarl.domain.name.hidden>
> Date: Mon, 28 May 2001 00:11:33 -0500
>
> Max Tegmark suggests that ".. all mathematical structures are a priori
> given equal statistical weight" and Jurgen Schmidhuber counters that
> "there is no way of assigning nonvanishing probability to all
> (infinitely many) mathematical structures"

Not quite - the text in "Algorithmic Theories of Everything" says
"... of assigning nonvanishing _equal_ probability ..."

> and he then goes on (i think) to assign a weighting based upon
> time-complexity.

Most of the weightings discussed in "Algorithmic Theories of Everything"
completely ignore time, except for one: the speed prior S, derived from
the assumption that the universe-generating process is not only
computable but also optimally efficient.
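For concreteness, here is a toy sketch in Python of how a
runtime-discounted weighting behaves. It is my own illustration, not the
paper's construction: I simply let each made-up program p contribute on
the order of 2^-l(p) / t(p), so that among equally short programs the
fast ones dominate.

    # Toy illustration only: each hypothetical "program" is a tuple of
    # (length in bits, running time in steps, output). The weight
    # 2**-length / steps discounts slow programs, mimicking the flavor
    # of a runtime-sensitive prior.
    programs = [
        (5, 10,     "regular"),     # short and fast: dominant weight
        (5, 10**6,  "regular"),     # short but slow: negligible weight
        (20, 10,    "irregular"),   # long: exponentially suppressed
    ]

    S = {}
    for length_bits, steps, output in programs:
        S[output] = S.get(output, 0.0) + 2.0 ** -length_bits / steps

    print(S)   # {'regular': ~0.0031, 'irregular': ~9.5e-08}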

Concerning time-independent weightings: different computing devices
(traditional Turing Machines, Enumerable Output Machines, General Turing
Machines) reflect different degrees of computability (traditional
monotone computability, enumerability, computability in the limit). Each
degree induces its own weighting, but all of them favor short
descriptions on the given device.
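To make the "short descriptions win" point concrete, here is a minimal
sketch (my example, with invented universes and codes) of a prior that
assigns weight 2^-l(d) to a universe with prefix-free binary description
d. By Kraft's inequality such weights sum to at most 1, so they form a
consistent (semi)measure on any of the machine models above.

    # Hypothetical prefix-free descriptions; names and codes invented.
    descriptions = {
        "universe_A": "0",     # 1 bit  -> weight 1/2
        "universe_B": "10",    # 2 bits -> weight 1/4
        "universe_C": "110",   # 3 bits -> weight 1/8
    }

    prior = {u: 2.0 ** -len(d) for u, d in descriptions.items()}
    print(prior)                 # short descriptions dominate
    print(sum(prior.values()))   # <= 1 by Kraft's inequality (here 0.875)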

> i have to say i find Tegmark's argument more persuasive - i can't see
> why the great programmer should be worried about runtime.

Even if He completely ignores runtime, He still cannot assign high
probability to irregular universes with long minimal descriptions.
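In symbols (my paraphrase of the standard argument, not in the original
post): under any such description-length-based prior, the total weight
of a universe x is at most 2^-K(x) times a machine-dependent constant,
where K(x) is the length of x's minimal description. A long minimal
description therefore forces exponentially small probability, whether or
not runtime is counted.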

> furthermore, i feel intuitively that the universes ought to have equal
> weight.

Some intuitively feel the sun revolves around the earth.

> such a sort of probability can be defined, of course, by taking the
> limit as finite subsets approach the full infinite set. as long as we
> get the same answer regardless of the order in which we grow the
> subset, the limit can be said to be defined.

??? - There is no way of assigning equal nonvanishing probability to
infinitely many mathematical structures, each being represented by a
finite set of axioms.
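In symbols: if infinitely many structures each receive the same
probability c, then either c = 0 (vanishing) or the total probability
c + c + c + ... diverges, so no normalized distribution exists. Nor does
the suggested limit construction help: under the uniform distribution on
a finite subset of n structures, each structure gets probability 1/n,
and lim 1/n = 0 as n grows - the "equal weight" in the limit vanishes
for every structure.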

Maybe the intention is to assign measure 2^-n to each history of length n.
That would imply our environment will dissolve into randomness right now,
because almost all continuations of its rather regular history so far
are random. But instead the universe keeps following the nonrandom
traditional laws of physics, thus offering evidence against this measure.
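The counting behind "almost all continuations are random" is elementary:
there are 2^k possible continuations of length k, but fewer than 2^(k-c)
binary descriptions shorter than k-c bits, so under the uniform measure
the probability that the next k bits can be compressed by even c bits is
below 2^-c. For c = 20 that is already less than one in a million.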

> the problem is - such a view predicts that we live in a universe of
> high Kolmogorov complexity - not low complexity.
>
> but i don't see why this is such a surprise - living in such a
> universe, we ought to see events occur which we cannot effectively
> predict. but that is exactly what we do see.

Practical unpredictability due to deterministic chaos, Heisenberg's
uncertainty principle, etc., is very different from true
unpredictability. For instance, despite chaos and quantum uncertainty,
my computer probably will not disappear within the next hour. But in
most possible futures it won't even see the next instant - most
continuations are maximally random and unpredictable. Any irregular
future, however, must have small measure, given the rather harmless
assumption of formal describability or computability in the limit.
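The same counting can be spelled out in a few lines of Python (a sketch
with arbitrary numbers, just to make the bound tangible): at most
2^m - 1 of the 2^n futures of length n can have a description shorter
than m bits, since there are only 2^0 + ... + 2^(m-1) = 2^m - 1 such
descriptions.

    n = 30            # length of the future history in bits (arbitrary)
    c = 10            # demanded compression in bits (arbitrary)
    m = n - c         # allowed description length

    num_futures = 2 ** n
    max_regular = 2 ** m - 1      # at most one string per short description

    # Fraction of futures compressible by at least c bits: below 2**-c.
    print(max_regular / num_futures)   # ~0.000977, i.e. < 0.1%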

http://www.idsia.ch/~juergen/everything/html.html
http://www.idsia.ch/~juergen/toesv2/