
From: Hal Finney <hal.domain.name.hidden>

Date: Tue, 4 Nov 2003 09:17:54 -0800

[Note that I take the liberty of replying only to the list, so that
senders of earlier messages in the thread do not receive multiple
copies of my messages.]

David Barrett-Lennard writes:

> An interesting idea.
>
> Where can I read a more comprehensive justification of this
> distribution?

Juergen Schmidhuber, who originated many of these concepts, has a
number of articles on distributions. He is not a big fan of the UD,
but I think it is appropriate for what seems to me the simplest model
of computing, Turing-equivalent machines. He has a page on the UD
at http://www.idsia.ch/~juergen/icmlkolmogorov/node2.html and if you
look around his site you can find much more.

Chaitin's work on Algorithmic Information Theory is related as well,
http://www.cs.auckland.ac.nz/CDMTCS/chaitin/. He works with a measure
of complexity based on program bit length that implicitly points to
the UD. Chaitin's most famous result relates to the constant Omega,
the probability that a random program will halt, where again the programs
are implicitly selected according to the UD.
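To make "probability that a random program halts" concrete, here is a toy
sketch. It is not Chaitin's actual machine: I assume a hypothetical
prefix-free machine whose programs are the strings 1^k 0, with the made-up
rule that such a program halts iff k is even. Omega for any prefix-free
machine is then just the sum of 2^-len(p) over halting programs p.

```python
def toy_halts(k):
    # Hypothetical rule for this toy machine: the program "1"*k + "0"
    # halts iff k is even. (A stand-in for a real halting predicate.)
    return k % 2 == 0

def omega_estimate(max_k):
    # Omega = sum of 2^-len(p) over halting programs p of a prefix-free
    # machine. Our programs 1^k 0 have length k+1 and are prefix-free
    # by construction, so the sum is at most 1 (Kraft inequality).
    return sum(2.0 ** -(k + 1) for k in range(max_k + 1) if toy_halts(k))

print(omega_estimate(50))  # converges toward 2/3 for this toy machine
```

The partial sums converge quickly here; for a real universal machine the
halting predicate is undecidable, which is exactly why Omega is only
enumerable from below, never computable.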

> If a number of programs are isomorphic the inhabitants naturally won't
> know the difference. As to whether we call this one program or lots of
> programs seems to be a question of taste and IMO shows that "probability
> calculations" are only relative to how one wants to define equivalence
> classes of programs.

The idea is that if all program strings are equally probable (admittedly
a problematic assumption, given that programs can be infinitely long),
then that implicitly leads to the universal distribution as a measure.
This solves the problem you raised (about why the universe is lawful)
in a very natural way.
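A small sketch of how equiprobable program bits induce the measure. The
machine below is hypothetical (an arbitrary prefix-free mapping from
programs to outputs, invented for illustration): if each program bit is an
independent fair coin flip, program p is drawn with probability 2^-len(p),
and an output's weight is the sum over all programs that produce it.

```python
from collections import defaultdict

# Hypothetical prefix-free toy machine: program string -> output.
toy_machine = {
    "0":   "A",   # a 1-bit program for A
    "10":  "A",   # a longer program for the same output
    "110": "B",
    "111": "C",
}

# Each program p is selected with probability 2^-len(p); summing over
# programs that produce a given output gives that output's measure.
measure = defaultdict(float)
for program, output in toy_machine.items():
    measure[output] += 2.0 ** -len(program)

print(dict(measure))  # A gets 0.75, B and C get 0.125 each
```

Outputs with short programs dominate the measure, which is the sense in
which uniform program strings "naturally" favor simple, lawful outputs.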

> I would expect that the probability distribution will depend on the way
> in which we choose to express, and enumerate our programs. Eg with one
> instruction set, infinite loops or early exits may occur often - so that
> there is a tendency for simplistic programs. On the other hand, an
> alternative instruction set and enumeration strategy may lead to a
> distribution favoring much longer and more complex programs. Perhaps it
> tends to complicate programs with long sequences of conditional
> assignment instructions to manipulate the program state, without risking
> early exit. Importantly such "tampering" doesn't yield a program that is
> isomorphic to a simple one. We seem to have a vast number of
> complicated programs that aren't reducible to simpler versions. This
> seems to be at odds with the premise (of bits that are never executed)
> behind the Universal Distribution.

If I understand your concern, this has been shown not to be a problem.
Any universal computer can simulate any other using a prefix of fixed
size. Therefore the sizes of minimal programs for any two universal
computers can differ by only a constant, and the UDs determined by these
two computers will similarly agree to within a constant factor.
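The simulation argument above can be written compactly. Writing K_U(x) for
the length of the shortest U-program producing x, and letting c_UV denote
the length of the fixed prefix by which V simulates U (a constant that
depends on the two machines but not on x):

```latex
% Invariance theorem sketch: c_{UV} is the length of the fixed
% simulation prefix, independent of the string x.
K_V(x) \le K_U(x) + c_{UV}
\quad\Longrightarrow\quad
2^{-K_V(x)} \ge 2^{-c_{UV}} \, 2^{-K_U(x)}
```

Applying the same bound with U and V swapped sandwiches each measure
within a constant factor of the other, which is the precise sense in
which the choice of instruction set does not matter.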

Hal Finney

P.S. In http://www.escribe.com/science/theory/m3666.html I discussed a
way in which the UD might actually give us reason to expect the universe
to be surprisingly unlawful.

Received on Tue Nov 04 2003 - 12:23:08 PST


This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:08 PST