
From: <juergen.domain.name.hidden>

Date: Wed, 20 Dec 2000 16:32:47 +0100

> From everything-list-request.domain.name.hidden Tue Dec 19 22:34:43 2000
>
> Please explain in what sense FAST is the most efficient way to compute
> everything. I couldn't find a formal definition of this idea. What makes
> FAST the most natural choice for defining S? How about an alternative to
> FAST, which does the same thing except that in phase i it executes 2^i
> instructions of all program prefixes satisfying l(p) <= 2^i? Why not use
> this algorithm instead of FAST?

It's all in Section 6: read 6.1 for the basic idea, and 6.2 to understand
why Levin search and FAST are optimal. FAST computes the n-th bit of each
universe x as quickly as the fastest algorithm for x (save for a constant
factor). Your alternative does not: it gives every prefix the same
per-phase budget, while FAST allots 2^(i-l(p)) steps to prefix p in phase
i, so that each program keeps a constant fraction of the total time.

As a consequence, with only countable time and space requirements,
FAST computes each universe computable in countable time.
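The phase scheme above can be sketched in a few lines of Python. This is a toy illustration of the budgets only, not an interpreter: the function names and the budget-table representation are mine, and the scheme shown is FAST's rule of granting prefix p exactly 2^(i-l(p)) instruction steps in phase i.

```python
# Toy sketch of FAST's phase scheme (budgets only, no real interpreter).
# In phase i, every program prefix p with length l(p) <= i is granted
# 2**(i - l(p)) instruction steps.  Summed over prefixes, phase i costs
# 2**i - 1 steps, so a program of length l that needs t steps finishes
# by phase l + ceil(log2(t)) -- within a constant factor of t.

def fast_phase_budgets(max_phase):
    """Return {phase: {prefix_length: step_budget}} for phases 1..max_phase."""
    return {i: {l: 2 ** (i - l) for l in range(1, i + 1)}
            for i in range(1, max_phase + 1)}

def total_steps(max_phase):
    """Total instruction steps FAST spends through phase max_phase."""
    return sum(sum(b.values()) for b in fast_phase_budgets(max_phase).values())

# In phase 4, a length-1 prefix gets 8 steps, a length-4 prefix gets 1:
print(fast_phase_budgets(4)[4])   # -> {1: 8, 2: 4, 3: 2, 4: 1}
print(total_steps(4))             # -> 26  (1 + 3 + 7 + 15)
```

Note how the budget halves with each extra bit of prefix length: that exponential weighting is what the uniform-budget alternative discards, and with it the constant-factor guarantee.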

> A related issue is what effect changing the model of computation has on S.
> For example, changing from a classical TM to a quantum TM makes some
> strings much faster to compute, so S would seem to depend very much on the
> details of the model of computation. Suppose the "great programmer" had
> access to a computer that has an instruction to duplicate itself and run
> two programs in parallel (and each child computer can also duplicate
> itself), and suppose there's also an instruction to concatenate the
> outputs of the child computers together. Then we can compute everything
> much faster than FAST, by duplicating the computer whenever a new bit is
> read from the program tape. In fact, with this computer the above
> alternative to FAST (which we can call FASTER) only uses 2^i time units in
> phase i.
>
> A speed prior based on FASTER would not lead to the conclusion that any
> apparent randomness is pseudorandom, and if the above model of computation
> is used, it would also satisfy Postulate 6.1.

This is just like the "infinitely accelerating computer" discussed here
earlier. But you might have noticed that the paper does not assume the
existence of machines more powerful than general TMs (quantum physics does
not require them). Duplicating forever leads to uncountable sets, which we
reject for lack of evidence that they are necessary to explain the world -
they make the explanation much more complicated than necessary.

The paper is about what is possible in the constructive realm. All the
results are consequences of constructive definitions of describability.
Nondescribable things are beyond description (Section 8).

Juergen Schmidhuber

P.S.: Version 2.0 is now available (added references, improved readability).

TR IDSIA-20-00, Version 2.0, 20 Dec 2000; 10 theorems, 50 pages, 100 refs

(based on Version 1.0, Nov 2000, http://arXiv.org/abs/quant-ph/0011122)

ftp://ftp.idsia.ch/pub/juergen/toes20.ps.gz (gzipped postscript, 182K)

ftp://ftp.idsia.ch/pub/juergen/toes20.ps (postscript, 478K)

ftp://ftp.idsia.ch/pub/juergen/toes20.tex.gz (gzipped latex, 52K)

ftp://ftp.idsia.ch/pub/juergen/toes20.tex (latex, 155K)

ftp://ftp.idsia.ch/pub/juergen/toes20.dvi.gz (gzipped dvi, 84K)

ftp://ftp.idsia.ch/pub/juergen/toes20.dvi (dvi, 208K)

http://www.idsia.ch/~juergen/onlinepub.html (various formats)

P.S.: I am also seeking two new postdocs; please see the announcement at

http://www.idsia.ch/~juergen/postdocs2001.html

Received on Wed Dec 20 2000 - 07:45:06 PST


This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:07 PST