This is in response to an old post of Bruno's. I have only just really
started to understand what it was about. I believe my argument applies
to the White Rabbit problem arising from the Universal Dovetailer
Argument. The argument doesn't rely just on the universal prior, but
also on the fact that the compact, lawful universes are surrounded by
a cloud of non-compact universes that are indistinguishable from the
compact ones, yet outnumber the ones that are distinguishable. This is
due to the finiteness of the "concept space" of the SAS (self-aware
substructure). In fact, taking things to the limit (the 2^\aleph_0
computational extensions), the set of White Rabbit universes has
measure zero in the total set of such continuations, even though the
lawful universes are relatively less numerous than the WR universes.
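
To make the measure claim concrete, here is a minimal sketch in the
standard Solomonoff universal-prior notation (my gloss; the symbols
U, p and M below are assumptions of the sketch, not from the thread):

    M(x) \;=\; \sum_{p \,:\, U(p) = x*} 2^{-|p|}

where U is a prefix universal machine and the sum ranges over the
programs p whose output begins with the finite history x. A lawful
history has a short program, so M(x) is roughly 2^{-K(x)} with K(x)
much smaller than |x|; a white-rabbit deviation has to be encoded
literally, so each deviant bit costs roughly a factor of 2 in measure.
Taken to the limit of the 2^\aleph_0 infinite continuations of x, the
distinguishable-rabbit continuations are driven to measure zero, while
the lawful continuations retain almost all of the measure.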
Cheers
>
>
> Russell Standish wrote:
>
> >I will need to reread your thesis and White Rabbit paper before
> >commenting further on your criticisms. However, I'm a little
> >surprised by your following comment, because it seems to me that I
> >solved the WR problem in the first person only. (Not that I started
> >out trying to do this.) Maybe I'm solving a different WR problem :)
>
> Well, your "error" (with comp, IMO) is that you still attach the
> first person to a third-person body/universe. This is why you
> think that the universal prior is enough to solve the WR (white
> rabbit) problem.
> By comp we survive the multiplication of ourselves, and our
> experience doesn't depend on the time at which we are reconstituted.
> That is why, concerning our first-person experiences, we must
> quantify the domain of indeterminacy (due to the "natural"
> multiplication given by universal dovetailing) over the set of all
> relatively consistent extensions, or computational continuations, of
> oneself. There are 2^\aleph_0 such computational continuations.
> In that case the universal prior is not enough, and, as I was saying
> to Wei Dai, the smallness of the "original explanation" is not
> enough. Two other phenomena must occur and be explained: the depth
> of the computation, and the explosion of the number of relative
> parallel universes (computations). Note that MWI + decoherence
> theory solves that problem, because decoherence is essentially a
> (super)entanglement of the object with the environment, including
> the (third-person) observer, and that explains the large number of
> very similar worlds we can expect to be in.
> But with comp, the fact that [MWI + decoherence theory solves that
> problem] does not solve the (WR) problem unless we explicitly derive
> QM from the set of relatively consistent computational extensions of
> oneself.
> It seems Schmidhuber does not realise that QM is a confirmation of
> comp. He seems glad to have a computational interpretation/view of
> MWI. He does not realise that comp by itself implies the
> communicable observable indeterminacy and the relativity of states.
> Now comp a priori implies a much bigger indeterminacy, especially if
> you include the "pure" observable-but-not-communicable indeterminacy,
> so...
> ...we must explain the absence of third-person white rabbits as well
> as the absence of first-person white rabbits.
> The universal prior makes the third-person rabbits disappear from
> the first-person view, and it explains why white rabbits appear
> neither in our sharable collection of experiences nor in our
> textbooks. But the universal prior still doesn't explain why I,
> personally, should not expect to depart slowly and continuously from
> these laws and to observe personal, first-person-only white rabbits.
> Note that with a strong NON-comp axiom you can "solve" that problem
> easily, by attaching (ad hoc) the first person to the "material and
> univocal" third person. (But what would that mean?)
>
>
> Russell Standish also wrote to Alastair:
>
> >I'm glad you [Alastair] understand, because it is a subtle point.
> >Tegmark is
> >embedded in Schmidhuber,
>
> If by "the whole of mathematics" you mean a whole set of consistent
> mathematical theories producible by (consistent) machines, I can
> agree. (The problem is that there are many such sets, including
> orthogonal ones which are mutually exclusive.) So this embedding is
> difficult to make precise.
>
> >but Schmidhuber is but one element of
> >Tegmark.
>
> Yes, sure. And with Church's Thesis this embedding can be made
> precise, independently of the mathematical fuzziness of "Tegmark's
> Whole Math"; i.e., the embedding is formalism-independent.
>
>
> >This naturally implies an infinite recursion: Tegmark's ensemble
> >contains an element which generates the whole ensemble over again,
> >just as Schmidhuber's ensemble contains the "Great Programmer"
> >generating the whole ensemble again.
>
> Yes.
>
> >There don't appear to be any problems with
> >this remarkable fact, though.
>
> I hope so. Nevertheless, that remains to be seen.
>
> Bruno
>
>
----------------------------------------------------------------------------
Dr. Russell Standish Director
High Performance Computing Support Unit,
University of NSW Phone 9385 6967
Sydney 2052 Fax 9385 6965
Australia R.Standish.domain.name.hidden
Room 2075, Red Centre
http://parallel.hpc.unsw.edu.au/rks
----------------------------------------------------------------------------