hypotheses of everything

From: Jacques M. Mallah <jqm1584.domain.name.hidden>
Date: Mon, 16 Nov 1998 18:32:23 -0500 (EST)

        Hello. Here's my take, as advertised. This is a public forum, so
I hope Prof. Tegmark doesn't steal any more ideas :-)
        BTW, I haven't quite figured out the archive yet. Really, it
ought to be on the web, as text.

        To me, it made sense that any symmetry in math ought to be a
symmetry in nature; I called it the postulate of symmetry. Closely
related was the zero information postulate (ZIP). Of course, it's not
really a new idea that 'maybe every possible thing exists'. The questions
are: What is the precise mathematical statement? Does it really have
no free parameters (ZIP)? What predictions does it make, and do they
match our observations?
        OK, first, it is not enough to say that all mathematical
structures exist. There are infinitely many, and the predictions you
get depend on how often you count members of each type. What is needed
is a way of parametrizing the space of all mathematical structures, and
then arguing that the proper way to count is to assume a uniform
distribution in those parameters. I don't know of such a way, and even
if there is one, the problem is it is not likely to be the only
possible way. So we probably end up with free parameters after all.
        That said, it might still be a promising approach to try to
simplify theories of physics (although not all the way down to ZIP) by
embedding them in some such parameterized ensemble.
        As it turns out, at the time I was looking at this stuff, I did
not realize that causal relations play a critical role in
computationalism. Some of my conclusions needed updating due to that, and
I think the result is interesting.
        I looked at the digital sector, namely the Turing machine
sector, because it is more tractable.
        The most promising parameterization for this sector seems to be
assuming a uniform distribution in the space of Turing machine programs
(i.e., in the laws of physics, as opposed to being uniform in the space
of solutions to those laws, which is another way of parameterizing).
        You might think that the typical program would have infinite
length. That would be bad, since we need a mechanism to make the laws
simple like the ones we know. Fortunately there is a trick: a Turing
machine can encounter a command like 'loop back', and everything after
that command will never be acted upon. So, in fact, the probability of
any finite program of length l will go as 1/2^l, since only the first l
bits are constrained and the bits after them are chosen arbitrarily.
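        Here's a toy Python sketch (my own, purely illustrative) of
that measure: if tapes are filled with uniformly random bits, the
chance that a tape begins with a given l-bit program is 2^-l, since
the bits past the program's end are never read.

    import random

    def estimate_prefix_probability(program, trials=200_000, seed=0):
        """Monte Carlo check that a uniformly random bit tape begins
        with a given l-bit program with probability 2^-l; the bits
        past the 'loop back' point never matter."""
        rng = random.Random(seed)
        l = len(program)
        hits = 0
        for _ in range(trials):
            prefix = [rng.randint(0, 1) for _ in range(l)]
            if prefix == program:
                hits += 1
        return hits / trials

    program = [1, 0, 1]  # a toy 3-bit 'program'
    print(estimate_prefix_probability(program))  # ~ 1/2^3 = 0.125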
        But there's another problem. A program can 'cheat' by, for
example, building a simple but large nerve net and cycling through all
possible initial conditions; some of them are bound to allow
consciousness. The extreme case would be a program which just says 'run
all possible programs'.
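        In case that cheat sounds abstract, here is a toy sketch (my
own code, invented names) of such a 'run everything' program, a
dovetailer: a short driver that interleaves the steps of every
program, so each one gets unboundedly many steps as time goes on.

    def dovetail(make_program, stages=4):
        """Toy dovetailer: make_program(i) yields the steps of
        program i.  At stage n, start program n and advance every
        running program by one step, so each program eventually
        receives unboundedly many steps."""
        running, log = [], []
        for stage in range(stages):
            running.append(iter(make_program(stage)))
            for i, prog in enumerate(running):
                step = next(prog, None)   # None once a program halts
                if step is not None:
                    log.append((stage, i, step))
        return log

    # Example: program i just outputs 0, 1, 2 and halts.
    print(dovetail(lambda i: range(3)))

The point is that the driver itself is short (small l), yet it
implements everything any program implements, only slowed down.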
        However, if we assume all the original programs are running at the
same speed, the thing to do would be to look at the total number of
conscious computations implemented by each as time goes to infinity. (For
more on what I mean by an implementation, which is a nontrivial concept,
see my web page on interpretation of QM.)
        The 'probability' of a program giving rise to our observation
is then (N_c/t) 2^-l, where N_c is the number of conscious
implementations, t is the number of time steps, and N_c/t presumably
approaches a fixed value for large t, which can be arranged by periodic
'restart' commands if needed.
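        As a worked toy example (hypothetical numbers, only to show
how the weight trades program length off against implementation rate):

    def weight(n_conscious, t, program_length):
        """The 'probability' weight of a program: (N_c/t) * 2**-l,
        the asymptotic rate of conscious implementations per time
        step, discounted by the program's prefix measure."""
        return (n_conscious / t) * 2.0 ** (-program_length)

    # A very short program implementing few conscious computations
    # vs. a slightly longer one implementing many:
    print(weight(n_conscious=1,     t=10**6, program_length=100))
    print(weight(n_conscious=10**5, t=10**6, program_length=110))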
        This may suggest that a program which simulates simple
equations, without needing a lot of data for initial conditions (which
would greatly increase l), would be most likely. Darwinian evolution
might be favored, as a way of getting complexity without complicated
initial conditions.
        Could our universe be a computer simulation of such a nature?
Quantum systems require exponential computer time as a function of
simulated time, but the number of implementations might diverge at a
similar rate, so that's not a solid argument against it. A way to
disprove it would be to find a program that maximizes the above
'probability' better than such a simulation does. I used to have
arguments for why it would surely fail, but sorry, they're a bit out of
date. Maybe you guys could help me show it fails. Even if it does not
fail, it is more my idea than Prof. Tegmark's more general and ZIPed
one.
        Anyway, you get the idea. I don't know how the corresponding
treatment for the analog sector might work.

                         - - - - - - -
              Jacques Mallah (jqm1584.domain.name.hidden)
       Graduate Student / Many Worlder / Devil's Advocate
"I know what no one else knows" - 'Runaway Train', Soul Asylum
            My URL: http://pages.nyu.edu/~jqm1584/