Re: Tegmark is too "physics-centric"

From: Bruno Marchal <marchal.domain.name.hidden>
Date: Tue, 09 Mar 2004 16:15:44 +0100

Hi Stephen,


> It seems to me that COMP is more general than computationalism, since it
>seems to include certain unfalsifiable postulations that are independent of
>computationalism per se, AR, to be specific.



A can be unfalsifiable, and B can be unfalsifiable, but this does not entail
that A & B is unfalsifiable. Take A = "God exists" and B = "God does not
exist" as an example. I don't know whether AR per se is unfalsifiable, but I do
show that comp is falsifiable. But you have not yet begun to criticize the proof ...


>My own difficulties with
>Bruno's thesis hinge on this postulation. I see it as an avoidance of a
>fundamental difficulty in Foundation research: how to account for the 1st
>person experience of time if one assumes that Existence in itself is
>Time-less.



I would like to stay modest, but this is well explained once you realise
that the simplest Theaetetus definition of knowledge, where

                     (I know p) = (p and (I prove p)),

leads directly to an antisymmetric, branching-time modal theory (S4Grz),
very akin to Brouwer's theory of time/consciousness.
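
To make this a little more concrete (a sketch of mine, not the full argument;
I write B for "I prove", i.e. Gödel's provability predicate, and K for
"I know"):

    \[ K p \;:=\; p \wedge B p \]

Under that reading K obeys the S4 principles

    \[ K p \to p, \qquad K p \to K K p, \qquad K(p \to q) \to (K p \to K q), \]

and, less obviously, the Grzegorczyk axiom

    \[ K(K(p \to K p) \to p) \to p, \]

and the Kripke frames of S4Grz are reflexive, transitive and antisymmetric
orders, which is where the irreversible, branching-time flavour comes from.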



> This is somewhere else that I trip over and fall in my thinking of your
>work, Bruno. Is this "no mechanism can compute the output of any
>self-duplication" a classical version of the "no-cloning" theorem?



Not so directly, but yes, I do think they are related. (But it is a little
outside the scope of what we are presently discussing.)



> Does my comment above apply to how to bridge this gap between emulating a
>brain and emulating the entire universe? If it does, it would seem to
>dramatically increase the computational power requirements of the emulating
>computation, on top of the exponential slowdown.
> One technical question I have about this is: if we assume that the
>emulated universe is finite, what would be the equation showing the required
>computational power of the emulator given an estimate of the total
>algorithmic and/or information content of the universe?


The main idea on which most on this list agree is that the information
content of the "everything" (or the multiverse, or UD*, ...) is zero.
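
To give a rough illustration of why (this is only a toy sketch of mine, a
caricature of the UD, not the actual construction): a schedule that eventually
gives every program in an enumeration arbitrarily many computation steps fits
in a few lines of Python, so the "everything" it generates costs only a
constant amount of information to specify.

    # Toy dovetailing schedule (illustration only): interleave all programs
    # so that program i eventually receives unboundedly many steps.
    from itertools import count, islice

    def dovetail():
        """Yield (program_index, phase) pairs: in phase n, programs 0..n-1
        each get one more step of execution."""
        for phase in count(1):
            for i in range(phase):
                yield (i, phase)

    # First few scheduled steps of the (otherwise endless) schedule:
    print(list(islice(dovetail(), 10)))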


> Additionally, what are we to make of results such as the Kochen-Specker
>theorem, which show that any quantum mechanical system that has more
>than two independent degrees of freedom cannot be completely represented in
>terms of Boolean algebra?


You should say "cannot be completely represented with a logical morphism in
a Boolean algebra". But that does not entail that some other representation
will not work. This is obvious: the theory "quantum mechanics" *is* a Boolean
theory; the Hilbert spaces are classical mathematical objects, etc. Or better,
take the Goldblatt theorem (which plays such a prominent role in my thesis).
It says that (where B is the Brouwersche modal logic):

Quantum Logic proves a formula A iff
the classical modal theory B proves []<>A.

It is like the theorem of Grzegorczyk, which says that Intuitionistic Logic
proves a formula A iff the classical modal theory S4Grz proves []A.

The transformation A =====> []<>A is just not a morphism in the
Kochen & Specker sense.
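
To make that point concrete, here is a small sketch of mine (in Python) of the
two translations. In full, the theorems are stated with a recursive translation
of which the A ==> []<>A (resp. []A) clause is the atomic case; the exact
clauses below are my reconstruction of the standard Goldblatt and
Godel-McKinsey-Tarski translations, so treat the details as assumptions. The
relevant observation is that these maps act on the syntax of formulas and do
not commute with the Boolean connectives (negation in particular picks up a
box), so they are not Boolean-algebra morphisms in the Kochen-Specker sense.

    # Formulas as nested tuples, e.g. ('and', 'p', ('not', 'q')); strings are
    # atoms; modal operators are 'box' and 'dia'.

    def goldblatt(f):
        """Orthologic / quantum-logic formula -> modal formula over B:
        atoms p become []<>p, orthocomplement becomes []not, and
        conjunction is translated componentwise."""
        if isinstance(f, str):                       # atomic proposition
            return ('box', ('dia', f))
        op, *args = f
        if op == 'not':
            return ('box', ('not', goldblatt(args[0])))
        if op == 'and':
            return ('and', goldblatt(args[0]), goldblatt(args[1]))
        raise ValueError("unexpected connective: " + op)

    def godel(f):
        """Intuitionistic formula -> S4 (or S4Grz) formula: atoms and
        implications get boxed, conjunction and disjunction are kept."""
        if isinstance(f, str):                       # atomic proposition
            return ('box', f)
        op, *args = f
        if op in ('and', 'or'):
            return (op, godel(args[0]), godel(args[1]))
        if op == 'imp':
            return ('box', ('imp', godel(args[0]), godel(args[1])))
        if op == 'not':
            return ('box', ('not', godel(args[0])))
        raise ValueError("unexpected connective: " + op)

    # Negation is not sent to Boolean complement: the image of ('not', 'p')
    # is []not[]<>p, which is not the complement of []<>p in general.
    print(goldblatt(('not', 'p')))
    print(godel(('imp', 'p', 'q')))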

The "physical reality" I extract from comp cannot itself be embedded in a
boolean algebra, apparently (I have not yet a totally clean proof of that
statement, but let us say that only a high logical conspiracy would make
boolean the "arithmetical quantum logic").

Bruno


http://iridia.ulb.ac.be/~marchal/