Re: Paper+Exercises+Naming Issue

From: Benjamin Udell <budell.domain.name.hidden>
Date: Sun, 15 Jan 2006 13:04:02 -0500

Russell, list,

[Russell]>>> The particular Plenitude I assume (ensemble of all bitstrings) is actually a completely uninteresting place to have a view of (it has precisely zero informational complexity).
[Ben]>> Is this kind of Plenitude (ensemble of all bitstrings) more or less Tegmark's Level IV of all mathematical structures? (I.e., if it's different, does the difference involve a restriction to discrete or finitistic structures or some such?)
[Russell]> It does correspond to Tegmark's level 4, but Tegmark's proposal "All mathematical structures" is rather vague. I have interpreted his proposal as "all finite axiomatic systems". This is in fact a subset of my ensemble (well basically Schmidhuber's ensemble) of all descriptions (since an FAS is a description), yet one can also describe the entire ensemble of descriptions by a finite method (the "dovetailer"), hence one can find the ensemble of all descriptions contained within Tegmark's.

The "dovetailer" keeps sounding like a powerful idea. I do remember that it has often been mentioned here, but somehow I failed to pick up a sense of what it was really about. Was there a message to the Everything-List in which it was explained so that non-experts can understand it? I'm not asking you to track that message (or series of messages) down, but if you or somebody remembers around which month it was, that should be enough for me to find it. Or is there a link to a Webpage with such an exposition?

[Russell]> Note, however that the relationships going both ways do _not_ imply equivalence between the two ensembles. This is described in my paper "Why Occam's Razor", as well as talked about on the everything list.

[Ben]>> ....
>> IV. possibility waves (variational principles)
>> III. probabilities for various outcomes
>> II. information, news, outcomes, events, interactions, phenomena
>> I. evidence of causes/dependencies (dependencies, e.g., emission --> open slit --> hit)

[Russell]> I'm somewhat sceptical of your associations here, but it is possibly because I don't understand what you're getting at. You may need to develop this some more.

I think I may have made it sound more like my own idea than it actually seems to me to be. First, here's the part where I don't think I was going out on a limb:

Level III varies across quantum branchings. Level II varies across times and places along a single quantum branch in such a way that its features come out the same as Level III's features.
Where the experimental setup remains the same, the successive particles emitted collectively make a pattern of hits that corresponds to the probability distribution for particle hits. The pattern consists of variations in hit times and locations along a single quantum branch, such that it reflects the variations across quantum branchings. So there you see the Level III & Level II aspects.
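
Just to make that concrete in a toy way, here is a little sampling sketch -- my own illustration, with a made-up intensity function standing in for whatever the actual interference pattern would be -- in which individual "hits" along a single branch accumulate into a histogram that approaches the underlying probability distribution:

import math
import random

def intensity(x):
    """Toy stand-in for an interference pattern on a screen spanning [-1, 1]."""
    return math.cos(6.0 * x) ** 2 * math.exp(-x * x)

def sample_hit():
    """Rejection-sample one 'hit' location from the toy intensity."""
    while True:
        x = random.uniform(-1.0, 1.0)
        if random.uniform(0.0, 1.0) < intensity(x):
            return x

def accumulate(n_hits=20000, n_bins=20):
    """Successive individual hits build up counts proportional to the distribution."""
    counts = [0] * n_bins
    for _ in range(n_hits):
        x = sample_hit()
        counts[min(int((x + 1.0) / 2.0 * n_bins), n_bins - 1)] += 1
    return counts

if __name__ == "__main__":
    for i, c in enumerate(accumulate()):
        print(f"bin {i:2d}: " + "#" * (c // 100))

Each single run of sample_hit() is like one particle hit; only the accumulated counts show the shape of the distribution.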

At Level I, individual histories are especially important, and what happens is partly attributable to "idiosyncrasies" of one's Level I universe, such that one must study its individual history and explain things by historical causes. I'm not sure whether I've said enough there, so if the following is redundant, I apologize. If there is something arbitrary about a Level I universe's constants and initial conditions, then there is variation, across Level II, among Level I universes or inflationary bubbles. (I'm saying "Level I universe" instead of "Level I multiverse" because you've said that Tegmark's use of the word "multiverse" isn't standard; but if "universe" has a technical sense here that confuses the issue, then I don't mean it in that technical sense.) But whether the issue is constants and initial conditions or something else, if anything at all (Tegmark seems unsure), variation across inflationary bubbles in a Level II universe means that there are aspects of our Level I universe which are part of a pattern of variation, such that our Level I universe is not a representative sample of instances (like the pattern of accumulated particle hits) but instead a single instance (a single particle hit). To establish what these "arbitrary" aspects are, we vary the experimental setup, seek the constants, and infer relationships between patterns of outcomes and the variations among the initial conditions of the various experiments. In doing so we establish the significance of variations across the various evolutions of the possibility waves into which the various experimental setups were factored, and we find that some relationships (or aspects of relationships, the particular quantities involved, etc.) seem arbitrarily to impose themselves -- seemingly arbitrarily set constants of nature, etc. (Again, if the fundamental constants and initial conditions of our Level I universe turn out not to be somewhat arbitrary and not to mark a variation across Level II, then one would have to seek out other things which do mark such variation, though I have no idea what, and I admit that the overall picture seems weakened in that case.)

Now, Level III's quantum branchings represent probabilities. With Level II, the "hits" in their pattern along a single quantum branch are pieces of information quantifiable in terms of their probability before they happened. If one does not have the Level III probability distributions, then the patterns are news of those distributions. But if one does have those probability distributions, then what remains news is their appearance as a "biased coin" (if that's the right metaphor), which we explain by assembling and fitting together the logical puzzle pieces of history and initial conditions, with our Level I universe explained as an instance of Level II variation and not a fully representative sample of it.
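
By "quantifiable in terms of their probability before they happened" I have in mind nothing fancier than the usual surprisal measure -- the textbook formula, not anything peculiar to Level II or III, and the probabilities below are made-up numbers:

import math

def surprisal_bits(p):
    """Information carried by an outcome whose prior probability was p, in bits."""
    return -math.log2(p)

# Made-up prior probabilities for a few outcomes along one branch.
for p in (0.5, 0.1, 0.01):
    print(f"outcome with prior probability {p}: {surprisal_bits(p):.2f} bits of news")

The less probable the hit was beforehand, the more news it carries when it happens.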

Now, here's where I've thought that I was going more out on a limb. I've extended this pattern of correlations backward in order to suppose that there's some sort of association between Level IV and the possibility wave in its evolution. I've noticed that the formalism for this possibility wave involves variational principles, optimizational equations, in crucial ways. It's about least action, mathematical stationary points, etc., not that I understand this stuff at all well. I've noticed that the series "optimization, probability, information, logic" has, among its pure mathematical correlates, respectively, many-to-many relationships (graphs, extremization, etc.), one-to-many relationships (antiderivatives, integrals, measure), algebra and groups (many-to-one relationships and functions), and one-to-one relationships (ordered structures). In other words, though I'm out on a limb in saying this, I didn't need the particle/wave's career in order to formulate that series; it seems to me that it's not a random series but has its own logic. Actually I arrived at it independently of considerations about particle experiments. (Well, "optimization" is huge and not all of it is a deductive mathematical research area -- they experiment with ants and so on. But some of it seems to be. Maybe the others would claim that they're just already doing what Chaitin called on mathematicians to do: slacken the rigor and explore :-) ) Anyway, I tend to think that there's an interesting, not-too-fragile, and, in its way, exhaustive pattern there, and that seems appropriate insofar as we're talking about the whole shebang, one in which, in some sense, everything exists. (I'm not saying that I believe that Tegmark is right or that the idea that "everything exists" is right -- I'm agnostic there.) Interestingly, even apart from these grand cosmologies, the pattern does seem to be there with regard to the stages of the particle's career in an experiment.
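
For what I mean by the variational, "least action" character of the formalism, here is a crude toy -- my own sketch, a discretized free particle in one dimension and nothing more: among paths with fixed endpoints, random wiggles of the interior points only increase the discretized action, which is the sort of stationarity I have in mind.

import random

def action(path, dt=0.1, mass=1.0):
    """Discretized action of a free particle: sum of kinetic-energy terms."""
    return sum(0.5 * mass * ((b - a) / dt) ** 2 * dt
               for a, b in zip(path, path[1:]))

n = 10
straight = [i / n for i in range(n + 1)]   # straight-line path from x=0 to x=1

# Randomly wiggle the interior points (endpoints held fixed) and keep the best.
best_wiggled = min(
    action([x + (random.uniform(-0.05, 0.05) if 0 < i < n else 0.0)
            for i, x in enumerate(straight)])
    for _ in range(1000)
)
print(f"straight-path action: {action(straight):.4f}")
print(f"best wiggled action : {best_wiggled:.4f}  (never below the straight path)")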

And, with regard to Level IV, that's also where I feel out on a limb. I can see, in a layman's vague way, that variations of mathematical structure would involve variations of what count as shortest paths and extrema, and that this would ramify into the possibility wave insofar as it involves optimizational equations. This also may mean that the "biased coin" which I mentioned reflects not only variation across various Level I universes, but across the mathematical structures of Level IV. I tend to think that these two kinds of "bias" would be rather distinct, but I don't know how to think about it. A difference in the mathematical _structure_ of the "coin," as opposed to a difference in its probability distributions (as reflecting quantitative differences, across Level I universes, in relations among constants) would, I guess, be something rather stronger than a mere "bias." A different _structure_ of constants is a whole different "coin"?

I wouldn't know how to modify this in accordance with your version of Level IV; my understanding of it is significantly weaker than my already vague understanding of Tegmark's Level IV.

But I haven't noticed anybody here talking about variational principles or optimizational equations in any connection, much less in relation to Level IV. (While there is an obvious echo of optimization in applying Occam's Razor to Level IV's mathematical structures, this doesn't seem to involve any application of mathematical extremization, variations, Morse theory, etc., so it seems not really the same thing. It's certainly not the only echo between a mode of inference (present instance: surmise, simplest explanation) and a mathematical formalism (extremization, shortest paths, etc.).)
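
To spell out the "echo" I mean: applying Occam's Razor over an ensemble of descriptions is often rendered as weighting each description by two to the minus its length, so that picking the best explanation amounts to a minimization of description length. A toy rendering, with made-up description strings, purely to show the weighting I have in mind:

# Toy Occam weighting: shorter (binary-string) descriptions get larger
# weight 2**(-length), so the "best" explanation is the shortest one.

descriptions = {
    "theory A": "0110",          # made-up description strings
    "theory B": "0110100111",
    "theory C": "01",
}

weights = {name: 2.0 ** -len(desc) for name, desc in descriptions.items()}
total = sum(weights.values())

for name in sorted(weights, key=weights.get, reverse=True):
    print(f"{name}: normalized weight {weights[name] / total:.3f}")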

Well, that's quite enough. I wish I could have made it briefer. Thank you for your patience.

Best, Ben Udell