Brian Tenneson wrote:
>
> Thanks for your reply. I have a lot to say, so let me try to ration
> my breath, as it were.
>
> 1. It is nice to hear a human say this is uncharted territory.
> .
> .
> I think my main improvement, while not really coming close to
> answering my question, was changing the goal from trying to prove
> that Russell's Theorem is not always true to asking the question "Is
> Russell's Theorem true in all logics?" A bonus is that there now
> seems to be a theoretical-physics motivation, by way of the MUH, for
> answering this question.
>
This is an important task. As I mentioned, the direction of
conceptual progress is 'towards maximum generalization' -- even
absolute generalization, if you will. An encompassing single
notion, or limited group of notions, that implies 'all else'.
Simplest principle(s). Matching the sensibility connected
with a 'theory of everything'. Simplest qualia.
> 3. On that note, Physics/Philosophy is actually what inspired me to
> go in this direction. Back when this idea of trying to find a
> consistent universal set theory occurred to me, I was mainly trying
> to answer an intended-to-be-serious argument against the existence
> of the universe.
>
> I was stunned at the notion that someone was trying to prove the
> universe does not exist. I think they were asserting some form of
> solipsism.
>
> In a nutshell, here was their argument. My opinion is that it is not
> at all formal, but it is very clever and probably persuasive; still,
> ultimately it is like the many clever "proofs" that 1=2 and such.
> It's just going to be convincing to those who aren't vigorously
> attacking the argument, which I soon did.
>
> <begin their argument for the non-existence of the universe>
> Definition: To contain means <insert something most people would
> accept here>. The notation and word for 'is contained in' is
> is<in.
>
> Thing and exists are undefined or ... acceptably defined only by the
> common intuitive sense of what a thing is, but not formally (in her
> argument).
>
> Definition: the universe (call it U) is a thing that has the property
> that it contains all things, notated by (x) (x is<in U), where x is a
> thing.
This in itself is a problematic conjecture (presumption). So fundamental,
in fact, that no past or current analysis has enunciated the criteria-error.
(The reason for this is illuminated by Benjamin Whorf's linguistics analysis
circa 1936, which, paraphrased, states that, absent experiential recognition,
systemic information self-insulates on itself.) In this case, the presumption
is that perfect quantification is possible; and, on -that- basis, that probability
valuations are designatable (fixed) for all situations and scenarios possible.
That is -not- the 'generalized case'. Those presumptions, on which classical
non-FL math is built, are closely defined and therefore Gödelianly incomplete.
Specifically, there are at least two unconsidered factors in conventional
computation: all possible simple-temporal conditions, and the gross set and
sub-sets of relations that 'exist' when the entire spectrum of simple-temporal
conditions is included.
This situation stems from the mathematical principle "Simplify". Yes, it
helps remove extraneous information-noise and makes certain relationships
clear and identifiable. But it also -removes- from thoughtful consideration
the information resident in, and analytically important about, the total
mathematical environment.
Let me give you a pragmatic example with at least two ramifications
that conventional analysis/presumptions -totally miss-.
Consider the Gaussian mean curve. It has been classically analyzed
to death; all things about it are considered complete/known.
That is a major deficiency/error.
Consider the equation form that produces the standard-deviation
curve. It is known; isolated, independent.
Now consider any 'real events' that produce and mimic/map the curve.
I like to use two, each of which highlights two missing-consideration
factors. First is random test results from some 'standardized'
exam. If you set up criteria for accurate/inaccurate answers,
the resulting spectrum is typically the standard-deviation curve.
The time sequence of the registered answers is left open; only the
net patterned result remains. In other words, the distribution curve
misses two essential input-factors: the reason the testing event
happened, and the time frame of measurement. The testing event is
'factored out' (the causal impetus/energy brought to zero/one), and
the time frame of testing is 'factored out' (brought to zero/one).
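For concreteness, here is a minimal Python sketch of that first
example (the parameters -- a 50-question test, 10,000 examinees, a
0.7 per-question success rate -- are arbitrary choices for
illustration). Notice that nothing in it records *when* any answer
was given; the time frame is silently factored out, exactly as
described above.

    # Minimal sketch: exam scores as sums of Bernoulli trials.
    # The histogram approximates the familiar bell curve, while the
    # timestamps of the answers are never recorded at all.
    import random

    def exam_score(n_questions=50, p_correct=0.7):
        # One examinee: count of correctly answered questions.
        return sum(random.random() < p_correct for _ in range(n_questions))

    scores = [exam_score() for _ in range(10_000)]
    for s in sorted(set(scores)):
        print(f"{s:3d} {'#' * (scores.count(s) // 25)}")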
Second is a pachinko apparatus, with balls falling through a
matrix of pegs. Run enough sample events and you reproduce the
Gaussian mean-distribution curve. But there are at least
two missed factors/presumptions. One is the presumption of
component ordered-relations. And, relatedly, the presumption
of a stable universal impetus-field being present; it is
assumed, taken for granted, and ... 'factored out': the
gravity/gradient field the apparatus resides in.
Take the pachinko apparatus made of wooden pegs. Place the
board non-orthogonal to the gravity gradient field. Run
samples with matched-to-peg-size ball bearings and you get
the traditional result.
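A minimal Python sketch of that baseline run (the 12 peg rows and
5,000 balls are assumed parameters): each peg deflects a ball left or
right with equal probability, so the final bins follow a binomial
distribution, which approximates the Gaussian curve.

    # Minimal Galton-board sketch: the bin index is the number of
    # rightward deflections accumulated over 12 peg rows.
    import random
    from collections import Counter

    def drop_ball(rows=12):
        return sum(random.choice((0, 1)) for _ in range(rows))

    bins = Counter(drop_ball() for _ in range(5_000))
    for b in range(13):
        print(f"bin {b:2d} {'#' * (bins[b] // 20)}")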
Run the sampling again, but use bowling balls. The 'standard
mean' curve is now, and every time, a straight line. This is
an extreme, but ALSO IMPORTANT, limit potential of the
standard deviation. Run the sampling again, using perfectly
elastic/reflective particles, and some runs produce the
scattering patterns seen during the early atom-nucleus
investigations: particles curved or shadowed by their
reaction to encounters with the form or fields of the
atomic nuclei. Run the sampling again with the balls and
pegs matched as originally, but place the board flat on the
ground, orthogonal to the 'motivation' field of earth's gravity;
or place the apparatus in far outer space in the appropriate
'initial configuration for starting a test run'. In both these
environmental situations/orientations -- nothing happens. No
'curve' or results are produced. In other words, absent an
impetus/gradient, no 'standard deviation' profile is produced.
This means that there is a MISSING FACTOR which we discount
in computation work, because we presume SOME gradient or impetus
is 'always present' and need not be considered as existing or
not. It means that for general statement accuracy a 'g' gradient
factor should always be written and stated ahead of the standard
equation form, to prevent its presence and potential impact on
evaluation being missed, dismissed, and left out for those critical
conditions where the 'rate of event' or 'production of event
conditions' is ignored and negligently overlooked.
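Here is a toy Python rendering of that proposed 'g' prefactor. The
construction is my own and not standard notation: the gradient is
modelled as the probability that a ball is driven through the peg
matrix at all, so with g = 0 nothing falls and no curve is produced.

    # Toy sketch: a gradient parameter gates the whole process.
    import random
    from collections import Counter

    def drop_ball(rows=12, g=1.0):
        # Assumed reading: g is the chance the impetus field moves
        # the ball at all; g = 0 means no impetus, hence no curve.
        if random.random() >= g:
            return None
        return sum(random.choice((0, 1)) for _ in range(rows))

    for g in (1.0, 0.5, 0.0):
        landed = Counter(b for b in (drop_ball(g=g) for _ in range(5_000))
                         if b is not None)
        print(f"g={g}: {sum(landed.values())} balls landed")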
So right off the bat, the analysis of statistical potential
IN ALL THINGS is deficient by not correctly including one or
more time parameters (which coincides with the simultaneous
consideration of all states, both when an entity or relation
is identifiably present AND when it/they are NOT). This means
that standard traditional statistical analysis is a SUBSET,
a limit set, of QM statistics, which considers both existential
and neg-existential factors (potentia) .. SIMULTANEOUSLY.
Calculations are already deficient by not counting the null-state
as a unit factor possibility, and a null-state for each and
every non-null option.
When this is done, plain factorial enumeration counts become
insignificant by comparison. When null states are -included- in
the states count, by the time you get to a three-entity non-null
count, the combinatoric options number OVER 100 distinct
'relation states with considered potentials'.
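The enumeration can be set up in more than one way; the Python sketch
below is just one possible reading, offered as an illustration rather
than the definitive count: three entities, each of which may be null
or present, with every ordered pair of *present* entities carrying a
relation that is itself present, absent, or undetermined. Brute-force
counting passes 100 easily.

    # One hypothetical enumeration consistent with "over 100".
    from itertools import product

    def count_states(n_entities=3, relation_values=3):
        total = 0
        for existence in product((False, True), repeat=n_entities):
            k = sum(existence)                # how many are non-null
            ordered_pairs = k * (k - 1)
            total += relation_values ** ordered_pairs
        return total

    print(count_states())  # 760 under this particular reading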
And this is just with re-analysis of -standard-, non-Fuzzy Logic
(Fuzzy Logic being what I prefer to call Zadeh Logic, in respect
and honor of its delineator/designer).
When you open the option parameters beyond 0,1 (which is no
less important than Complexity, which now explores and uses
non-whole-number 'dimensions' in exponents .. 'fractal dimensions'),
you explore the fuller Stochastic space that logic must obligatorily
address and speak to as well.
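A minimal Python sketch of that opening beyond 0,1, using Zadeh's
original min/max/complement connectives over degrees in [0, 1]:

    # Zadeh's fuzzy connectives: truth ranges over [0, 1], not {0, 1}.
    def f_and(a, b): return min(a, b)
    def f_or(a, b):  return max(a, b)
    def f_not(a):    return 1.0 - a

    a = 0.7                    # 'partial-A': A holds to degree 0.7
    print(f_and(a, f_not(a)))  # 0.3 -- A-and-not-A is no longer simply false
    print(f_or(a, f_not(a)))   # 0.7 -- the excluded middle weakens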
.
.
.
.
Consistency and completeness then require re-review in a fuller
and larger context, one that the previous bounded logics were not,
and are not, capable of dealing with.
There is no longer just A and not-A. There is conditional-A, probable-A,
never-A, partial-A, and ... each of these if/when/ever in union with
time parameter(s). I.e., non-Abelian sequencing alternatives of all
factors need to be included as well. (Got a headache yet? :-) )
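The non-Abelian point can be shown in a few lines of Python: Zadeh's
standard 'very' hedge (squaring a membership degree) does not commute
with negation, so the sequencing of the two operations changes the
result.

    # Order-dependence in miniature: very(not A) != not(very A).
    def very(a):  return a * a
    def f_not(a): return 1.0 - a

    a = 0.6
    print(f_not(very(a)))  # not(very A) = 1 - 0.36 = 0.64
    print(very(f_not(a)))  # very(not A) = 0.4**2   = 0.16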
For example, consider the 'options space' and 'options-potentials space'
of the universe at any given moment (this generally ties in with
multiverse concepts, but puts a bit more meat on their bones; and with
critically present errors in entropy analysis as it is currently performed).
At any given moment, scenario events and causal result-states not only
open up and enable future potentials that did not exist moments before;
there is the SIMULTANEOUS extinction, preclusion, closure and PREVENTION
of equally large or larger domains of state-options that can no longer
possibly exist.
Current analytical methods totally ignore such plural unbounded
potentials subsets, especially on a scale that includes cybernetic relations
and transfinite spaces for existence, for alternative existences and states,
and for relations/performance spaces concurrent with 'extancy'.
Re-worded: at any given moment, and depending on which parameters and
which extancy/potential set one uses as criteria for analysis .. some relational
entropies are increasing, while others are proportionally DEcreasing.
Entropy is not monolithic: there are categorical sub-domains, AND there
are local, regional, proportionally changing domains. AND entropy is not
exclusively 'thermodynamic'. Probability-state differentials are evaluable
and patternable .. there is a gradientable sensibility assignable to all
sorts of parameters. Each and every one has its own entropy aspect/gradient,
comparable to the others'.
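A toy Python sketch of that non-monolithic picture (the distributions
are an invented example, nothing more): track the Shannon entropy of
two subsystems as probability mass concentrates in one and spreads in
the other; one entropy falls while the other rises.

    # Two subsystem distributions whose entropies move oppositely.
    from math import log2

    def shannon(p):
        return -sum(x * log2(x) for x in p if x > 0)

    steps = [([0.5, 0.5], [0.9, 0.1]),   # A mixed,         B ordered
             ([0.7, 0.3], [0.7, 0.3]),   # A concentrating, B spreading
             ([0.9, 0.1], [0.5, 0.5])]   # A ordered,       B mixed
    for t, (a, b) in enumerate(steps):
        print(f"t={t}: H(A)={shannon(a):.3f}  H(B)={shannon(b):.3f}")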
These and more are absolutely Zadeh Logic options that standard logic,
computation, physics, and conventional analysis are not built to evaluate,
and they are deficient because of that.
Gödelian incompleteness theorems, when generalized, wholly miss
important information and ignore relational constants that are
superior and universal.
.
.
.
I know I didn't address your 'universe doesn't exist' logic review.
You were presenting, as an example, a viable logic/analysis that is
problematic and that illuminates for you the possibility that logic as
currently practiced harbors inconsistencies and errors, and you want
to explore other possibilities. I understand that. I agree that the
anomaly you identify gave you reason to explore 'something else' and
question the 'what is'.
My above remarks showcased a few of the anomalies -I- recognized and
the conclusions I reached on re-review of ideas/understandings.
I -know- you are on a correct path of thinking/exploring.
Lots of great possibilities are ahead of you.
I found some wonderful things, like how to determine the
volume::surface-area ratio of a sphere of any dimension
without having to do any exotic calculus calculations.
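One way this falls out with nothing more exotic than the power rule
(whether it is the same route I originally took, the reader can
judge): for an n-ball of radius R,

    V_n(R) = c_n R^n,   S_{n-1}(R) = dV_n/dR = n c_n R^{n-1}
    =>  V_n(R) / S_{n-1}(R) = R / n

so the ratio is simply R/n, with the dimensional constant c_n
cancelling entirely.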
I discovered that the Heisenberg Uncertainty principle
is a direct statement of spacetime geometry, a particular
limit equation of relativity options. In other words,
QM has a direct connection with Relativity. (!)
All sorts of new realities are in the math. The uncourageous
and overly habituated practitioners could never discover them.
Good luck on -your- journey of discovery.
Jamie Rose
cc: RR list