THE INTEGRITY PAPERS NUC Group ceptualinstitute.com

 

Bulletin #11
April 16, 1998

Zero is "Infinite", not Empty

There is a multipronged dilemma facing science in this last decade of the 20th century of the common-era calendar. In guise, it is still the millennia-old issue of coming to terms - literally - with infinity. Leibniz and Newton were the first to successfully challenge the "limits thinking" imposed ever since Zeno. Fourier, Cantor, Brouwer, Gamow, Feynman and other mathematicians - especially Chaitin - carried on the challenge, building formidable equations and perceptions which made the extremes of existence accessible to comprehension and interaction.

Russell and Whitehead, and then Gödel, countered with "proofs" that transfinite induction -- a qualitatively nice tool under most circumstances -- really had quantitative limitations when held to random examination. They thus carried the implication that sentient comprehension would always be trapped in, and as part of, a volume of information, superiorly and surroundingly encased in unexperienced unknown - a black box out there, beyond the reach of induction. This struggle has now boiled to a head.

I began dealing with it myself in the late 1960s and early '70s, using the old terminology of the day, speaking about it as "behavior space" and recognizing that nested interacting levels of physical construction could have the same mathematics applied to them independently - even statistical and calculus representations. At some point, I reasoned, it would become mandatory to syntegrate the domains. Infinite partitioning on the level of spacetime for large masses eventually requires mathematical connection with infinite partitioning on the level of sub-atomic particles - and all else in between. The task is more than finding useful applications of mathematics on several orders of physical reality; it is finding meaning and real mathematical connections leading from any one to any other. Syntegrating pluralities of infinities.

So herein rests the burden and the challenge. Complexity studies have opened the door, and though many recognize the ultimate issue, pragmatic headway is slow. Chaitin (with his Omega function) is one of the first to produce a tangible function - one which I predict will prove to be of as great significance as the calculus itself. Bohm, alone and with Hiley, broached the topic in regard to quantum theory. My own writings eventually congealed in "Understanding the Integral Universe" (1992) and "Integrity: Getting past Gödel" (1995), with the transductive reasoning that for any existential system to be, it must exist in an extended environment -- be it physical, mathematical, or anything else -- which shares some factor of compatibility. Andrei Linde dealt with this possibility by exploring and comparing the functionality/survivability of possible universes which have different physical constants: universes of quite different, extraordinary structure and behaviorings, existing in an interactable region with one another. Such region being "existence space".

The point that I was making at the time was this: relying on simplistic induction -- the stepwise fashion of getting from here to there, whether physically, conceptually or mathematically -- misses the superior quality of existence, which is that no "there" is reachable unless it has a reachableness -- unless it can be attained and interacted/integrated with as future conditions permit. Thus, instead of the transboundary region being ignominious and informationally "null", we know something about that which has yet to be experienced, interacted with, or incorporated as "known". We can explicitly specify a quality of a part of existence which Zeno, Russell/Whitehead, and Gödel expressly deny to sentient comprehension.

Well, we could dwell on this competition of perspectives for quite a while. Or we could move on. And since that's the better course, let's do it.

What is our obligation, now that we've opened up and claimed this new conceptual territory? We use this transfinite induction -- which I shall call mathematical transduction (Gödel-limit leapfrogging) for short -- and explore what it means statistically (behaviorally, in time and in space) and computationally. First, there are two reigning classes of computational rules, abelian and non-abelian. Since we are going for infinities here, we note that for identical membership, non-abelian rules give sets with more information -- more distinctive states -- than abelian rules. One intermediary thesis then becomes: abelian organizations are subsets of non-abelian organizations.
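A toy count makes the information difference concrete (my own illustration, not from the bulletin: string concatenation stands in for a non-abelian rule, while commuting generators reduce each word to a multiset). With 3 generators composed 4 at a time, the non-abelian rule distinguishes 3^4 = 81 states, while the abelian rule collapses them to C(3+4-1, 4) = 15:

    from itertools import product
    from math import comb

    symbols = "abc"   # three generators (hypothetical)
    k = 4             # composition length

    # Non-abelian rule: order matters, so every ordered word is a distinct state.
    non_abelian_states = {"".join(w) for w in product(symbols, repeat=k)}

    # Abelian rule: generators commute, so each word reduces to its multiset,
    # canonicalized here by sorting.
    abelian_states = {"".join(sorted(w)) for w in product(symbols, repeat=k)}

    print(len(non_abelian_states))   # 81
    print(len(abelian_states))       # 15
    assert len(abelian_states) == comb(len(symbols) + k - 1, k)

Identical membership, fewer distinguishable states under the abelian rule: the abelian organization sits inside the non-abelian one.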

We move on.

Physics and mathematics rely dearly on "patterns". When we now say that we are going to construct a mathematical computation structure that can cope with specific information which we readily qualify but cannot in any sense predictively quantify, it seems that we are in deep trouble. But, then again, maybe not. Maybe we should take cues from our predecessors. I would submit to you that this problem already occurred in our history tens of generations ago. It was dealt with by the introduction of the symbol nought (i.e., zero). Prior to that, mathematics dealt only with positive numbers and combinatory symbols for them. The invention of "zero" was to make use of the concept that "there exists in this place the potential for a positive real number to exist".
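A minimal reminder of that placeholder role, in modern positional notation: 105 = 1·10^2 + 0·10^1 + 5·10^0. The zero contributes no quantity of its own, yet without it holding the tens place, "105" would collapse into "15".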

I am taking you now into an exploratory review of the essences of our mathematics - something we rarely think about, and take for granted as we do the beating of our hearts. For 99.9% of our lives we never have to give a thought to the dynamic that keeps our lives going. We do and accomplish other, most extraordinary things precisely because we don't have to concern ourselves with the life-force. And so it is with the current extraordinary edifice of our mathematics. We don't need to look back to question that which we rely on every day. But, I submit, if we newly re-appreciate that foundation, we open doors for ourselves that resolve current dilemmas.

Zero is not just a number. It is a placeholder for potential. It has an identity and existence even as it has no content (by conventional standards, which, too, may change). Sure, we use it as a number. But it has a utility far beyond its numerical one. We already have a tool "representative of" the one we need now to deal effectively with Complexity and infinities of potentials. If we treated only adjacency in regard to options and possibilities, it would be tantamount to a chess player considering only the next immediate set of possible moves, without giving thought to subsequent moves by the opponent, to counter-moves in return, and so on. The option spaces I propose we need - for dealing with the integrative complexity of plural levels of information and entropy in constructively meaningful ways - are the temporally-environmental depths of extended eigenspaces.
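A toy count shows how impoverished adjacency-only thinking is (the numbers here are hypothetical, roughly chess-like): with b options per move, looking one move ahead sees only b states, while a depth-d option space holds b^d of them.

    # Toy option-space count; b and d are hypothetical, roughly chess-like.
    b = 20                    # options available at each step
    d = 4                     # depth of "subsequent moves" considered

    adjacent_only = b         # the next immediate set of moves: 20
    option_space = b ** d     # the depth-4 option space: 160000

    print(adjacent_only, option_space)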

This requires a function/symbol which does not disappear when it goes to zero-content. It is a stochastic function which allows that a complex system, such as one labeled "tournament", can exist whether there are two competitors or two billion, and that such a function can be used as a denominator factor which does not negate the generating-equation (producing a defined "complexity" on the other side of the equality) even if the probability of any member's existence goes to zero. As specified earlier, two recursive communications alone are sufficient to enact a "complexity". The entropy gradient of those communications is the inverse of the entropy gradient of the "complexity".

Now, the only way for a probability value to remain as a placeholder, and not zero-out the whole of the probability group, is for zero to operate as the value one - that is, to be treated as identity. In other words, 0 = 0, but (0) = 1. We already utilize related concepts every day. We introduce "f(x)/f(x)" as a substitute for "one" in those equations where it is appropriate or useful. Now, do we actively perceive that all possible "f(x)/f(x)"s are present when we manipulate reduced, clean forms of equations? No, of course not. Yet if we consider the nature of option spaces, they are there nonetheless. And a related operator already exists in standard mathematics: any expression raised to exponent zero evaluates to one, the multiplicative identity. In that slot we already have, by definition, (0) = 1.
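Stated compactly (a restatement of the two standard identities invoked above, nothing new):

    \[
    \frac{f(x)}{f(x)} = 1 \quad (f(x) \neq 0), \qquad x^{0} = 1 \quad (x \neq 0)
    \]

In both positions, a "content-less" occupant of the slot behaves as the identity rather than as annihilating emptiness.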

The Integrity function of "communication optionspace" (traceable by its entropy values) does not dis-Integrate the existence of a functional complexity on the other side of the "equality" when, as denominator, all membership or probability goes to (0) = 1 -- goes to "one", pure "potential".
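A minimal sketch of that denominator behavior, assuming one hypothetical reading of the (0) = 1 rule (the function names and the product form are my illustration, not the bulletin's formalism): a member whose probability has gone to zero holds its place as the multiplicative identity, so the denominator never annihilates the complexity on the other side of the equality.

    def placeholder(p):
        # (0) = 1 rule: a vanished probability holds its place as identity.
        return 1.0 if p == 0 else p

    def complexity(numerator, member_probs):
        # Denominator built from member probabilities; zero-content members
        # contribute pure "potential" (1.0) instead of zeroing everything out.
        denom = 1.0
        for p in member_probs:
            denom *= placeholder(p)
        return numerator / denom

    # Two members suffice to enact a "complexity"; one member's probability
    # drops to zero, yet the "tournament" survives rather than dividing by zero.
    print(complexity(1.0, [0.5, 0.0]))   # 2.0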

This redefinition -- actually an extended conditional definition -- enables and supports the application of a new class of Power Laws. Power laws are typically the reductive mathematics we find applicable across a wide spectrum of areas, but still restricted to within bounded dynamics. E.g., a given statistical formula can be applied equally well to animal population distributions (speciation), to commodity distribution in a network, to grade scores, to molecules, or to quantum particle space. The suggested new class -- Proportional Power Laws, which include the factors discussed above -- is applicable between levels and bounded domains, which are usually defined by common data units. It is a pattern dealing with the inter-relational organization of systems regardless of the diversity of data units or bit-sizes involved. We now have a computational accessway to realistically deal with all the infinities and transfinites we may encounter, whether in the strictly mathematical sense or in the pragmatic everyday world of common experiences.
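For contrast, the textbook scale-invariance that lets a single power law serve so many bounded domains (a standard identity, included only to anchor the distinction being drawn here):

    \[
    f(x) = c\,x^{k} \;\Longrightarrow\; f(\lambda x) = \lambda^{k}\,f(x)
    \]

The functional form is unchanged under rescaling of the data units, which is why one formula fits speciation counts, network commodities, grade scores, or particle statistics - but only within each bounded domain. The Proportional Power Laws proposed above are meant to reach between such domains.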

 

