{aside excursion....

The date is February 27, 1994 as I write this small addendum. I am about to take you on a diversionary ceptual side trip. I hope you enjoy the view <g>!

 

THE COMPLEXITY GRADIENT

 

I am about to lead you on another one of my convoluted journeys of thought. This one was prompted by several Prodigy messages between myself and Jim Feigenbaum (physicist) this past week or so (Jan 10, 1994), and a nice note Dan Quinn (author "Ishmael") wrote to me on a copy of issue #5 of his "if" newsletter. Jim started mentioning the possibility of different "classes" of differential equations. I immediately responded that that is exactly what my ¼ and ¾ fluencial-functions represent, even if their enunciation isn't complete yet. He balked at the idea, claiming he hadn't read that in my text yet, and that he had thought of the possibility "independently" of me. (I really don't care, as long as he recognizes the propriety of looking for the next level of differential equations!)

 

Quinn wrote some encouraging words. The math was way beyond him but he was sending me his newsletters and looked forward to my response. That spurred me to try to think of ways to describe the similarities of our ideas without resorting to math. The image that popped into my thoughts was how he described the situation in "Ishmael": how an imagined sentient social-culture of Amoebae would perceive themselves if they were at the "top" of the existing evolutionary scale. They (if they had such capacities) might invent and organize a "story" that saw themselves as the "goal" of creation {as some humans do}, when, if they had a larger frame of reference (ie, expanded cognition of evolution in time and space) to gauge themselves by, they would recognize themselves as only part (albeit an important one) of an ongoing evolving process.

Both Jim and Dan were discussing phenomena relevant to "nested organizations". This made me recall the small reference I included in the #'d sequence of this book. "Sets" are both definably "distinct" and "environment"; a "set" plus its "immediate encompassing environment/bound" creates some next larger set, ad infinitum. Eventually "sets" and "environs" become equally infinite. Except for the posited "original kernel" (whatever that might be).

The original reference in this book focussed on two slightly unequal groups becoming "identical" as they enlarge towards "infinity". What I recognize now, after thinking about Jim's & Dan's remarks, is that that small kernel is probably another crucial key to what drives the creation of "complexity"!

I have to back-track a moment to tell you about contemporary-resident thoughts ("memories") that play a role in my ceptualizing. First come some ideas I have carried since my earliest intuitions led me to explore these things at all. Without being able to quote a clear time and place, I can only remember the intuition that a simplified binomial version of Pascal's triangle might be an important part of the mathematics of physics. Some years afterward (1967 I think) I was toying around with an HP-45 scientific calculator. I plugged in various known constants and took them to several levels of root forms and other combinations. All of a sudden, transcendental #s transformed to recursive sequences. Something akin to 36 x 101 x 1001 x 10001... As I think about it now, I know that I might have stumbled onto something significant, or onto a chance artifact of the algorithms used as part of the calculator's design. Anyway, I wrote some notes, and tucked them away in my jottings and memories.
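For readers who want to see the flavor of such a "recursive sequence", here is a minimal sketch. The starting value 36 comes from the note above; the specific factors 101, 10001, 100000001 are my own illustrative choice (each is one digit longer than the block it copies), since multiplying an n-digit block by 10^n + 1 simply writes the block down twice.

```python
# A sketch of the digit-recursion described above: multiplying an n-digit
# block by 10**n + 1 writes the block down twice.  The factors chosen here
# (101, 10001, 100000001) are illustrative assumptions.
value = 36
for factor in (101, 10001, 100000001):   # 10**2 + 1, 10**4 + 1, 10**8 + 1
    value *= factor
    print(f"x {factor}: {value}")
# prints 3636, then 36363636, then 3636363636363636
```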

Then I found a book in 1972, E. W. Beth's "Foundations of Mathematics", which discussed Brouwer's work in designing Intuitionist Mathematics. Section 137 refers to generalized solutions to the equation (a'x + b' = 0) {a variant of "a+bi", the format of complex numbers}. Beth mentions its important relevance to elementary algebra and geometry per Tarski (1939). He also states that the formula has a unique solution when a'=a+1 and b'=b+1 (a conceptual reference to conditional sequencing that extends towards "infinity"). The solution is:

-101/90 ≤ x ≤ -90/101

My first excitement in 1972 was in seeing a Pascal binomial ("101") show up in this "fundamental" expression. What its presence in the expression also insinuated was that there is a natural asymmetry inherent somewhere in mathematics and existence. I made a mental note of that quality, then put it too in the back of my mind.

The next piece of this puzzle is the paradigmed "asymmetrical dynamic" which is seen as the "force" which creates/drives "complexity". I reasoned that these several balanced asymmetries might be somehow connected. And, they are somehow all tied together with quantum mechanics and statistical probability.

Finally, I would ask you to recall one of the principal expositions of the Ceptualist Perspective. The "Bridge" that surmounts Gödel's premises: of two distinct set forms, each restricted in processing information flow across bound-limitations - much like a 20th century form of Zeno's Paradox. In fact, that is just the fallacy which Gödel's Incompleteness Theorem is built upon. Gödel implicitly accepts that, contrary to actual experience, a "nominatively derived" boundary can never be exceeded. In his sense, "infinity" is pragmatically unreachable and therefore not-exceedable and therefore eternally "incomplete". That "incompleteness" becomes a "boundary" relative to all known or knowable information. But, "incompleteness" does not equate with the negative-complement of what is "knowable". That is, it is inaccurate to designate "infinity/finiteness" as Aristotelian opposites, and therefore "incompleteness" cannot define the qualities associated with "knowable". There are qualities which necessarily have the capacity or nature to exist in any and all systems regardless of whether they are open-infinite or closed-finite. It is those qualities which the Bridge affirms. Qualities which surpass the restrictions of Gödel's tenets. Qualities "superior" to any limitations posed on them.

In opposition to Gödel's notions, the extended premise flowing from the Ceptual Perspective is that any sub-quantum of information necessarily can be evaluated relative to each, to either, or to every domain condition. A datum's presence can reasonably be explored relative to any of these several diverse conditions. Besides giving us a valid platform from which to designate distinct "entropies", it also gives us a formidable conceptual basis to justify the arising of "Complexity", according to standard statistical methodologies!

This early morning (Jan 15,'94) I woke in the dark hours, thoughts bubbling with those images, and more. What event could spontaneously create or generate a state of probability imbalance and asymmetry? What event creates and supports the negentropic gradient that encourages and fosters complexity? The answer felt obvious and extraordinarily simple!

Any temporal event creates the apex of the Binomial Pascal Triangle (ie, "1") - which is the "1" of intuitionistic math; which is the original kernel-set that gets infinitesimally small and insignificant; which is the "motive" asymmetry of Complexity.

Let's call such an event the "first occurrence". In any conceivable system or set which exists as total raw "potential" .... completely un-actualized (to borrow a psychological reference from Maslow) ... that would include all possible universes where the constants such as the speed of light "c", Planck's Constant "h", etc, could take on an infinite possible combination of real values .... then the Total Entropy maximum coincides with the 100% statistical probability of all possible future quantum states.

The moment - the instant - any event occurs, such as the "big-bang", the Shinn-nu Event, the "probability" of all possible quantum states or situational events is no longer symmetrically identical and equal, because such an actualization or event occurrence reduces the residual statistical "potential" pool. Like dipping successive chips into a bowl of salsa and eating the combination - we reduce the quantity of available potential (uneaten salsa) and place some into another category entirely - metabolized food.

The Bohm, Feynman, Gell-Mann concept of "probability of the past" is just this idea. They perceived it in regard to causality and even acausal probabilities which can link events dis-continuously. Alternatively, the "first" and "subsequent" event occurrences create a total events pool which becomes increasingly asymmetrical and imbalanced - restricted and perpetually driven away from "pure total potential". The pool of possible states becomes diminished because actual states have been realized and fixed. Some statistical ranges of events have been actively precluded from existing.

Again, in line with the principal Ceptual Dynamic, several interacting entropies comprise and affect each other here. As Entropy of Distribution goes from "minimum" to "maximum" via dispersion (towards heat death) the result is termed "cooling": a negentropic process-aspect that further reduces the original "potential". As cooling proceeds, the proportional probability of each event in the residual pool of events increases (when held in comparison to the original "total potential"). So, at least 2 event probability ranges function simultaneously. The first, as regards all possible event states. A second, as regards all remaining potential states. In the first regard, the probability of any individual state is vanishingly small .... infinitely "improbable" at any given instant. However, the probability of any individual state increases towards the "predictable" as the set of remaining possibilities gets smaller.
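A minimal numerical sketch of those two ranges (the pool size and the rate of "fixing" are invented) shows the effect: measured against the original total, any single state stays just as improbable, while measured against the residual pool its probability keeps climbing toward the "predictable".

```python
# Toy model of the two probability ranges (pool size and "fixing" rate are
# invented).  Realized events leave the pool; the chance of any remaining
# state, measured against what is still possible, keeps rising, while its
# chance measured against the original total never changes.
TOTAL_POTENTIAL = 1_000_000
pool = TOTAL_POTENTIAL

for step in range(1, 6):
    pool -= pool // 10                       # a fraction of the pool gets "fixed"
    p_vs_original = 1 / TOTAL_POTENTIAL      # first range: all possible states
    p_vs_residual = 1 / pool                 # second range: remaining potential
    print(f"step {step}: residual pool {pool:>9,d}  "
          f"P(state | original) = {p_vs_original:.2e}  "
          f"P(state | residual) = {p_vs_residual:.2e}")
```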

Accretion forces (such as gravity) have the same effect as cooling: slower speeds and/or increased duration of mutual proximity of forces-particles raise the likelihood of any given interaction-event. That is, increased probability enables and encourages Complexity and causality-resembling interactions.

In this scenario the several "entropies" continue developing at the same time. Here, entropy coincides with thermodynamic probability states (and "noise"), evaluated separately (yet interlinked) for 2 or more boundaries - nested per each other - thus the relative differences in what their respective "environments" are will be reflected in different evaluations of their distinct entropies. Past performance thereby affects and determines localized probability pools by placing an upper limit on all "remaining possibilities". We recognize this gradient as "time". All modeling and calculations presume and presuppose an implicit "bound". In fact, particular suppositions may not be identical, and need to be more explicitly designated for any given set of equations.

Temporal existence co-generates the probabilities of events and strengthens their potential occurrence negentropically - complex events become more likely as

1) localization increases relative to proximal duration

2) temperatures reach a cool range that coincides and coordinates several constructive scales: entropies of electron clouds with atomic and molecular size-ranged entropies, for example.

 

The several-domain entropies discussed (eg electron cloud, whole atom interactions, fluid reactions, biological communities, etc) can be compared and coordinated vis a vis time distribution force-domains. Each has an intensity and normative range of function associated with its activity and energy/information paths/linkages. Primarily, this corresponds to what we measure for the 4 fundamental "forces". They can all be cross-correlated to find how, when and where their distinct total and partial entropies overlap and allow mutual activity. Using our previous example as a Newtonian mechanical model, outer shell electrons of separate atoms will interact to participate in building molecules only when their atoms' relative velocities (a combined "entropy" quality per the total phase-space the atoms occupy - when considered together) permit the comparative shell entropies to interact and stabilize.

Another way to envision their correlation is to compare extraneous electron linear momentum with the orbital momenta required or associated with any atom's several electron shells or the functional connecting-shells of bound molecular groups. The overlap of their statistical ranges will establish what we can translate as the range of molecular temperatures within which we can expect to find biologically-complex metabolic reactions. Imagine two ice skaters speeding towards each other. Whether or not they can reach out and grab each other and transfer their individual momenta into an elegantly flowing spin holding on to each other depends upon the structural ability of their hands, arms, muscles and bones to encounter and rechannel the mutual energies as much as it does on the gross direction and momentum vectors each brings. If the energy they bring in can be handled by the energy/information ranges that their appendages can deal with, they will twirl. If the directions of travel or relative speeds are inappropriate relative to their construction and constructive capacities then they will either pass each other by, or collide, leaving the Heisenberg Intersection either intact or modified by the exchange.
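As a toy illustration of what "overlap of statistical ranges" could mean numerically (all the numbers are invented), two Gaussian momentum ranges can be compared; the shared area under both curves stands in for the window within which the skaters can actually join hands.

```python
import math

def gaussian(x, mu, sigma):
    # normal density, standing in for a statistical range of momenta
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def overlap(mu1, s1, mu2, s2, lo=-60.0, hi=60.0, steps=20000):
    # numerically integrate min(f, g): the region the two ranges share
    dx = (hi - lo) / steps
    return sum(min(gaussian(lo + i * dx, mu1, s1),
                   gaussian(lo + i * dx, mu2, s2)) * dx
               for i in range(steps))

# hypothetical "extraneous electron" range vs. "shell" range (numbers invented)
print(f"near match : shared fraction = {overlap(0.0, 3.0, 2.0, 3.0):.2f}")    # they can join and twirl
print(f"poor match : shared fraction = {overlap(0.0, 3.0, 20.0, 3.0):.4f}")   # they pass each other by
```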

 

 

The coordinated small field "entropies" can be modeled. This is a crucial step in the GUT model - the Ceptual Paradigm. Their functional overlaps can be determined, and "net-entropy" gradients discerned.

Past scientific treatments of Entropy in the universe assume that the broad general model is valid everywhere at all times. The image that results is one of the universe de-volving into an inert, energy-poor, frozen heat death. Matter dispersed throughout infinite space, exhausted of momentum and totally diffused.

And yet, a primary negentropic force will still be present: Gravity. Gravity, after a vast amount of duration, will accrete and recombine all residual matter/energy into another ultimate Blackhole.

Negentropy - not entropy - will hold final "obvious" dominion over existence. At that stage, the diversity of conditionally dispersed locales could reasonably be labeled "complexity" and the accretion process a "moving toward" total noise and increasing "probability". Negentropic gravitational attraction will eventually create a universal state of maximum entropy again .... pure potential.

During our present phase of universal processes, complexity dominates as the principal rule of behavior (not the accepted wisdom of "exception to the rule"), because the proportional probability of specific complex events is always increasing. Even under present conditions negentropy is the pervasively predominant dynamic.

 

Given events "preclude" a much larger grouping of probable occurrences than they "enable". So, while event occurrences open up an increasingly larger domain of possible events (which may occur because of the pre-staged conditions), comparatively, the pool of "all else" grows enormously on the order of higher Cantorian infinities.

A related phenomenon exists in asymmetric spin "handedness", confirmed to exist among elementary particles by Dr. C. N. Yang. The universe seems to have more particles spinning clockwise than counterclockwise. The universe seems to exhibit some kind of fundamental asymmetry rather than an even bi-symmetrical 50/50 balance. Without any parameters that could cause a preference for either handedness of particle reactions, the first order of possible causes would be the random conditions that existed at the Shinn-nu Event (the "big-bang"). The ratio of possible handedness reasonably runs the spectrum from 0/100 to 50/50 to 100/0 and every possible ratio in between. At the instant of Shinn-nu, one and only one of those ratios would "fix", and the handedness values of all following reactions would stem from it.

Handedness might be a factor resulting from, and a measure of the quality of, the asymmetry we call "time". It would be the temporal gradient, in coordination with the several other "fixed" constants, which gets expressed as the subtle handedness we detect. The statistical probability of even-handedness is not dominant over any other value-state. So the handedness preference should indicate the variable states existing prior to Shinn-nu. Much like the phase-space qualities that are generated at the instant an orbital body exceeds or is released from its angular momentum configuration. Certain parameters get "fixed" (such as linear direction and velocity) by the cogent-event, even though those parameters were only part of an infinitely large statistical pool of possible states, prior to the cogent-event's occurrence. Handedness values could have been similarly "fixed" at/by the Shinn-nu Event.
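A minimal sketch of that "fixing" (the uniform prior over the 0-to-1 spectrum and the sample size are my assumptions, purely for illustration): one ratio is drawn at the cogent-event, and every subsequent reaction then inherits it.

```python
import random

# At the cogent-event, one handedness ratio out of the whole 0..1 spectrum
# gets "fixed" (uniform prior and trial count are assumptions for illustration).
fixed_ratio = random.random()

# Every subsequent reaction inherits that single fixed parameter.
trials = 100_000
clockwise = sum(random.random() < fixed_ratio for _ in range(trials))

print(f"ratio fixed at the event          : {fixed_ratio:.4f}")
print(f"clockwise fraction observed after : {clockwise / trials:.4f}")
```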

We might, therefore, have the smallest window by which to glimpse conditions pre- the Shinn-nu Event. Consider: if the intensity of all the matter and energy of the universe coalescing into the "ultimate" black-hole is so incredible and overwhelming, then what opposite forces or conditions (possible irregularities and imbalances) might there have existed which overcame those incredible binding forces?

.....}retracement:

Events, Heisenberg Intersections, etc., will perpetually encourage determinism and complexity out of all that is possible. After that, there exist overriding standard patterns of behavior that seem to corral any perturbations, and indeed all systems exhibit them. The problem we face, even when we use the broad statistical freedom of Quantum Mechanics in trying to evaluate and understand these behaviors, is overreliance on closed and rigid mathematical models; incorrect boundaries or misapplications. It is the Potential, the ability to successfully respond to an as yet unspecified set of encountered conditions ...(the enlarged set of degrees of freedom)... which hallmarks dynamic systems. In a sense, some systems require the "unknown" in order to ffunction to their best Potential. Possibly, even adapting, by substantial alteration of operational construction, in order to successfully respond to further sets of conditions.

I wrote a letter back in November of 1992 trying to get a reading of my work by members of the Santa Fe Institute. It was written in answer to the initial negative reply I got in response to several overture letters and phone calls. I think a quote from that letter is appropriate here:

"Let me make a philosophical observation to you. The conceptual course that current science is following, epitomized by the rigid mathematizing of systems' behaviors, is on a collision course with open-ended biological behaviors. It will end up making short-shrift of the qualities that biological systems display ... such as the creation of art, literature and even mathematics itself. By denying that a dynamically open and real organizational relationship exists, by reducing specific "behaviors" to mathematical models and therefore ignoring that type of open-ended dynamic, because there is too much information to handle, "life" will be reduced to only the happenstancial models at hand. It will shut the door on "potential" that may not be able to be envisioned at any particular "here and now"."

"Consider this hypothetical, Dr.--- : Suppose you were a space traveler from a planetary culture in the distant past. Imagine that you and your companions chance upon a planet teeming with extraordinary life. Huge "reptilians" dominate the entire planet. With that set of information & observations - could you (or would you even bother to!) project a scenario that would see the total demise of those dominant creatures, and that 65 million years or more in the future - from where you and your perceptions are - that this very planet would be dominated by life forms that are to be as significant as yourself, descendent from and incredibly dissimilar to a tiny mammalian creature scurrying around in the bushes ... that you may, or may not, even make notice of!"

My point was that an interactive evaluation process that does not leave the door open for adaptive access to alternative information and "boundary" determiners is doomed to failure. The evaluation process must be as dynamically open as the information it anticipates processing, in addition to processing whatever information and goal scenario is at hand. This means that "the tendency towards Integrity maximization" is the only model-dynamic available that can accommodate all possible information and adaptation scenarios.

For example, modern Aristotelian attitudes seem to teach us that if there is a problem, then there is a solution (something incorrect can be made its opposite: something correct). In truth, solutions are not always mathematically simple and we are forced to acknowledge secondary processes - which are also Integrity relevant - that might be as important in achieving some desired goal or condition.

Let's consider for a moment the human psychological concept of "greed", or its more general form: "obsessive/compulsive" behavior (especially in regard to Maslow and his "hierarchy of needs"). Each "need level" has process(es) associated with it, that accomplish some satisfaction-goal. At some point in on-going cognitive information activities, the "process" can become so identified with the "need" itself that, in a counter-intuitive twist, as the need becomes satisfied, the associated process slows, then stops, and, the need-process matrix becomes "incomplete" again(!)...thus regenerating a requirement to re-initiate the "process" part of the matrix. A regenerative self-reinforcing feed-back loop becomes established ... an obsessive/compulsive behavior loop. A person can so fiercely associatively link the techniques of goal attainment with the goal itself, that even after the goal is reached, and the process used is no longer necessary, the cognition of not having a process in place anymore becomes reminiscent of the original condition when things started. Lack of a process was one of the conditions that existed at the outset, and so it implies that striving for the goal hasn't begun yet. If the process is missing the goal might be missing too, and goal seeking behaviors could reasonably start all over again. How the matrix of events criteria is framed can make all the difference in the world. Recognizing when to stop is as important as getting started.
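A minimal sketch of that self-reinforcing loop (the two-flag model and the step count are my own simplification): once "process running" is read as part of the goal itself, satisfaction keeps re-opening the need.

```python
# Toy loop: the agent treats "process running" as part of the goal itself,
# so shutting the process down re-opens the need.  All values invented.
need_satisfied = False
process_running = False

for step in range(6):
    if not need_satisfied:
        process_running = True        # start striving
        need_satisfied = True         # the process does its job
    elif process_running:
        process_running = False       # goal reached, the process stops...
        need_satisfied = False        # ..."no process" reads as "not started yet"
    print(f"step {step}: process={process_running}, satisfied={need_satisfied}")
# the loop never settles: satisfaction keeps regenerating the striving
```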

Additionally, any organism's self-awareness-identification (its intimately bounded stimulus-response rete) with any given need-process matrices ... that is, any organism's self-awareness (to whatever extent available or appropriate) that its own actions affect and control the outcome of a situation it may encounter ... i.e., that to an important degree it controls the processes and outcomes of needs satisfaction ... is relevant to a sense (however subliminal) of control and power in the larger "organism-need-process" matrix. That awareness actually empowers the organism ... increasing its total security and dynamic stability in its environment ... by just that ability: to be able to secure control of events, rather than having to respond to the whimsy of external unknowns.

Two dynamic, distinct, yet interrelated ffunctions - at moments of evaluation operating in potentially counter-entropic ffunction with each other - yet both seeking to maximize some state of system Integrity for the organism or person. Several processes working to achieve behavioral homeostasis. All, very importantly dependent upon which "environmental bound" is being considered or stressed at any moment in the matrix continuum!

Insurance companies and their insurees, for example, ffunction at their best level of stable-dynamic-interaction when there is a significantly large domain of "non-knowledge" about future events or conditions. Elsewise the character and quality of the relationship changes significantly and becomes something quite different. Instead of being Insurance, the relationship becomes Assurance. Policies are supposedly given only to those persons who are still indeterminate as to their eventual health claims.

If science research links genetic propensities to any particular people, then the risk is no longer being spread out and shared among the general population. A disproportionate burden is typically transferred to those at higher risk. This enables an Insurance Carrier to not risk draining the resources of the company (a plus for the Integrity of the company in and of itself) and to continue providing its intended purposes and benefits. However, the benefits no longer fall to the general populace, and target groups may not be in a position to shoulder the financial burden the Companies now say they no longer wish to protect against, since the probability is so great that some health tragedy will definitely occur. When this kind of deterministic knowledge is accumulated in too many different areas of health or natural disaster that is supposed to be "insured against", the system collapses because the Integrity balance for all parties involved is thrown off beyond the Matrix's ability to re-adapt. Then, if the energy flow - in this case, the money - is so intricately woven back into the even larger commercial organization matrix of the society at large, by being "invested" in property or economic production (all requiring a steady flow of monies through all parts of the system at all times) - the potential for total collapse becomes extreme.
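Some toy premium arithmetic (every figure invented) makes the shift concrete: shared ignorance spreads the expected cost over everyone, while deterministic knowledge dumps it onto the identified group.

```python
# Toy premium arithmetic (every figure invented).  Shared ignorance spreads
# the expected cost over everyone; deterministic knowledge shifts it onto the
# identified high-risk group.
population = 1000
high_risk = 50                 # people with an identified propensity
payout = 100_000               # cost of the insured-against event
p_high, p_low = 0.50, 0.01     # assumed claim probabilities

expected_cost = high_risk * p_high * payout + (population - high_risk) * p_low * payout

pooled_premium = expected_cost / population
known_high = p_high * payout   # each person priced on their own, now-known odds
known_low = p_low * payout

print(f"pooled premium, risk shared by all : {pooled_premium:>10,.2f}")
print(f"premium once propensity is 'known' : {known_high:>10,.2f} (high-risk group)")
print(f"                                     {known_low:>10,.2f} (everyone else)")
```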

I know this got a little convoluted, but the bottom line is that the Integrity of a system crucially depends on the entropic distribution of whatever sub-components it is built upon. In social commerce, it is the entropy of money: having it stay in kinetically stable distribution throughout the devised system. And, the ability of the larger system to redistribute currency, as conditions require...as it is contrived to be able to respond to currently not-yet-existing situations.

Systems, organic systems (and "social" systems being no less "organic" than interconnected protoplasm) require constant energy/information interchange in order to survive. If bank interest rates are attractively low, so that borrowers judge it advantageous for themselves and their potential to borrow, yet the lenders restrict their lending only to those who can absolutely guarantee to reimburse the system without any risk of failure (let alone "minimal" risk), that level of required deterministic behavior also strangles the social organism. Terminates its ffunctioning, as surely as some overt hemorrhage where money (or blood) does not get rechannelled back into the "body" of the organism. That is why "balance of trade" is so important in commerce. If there is an imbalance beyond which equitable flow cannot be restored, something drastic will always be the result.

In the overall picture of kinetic systems, what is at work here is the Bridge!... the ffunctional potential of every atom, organism, system or thing to perform at some future unspecified place or time, in response to a possible range of conditions (not yet known or fixed) ... and survive ... potentially enhancing the next responses, and the next, and so forth. Until we can incorporate fluidly open "conditional" factors that affect each system's behaviors we will never comprehend what the universe is all about. To pre-judge that having all "prior" knowledge is the way to predict behaviors in an ultimate clock-work-like deterministic way, as the only acceptable measure for understanding things or having a complete and valid mathematical system, is to miss the point of what the universe IS.

Even a quantized Schroedinger atom has a "ffunctional behavioral range"... beyond which it can no longer handle the energy/information it encounters in its environment...and it will disrupt. "Limits", per se then, should be viewed not so much as walls or barriers to how systems behave, but should be recognized for what they tell us about what goes on "within" those limits...and whether those limits are mutable in any way...and under what other external or internal conditions. When we find any system at a given moment in time, place and energy states...the more important question is to determine the condition it is in relative to its other possible states of existence. Then and only then can we speak reasonably about what will happen or "might happen" next.

I ascribe no "consciousness" to atomic particles. I do promote evaluating them in terms of cognitively-responsive-bounded systems. Even if they are seen as following highly specific mathematical regularity when reactions occur. The primal quality of their construction is as much the potential to be in those possible states, as it is the realization of those states.

The subtle distinction is not that we know where and how things are at a given instant, but, what the latitude of behavior available is... from that moment on! Whereas having more and more information is tantamount to an entropic process that grounds stability in sure predictability (a system may exhibit stable plateau states), its overall ffunctional Stability (capital S) resides in the range of energy/information it can ever process. And that includes "rates" of energy/information transcription. If the rate of information flow is too fast, relative to constructive capacity, there could be a system overload. Similarly, the capacity to hold and store energy or information might also be over-taxed for a given system, before the information could be processed or utilized.

Of prime importance then, are the capacities and structures of the systems themselves. How much can the organization handle? Can sub-processes shunt or redirect or control the flow rates .... into the system, through it and back out again? Can the system "re-organize" itself? Can it grow, adjust, adapt, restrict? Do potential growth states enhance the system? Do they cancel or negate prior ffunctionings? How much effect does a single action have on the whole organization, and does that vary over time or sequence? These are things that Fuzzy logic can handle better than Boolean logic. At the very least, both techniques must be simultaneously applied to achieve the best ends.
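As a minimal sketch of that difference (the capacity figure and the membership ramp are invented), a Boolean rule gives an all-or-nothing verdict on "overload", while a fuzzy membership grade registers how close the flow rate is getting to the system's constructive capacity.

```python
# Boolean vs. fuzzy evaluation of "overload" (capacity and ramp are invented).
CAPACITY = 100.0

def boolean_overloaded(rate):
    return rate > CAPACITY                       # all-or-nothing verdict

def fuzzy_overload_degree(rate, ramp=30.0):
    # degree in [0, 1], rising gradually as the rate approaches capacity
    return max(0.0, min(1.0, (rate - (CAPACITY - ramp)) / (2 * ramp)))

for rate in (60, 85, 100, 115, 140):
    print(f"flow rate {rate:3d}: boolean={str(boolean_overloaded(rate)):5s}  "
          f"fuzzy degree={fuzzy_overload_degree(rate):.2f}")
```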

A mammalian embryo in vivo has a very specific bio-construction that assures its survival in the womb environment. Dynamic balances are simultaneously at work in the mother's body, and at the placental interface, and "inside" the fetus, and in its immediate amniotic fluid environment, that ensure the safety and ongoing stability of the fetus. Eventually, relational biochemical states are reached that cause the expulsion of the baby from the womb (or cause the hatchling to break out of its shell). Conditions that at one moment represented safety and life now become life threatening as certain barrier limits are reached. If the child or young organism stays in the womb too long, both the parent and its potential replacement risk death.

The larger environment and conditional changes then come into active consideration. The baby now must adjust the way in which it processes energy/information from its "environment"...relying upon pre-existing potential to handle the changes. Changes which are unpredictable for their specificity: not knowing if birth would be into a congenial gaseous environment - since it could take place in cold air, hot air, dense sea-level air, Himalayan rarified air, moist humid air, dry desert air, clean air or polluted air. However, the constructive ability to draw stability and growth and maintenance supportive energy from an anticipated range of gaseous oxygen-nitrogen-carbon dioxide possibilities...that latitude of behavior based on non-specific yet anticipated conditions...is what is important to recognize. Being prepared with the capacity to deal with the unknown, things unpredictable from individual past experience alone.

What happens next is something that we can only appreciate if we consider the boundary of an individual to be something beyond their own physical condition. The physical structure of an individual exists only because of the adapted-to experiences of previous generations of individuals - which adaptively survived in the spectrum of oxygen rich environments - and ended up passing along forms and organs and processes and skill potentials with the ability to deal with a whole varying stream of possibilities.

After "birth", each subsequent breath, each subsequent intake of biochemical food nourishment, not only stably maintains the organism, but in some cases alters it in some substantive way, that leads to growth and new levels of stability and different levels of abilities...that need to be holistically evaluated at those new plateaus. Certain levels of complexity are required in a cortex for example before information/energy can be processed in a more complete way. Potential precedes the capacity to act. The net value of any Potential can only be evaluated by survival after the conditions are encountered. And a Potential can never be underestimated ... because we may not be familiar with the range of conditions that will bring out the best available response of that Potential. Or even if the Potential is in its final form, or is only at one plateau in its evolution that might not be fully actualized until a thousand generations have passed and other modifying conditions have been encountered.

Again, it is crucial to acknowledge that the "environmental bound" will affect the import of exactly which Integrity-dynamic takes precedence. Destruction of the placental interface will (at one "organism-need-process" matrix) result in the failure of the infant organism to survive. Destruction of the placental interface at some alternative "organism-need-process" matrix will result in assurance of the survival of the organism.

Dynamic systems may also be in metamorphic transition. If the mathematical formulas cannot model that kind of extremely substantial alteration - specifying the conditions required for alteration and what the resulting changes will be (like predicting a butterfly from a caterpillar), then the models cannot help but fail. Relationships and Potentials may be only nascent and inconsequential in certain Matrices that we evaluate, yet are the "seeds" of fundamentally different crucial behavior determiners in some future Matrix.

Vis a vis conceptual modeling, our mathematical language must change to accommodate Temporality, and must be as organically open to adjustment, change and potential alteration as the physical reality ffunctions it parallels.

To comment again about information, Integrity, and social organic organizations: as social creatures, humans have built crucial cultural institutions based on "ignorance" ... relying on the potential to respond to situations (environmental conditions in the broadest possible sense) as they occur.

Insurance is based upon "what if". So is our social order in general. "What if" I can't take care of or feed myself? We strive for survival together. What if I get sick? There will be resources to help you get better. I have some extra money and I want to make more. So I will invest it in a merchant vessel sailing to the Indies. But it sure would be nice if others joined the venture. My exposure to loss would be minimized "if" anything untoward happened. And others would profit equally "if" things go well.

It is not "sure knowledge" of things that created the social institutions of our world. It was "anticipation" in deference to the unknown. Great armies are built when there are ambitious aims. But great armies have also been built in anticipation of "possibility". Potential created upon Potential. To achieve aims in light of unknown adversaries and what they will or will not do.

Most of the pleasures we seek are in games of chance. Testing our skills, our knowledge resources, against the possible outcome of a given competition, or the surety of the performance of a mechanism like a motor car, or plane, or of the muscular effort of an athlete, or whether a company will produce its product well enough or whether people will want such and such a product and buy it and allow the workers to get paid and the producers to make a profit so that they can share with investors.

A dear and life long friend of mine, Sally Shanbrun, literally "dreamt" this up one night, some 20 years ago. A game called "Line Five". It is deceptively simple and combines the children's games "battleship" and "tic-tac-toe". But that unity, in one fell swoop, analogues information systems in general, and the behavior of organisms in particular: person to person, country to country, business to business, spy to spy.

The game is played on two fields on either side of a vertical common partition (or on separate monitors or linked terminals). Any move that places a marker into a player's location on his or her own board is at the same time setting a marker into the opponent's playing board. Each player can only see their own playing area...not...the opponent's. The game is played with markers color-coded on each end: double blank, red-blank, blue-blank, red-blue. A set of specially color-coded dice are rolled which determine the marker combination that is to be placed during each turn. As markers are placed, each board can only have its own color, or a blank, showing. If "Blue" rolls a red-blank, she must set the marker with red showing on the opponent's board and blank on her own. The net object is to line up 5 of her own color markers in a row, before the opponent does.

The players are constantly sharing and expanding information through a singularly common experience domain and mechanism: the dice roll that fixes a marker. Both are known to be motivated by the same dynamic impetus. Each must act and respond to a partial set of "known data" and rely upon memory to correspond with events. You must not only play "your" game, with only your memory to aid you, but you must play your opponent's game, too. Being able to recall every involved marker placement, and strategies that might be developing on both sides. Sometimes you play markers which only aid your own cause. Sometimes you are required to help your opponent. Sometimes neither, sometimes both.
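For the programmatically inclined, here is a minimal data-structure sketch of that shared-marker mechanic. The board size, the face-pair encoding and the single placement shown are my assumptions for illustration; this is not the published rule set.

```python
import random

# Board size, face-pair encoding and the single placement shown are assumptions.
SIZE = 9
# One physical marker shows a (possibly different) face to each side of the
# partition; each side can only ever show its own colour or a blank.
FACE_PAIRS = [("blank", "blank"), ("red", "blank"), ("blank", "blue"), ("red", "blue")]

board = [[None] * SIZE for _ in range(SIZE)]   # shared cells behind the partition

def roll():
    """The common dice roll that fixes which marker combination is placed."""
    return random.choice(FACE_PAIRS)

def place(row, col, pair):
    board[row][col] = pair                     # one move marks both boards at once

def view(side):
    """What one player can see: only their own colour, or a blank."""
    idx = 0 if side == "red" else 1
    return [[cell[idx] if cell else "." for cell in row] for row in board]

pair = roll()
place(4, 4, pair)
print("marker combination rolled:", pair)
print("red player sees the centre cell as :", view("red")[4][4])
print("blue player sees the centre cell as:", view("blue")[4][4])
```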

"Line Five"© is one of the simplest game models to incorporate all the fundamental components of a dynamic information rete. The Oriental game of "Go" displays more bio-dynamic boundary activity, but doesn't incorporate the degree of statistical uncertainty or strategies that can function at odds with perceptions. In Line Five© the first dozen or so placements are easy enough to remember, but after that, it becomes possible to create false leads and impressions. Life isn't just chess-like straight forward "move-countermove"...the "level playing field". It involves strong dynamic "hidden agendas". Activities that are only partly what they seem. Systems acting in coordination with others, yet fundamentally focussed on their own success. Communication and informational transactions on many interrelated levels.

Choosing another game model, life on this planet is like a table full of poker players (Slobodkin, 1973, personal conversation). Individually the goal is to gather the most resources (power, money, knowledge, etc.) to ensure existence, to enable growth, to be a "Darwinian" winner. But, "winning" is only meaningful when everybody stays in the game. If too many players drop out, the real game ends. For planet Earth....life might cease. The larger strategy is to keep diversity and open options...to keep the energy flowing, to enlarge the information network, to maintain the Integrities of various sub-systems in order to support the Whole.

These are incredible examples of the basic dynamic functions of active systems. All participants seek the same goal. If this were a never ending series of tournaments, the successful "winning" of each game-encounter is not important in and of itself. Rather, it becomes the function of a latitude of behaviors that result in achieving the "next" stable plateau ....which is not an end unto itself...but furthers and allows continued participation in the process. That is the importance of "winning".

As we rush headlong toward treating the planet as a "unitary economy" we run the risk of pushing each economic sub-cyclic system toward entropic extension and reliance over the entire global network. The subtle effect is that we diminish what might be termed the "external economic environment". The situation is akin to a manufacturing company growing so large that it has developed all possible markets in its sphere of ffunction. Two things happen. First, there is a limit to the "growth" behaviors (a la Malthus; Club of Rome). Second, in order to maintain the activity-expectation level that its components might have become dependent on, each sub-economic area starts vying against the others for the limited net-available resources. The holistic system heads towards dysffunctional imbalance beyond its ability to regain stability.

This is why transnational anti-trust mechanisms should be established. Not to promote divisive "nationalisms", but to maintain viable competitive markets. Not all sectors will prosper equally at all times. There will be normal negentropic imbalances. But this is natural. It must be understood that any temporary shift in wealth does not confer "superiority" or "inferiority" to any group of people. "Control" and "power" can shift in the twinkling of an eye. A bit of sentient modesty can go a long way to furthering the success of human social organizations.

The Entropy of the market place distributes wealth and information and creates dynamic kinetic stability by increasing the interconnectedness of peoples and things and money flow. And the fluidity of that entropy...the interest "rates"...of banks, money lending, taxes, earnings, et al...all go toward the Integrity of the economic systems and organisms. All organisms need structural organizational Potential ... to adapt and respond ... at existing levels ... and maybe even improved ones. Over-aggressive formalizing of individual persons' behaviors (reducing the latitude of responsive actions) in deference to the Integrity of the larger systems is another dangerous situation that must be handled with care.

We have built entire planetary social institutions based on this! Part of it is physically driven and subconsciously innate ... bio-metabolism is driven by a zillion interconnected processes all seeking kinetically stable states, while continually ffunctioning in environments shifting energetically to different energy/information configurations... all requiring adaptive maintenance responses.

When we get to the higher cognitive levels of existence, these forces are still at work. We place high value and priority on having the potential to adapt to the unknown...using skills, efforts, or knowledge previously gained...believing that lives and systems are more securely stable - (have higher Integrity) - if there is greater formalistic determinism involved; yet, realizing that Integrity also increases with the variability of degrees of freedom. A large percentage of our existence being unknown and undetermined and unspecified.

"Existence" requires the constructive latitude to encounter and process additional energy, additional information, and respond in ways that enable continuation of the ongoing form-states and processes. In some cases, such as metabolic pathways, to even permit complete destruction of an existing molecule, as long as the continuum of molecular rearrangements allows for its eventual reconstitution - in order to repeat the whole process loop, and thus maintain the Integrity of the continuum process itself. Examples are the Kreb Citric-acid Cycle, or the Cytochrome Transport System that step wise raises the energetics levels of specific molecules (individual electron levels of atomic shells of special molecular configurations), in anticipation of the energy being released as muscular activity. The increased entropy of certain electron shell clouds creates the negentropic storage of energy in certain metabolically useful molecules. That stored energy is then released when the molecules are exposed to environmental energies - either neuronal impulse or a biochemical interaction with some other molecule - that pushes the stored electrons beyond the holding atom's or molecule's capacity to retain them. As the electrons cascade through the sequences of biochemical pathways, work is done. And events are accomplished - such as turning a steering wheel to avert an animal in the road and keep it from being run over and killed.

A wonderfully intricate orchestration of similar and fundamentally simple processes that en masse produce extraordinarily nested layers of behaviors. All of them oriented toward some state of kinetic stability ... Integrity.

This is quite obviously a different approach than those being taken by most investigators. I am proposing that quantum plateau states be examined with broader general dynamic considerations in mind.

If we chart comparative energy states as if they were loci relevant to localized entropy processes, and allow a concept that ... whereas general Entropy diffuses energy through a greater volume of space along with diminishing complexity and heat loss ... in this instance, lower energy plateaus stabilize instead to smaller spatial regions. That is, as energy states diminish per se ... and follow the general concept of entropy: diffusion and diminishment of all components ... the statistical space of where that energy exists actually becomes more localized and better specified (compared to the denotable space of higher energy quantum plateaus). This partial derivative of the energy states therefore behaves in a negentropic manner.
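The textbook hydrogen (Bohr-model) scalings give one concrete instance of that claim, and are offered here only as an analogue of the general argument: the bound-state energy goes as -13.6 eV / n² while the characteristic radius goes as 0.0529 nm x n², so the lower the energy plateau, the smaller and better specified the region the state occupies.

```python
# Hydrogen-like (Bohr) scalings: as the bound energy plateau drops (more
# negative), the characteristic radius shrinks -- the remaining state occupies
# a smaller, better specified region of space.
RYDBERG_EV = 13.6     # eV
BOHR_NM = 0.0529      # nm

for n in (4, 3, 2, 1):
    energy = -RYDBERG_EV / n**2
    radius = BOHR_NM * n**2
    print(f"n={n}: E = {energy:7.2f} eV   characteristic radius = {radius:6.3f} nm")
```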

Molecular and organic complexity really stems from this situation. Electron shell interactions operate predominantly entropically, yet result in crystalline and organic molecules coalescing negentropically. The negentropy of the spatial partial derivative component of electron behavior effects a "joining and localizing" of all Centers-of-Mass present in the interaction. Shells localize into a smaller region relative to a single nucleus; shells also localize to minimal regions when multiple atomic nuclei (atoms) are involved. We effectively see a negentropic building of multi-atom molecules, etc., a process which seems on the face of it to run counter to the general rule of Entropy.

A simple example of this process at work - one that introduces an additional aspect, that "acquisition lowers the energy threshold of a transaction and makes it easier to acquire at the next transaction step" - is Markovnikov Addition. Markovnikov Addition deals with the ease with which Carbon atoms bind with Hydrogen atoms. A Carbon atom can nominally combine with 4 Hydrogen atoms. The force required to bring the atoms together until they cohere is a reasonable value for the first Hydrogen. Less energy is needed to add another to create CH2. Less energy again, to create CH3. And even less is needed to create CH4. An entropic diffusion of certain partial derivatives reduces the resistance to combine.
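A toy Arrhenius calculation (k = A·exp(-Ea/RT)) shows how such a stepwise lowering of thresholds would translate into accelerating ease of acquisition. The activation energies below are invented placeholders that simply decrease from step to step; they illustrate the stated pattern, not measured values for these additions.

```python
import math

# Arrhenius rate k = A * exp(-Ea / (R*T)).  The activation energies below are
# invented placeholders that simply decrease step by step, to illustrate the
# stated pattern; they are not measured values for these additions.
R, T, A = 8.314, 298.0, 1.0
assumed_ea_kj = [("1st H", 100.0), ("2nd H", 90.0), ("3rd H", 80.0), ("4th H", 70.0)]

for label, ea in assumed_ea_kj:
    k = A * math.exp(-ea * 1000 / (R * T))
    print(f"{label}: assumed Ea = {ea:5.1f} kJ/mol  ->  relative rate {k:.2e}")
```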

Importantly, a similar ffunction-relationship appears with Bose quantization on the subatomic level. James Feigenbaum described it to me when we were discussing some possible locations for symmetry breaking: "The matrix element for putting another photon into a given state is proportional to the squareroot of N, where N is the number of photons already in that state". Jim went on to make the point that "the matrix element for a photon to leave that state also goes as the squareroot of N, with a proper accounting of whether N is the final or initial number. ...the matrix element is the same for both the forward and reverse reactions". I asked Jim to take those observations one step further. I suggested that if the matrix element energies are charted versus the values of (N) the resulting graph would show a strong directional bias. That bias would be the intrinsic asymmetry factor he was looking for, and would be tantamount to one of the compelling drift factors that the Integrity Paradigm posits as driving the building of negentropic complexity. For any given (N) the forward and reverse matrix elements might be "equal", but for all (N) a pattern emerges: the Integrity pattern of negentropic complexity.
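In standard terms, adding a quantum to a mode already holding N goes as the square root of N+1 and removing one goes as the square root of N (the "proper accounting" Jim mentions). A small chart of those factors against N shows the monotone growth being called a directional bias here; reading it as an Integrity drift factor is, of course, the interpretation this text is proposing.

```python
import math

# Bose-type enhancement: adding a quantum to a mode holding N goes as sqrt(N+1),
# removing one goes as sqrt(N).  Both grow with N: already-occupied states
# become ever easier to add to -- the "directional bias" described above.
for n in (0, 1, 4, 9, 25, 100):
    add, remove = math.sqrt(n + 1), math.sqrt(n)
    print(f"N = {n:3d}: add factor ~ {add:6.2f}   remove factor ~ {remove:6.2f}")
```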

On the level of human commerce we already understand this phenomenon, though it is talked about as if it were intuitive conventional economic folk-wisdom: "the more you have the more you get", "the rich get richer, and the poor get poorer", "it takes money to make money", "the folks offered the biggest credit lines are the ones who don't need it", ad naus.(!)

Several things are accomplished with this broad holistic perspective. The principal one being that we don't have to postulate any additional extraordinary "force" such as an amorphous extraneous "attractor" (a la complexity mathematics).

 

Dynamic stability - as a concept - becomes more broadly understood as a constantly open variable process rather than a fixed steady state of thermodynamic balance. The "steady-state", in fact, becomes only one mechanism of "stability". It is my perception that we should appreciate "stability" in the broader sense, and attempt to define systems in terms of self-relevant maintenance ... seen as the ability to function within limits of energy transfer. That is, ffunctioning within a latitude of internally and externally oriented energy flows {comparative states}.

That is why I find the work of Dr. Rudolph Marcus, the Nobel winner at Caltech, intriguing, and particularly pertinent here. One of the cogent aspects of the Integrity Paradigm says that successful self-maintenance Integrity survival rests as much upon the rates of transfer of energy and information, as it does on the quantity. That is, a system's innate construction determines how much information it can process, and how fast, before the construction of the system is over-stressed.

A molecule, or a complex particle per se, can encounter just so much external energy before valence bonds are broken or an atomic particle is decomposed and the energy restructured. Either way, the important part of the dynamic is not so much at what energy value the de-composition occurs, but that those values represent a limit beneath which there persists a wide range of normal continuous behaviors and interactions.

As there are acknowledged variations in electron transfer rates, per Dr. Marcus' work... they are derived from aspects of electron shell variances not dissimilar to the subtle van der Waals forces ... this only adds support to how the Integrity paradigm models these regions of information transfer.

I have been saying for over 20 years that at some point in the future, I would like to see bio-metabolic pathway molecules evaluated for the EM fields they present to other metabolic molecules. EM fields will vary around such molecules in the same way that a planet's EM field will vary, depending upon the location and tilt of the internal metallic core as well as the presence, strength or absence of external solar winds and the like. DNA, RNA, and other protein structure binding sites will be understood more clearly. Stereo configurations give us only an instantaneous snapshot rather than a dynamic pattern for how the binding sites work. Metabolic pathways described as a flow-chart of compatible EM region-site states is far superior. Oxidation and reduction are site-specific for each molecule, by form and by EM values. As "process" they establish relative gradients for electron transfers through the molecular soup/environment. A gradient which follows strong quantum nodal values co-organized under local entropies.

It is interesting that Virtual Reality studies are now being done to realize these phenomena. But they are cybernetically linking human interactive experiencing of the EM fields with individual molecular interactions, and skipping over specific schematic value-mapping, even though those are the number-values that they are really manipulating (that data being generated and stored inside the computers doing the modeling). Hopefully, someone is going to coordinate that data some day in a metabolic map.

The only "information" being shared with the people wearing the Virtual Reality interfaces are the "sensations" of what the interacting molecules supposedly "feel" as their EM fields encounter each other....a complex version of what an EM field feels like to someone bringing two North or two South poles of a bar magnet together...the rubbery, fluid, organic energy-presence keeping the bars apart. In this case, researchers are directly "feeling" the blended-value forces of attraction and repulsion, building up a physiological appreciation for how these molecules interact. But only on a one by one reaction basis.

However, there is research proceeding in another area of Chemistry which can be brought to bear here. The new field of Computational Chemistry is using the power of the current generation of computers to do all the quantum calculations necessary to literally "build" new and different molecular configurations that nature and traditional linear sequence chemistry have not. This is an off-shoot extension of the Virtual Reality work, but is more specific. Carbon atoms configured in 18 atom loops, or bi-level graphite, or Buckyballs and Fullerene chains of arbitrary length bounded by complex caps are just some of the examples to date. (They refer to traditional chemistry as zero-dimensional ... building specific linear orientations one at a time, whereas Computational Chemistry is taking into account several quantum-determined degrees of freedom at the same time.)

These physical effects and research accomplishments show the paramount importance of Rudolph Marcus's work. I will refer to, quote from and comment on that work (my remarks in { } brackets) as presented in the English edition of "Angewandte Chemie", vol. 32, no. 8, 1993: Nobel Lectures, "Electron Transfer Reactions in Chemistry: Theory and Experiment".

"In transition state theory, a quasi-equilibrium ... is then calculated with equilibrium statistical mechanics." In 1938, Eugene Wigner "used a classical mechanical description of the reacting system in the many-dimensional space (of coordinates and momenta). Wigner pointed out that the quasi-equilibrium would follow as a dynamical consequence"...depending on limited recrossing of the transition state. {JNR: This is a partial component of gradients that get established.}

"In practice, transition state theory is generalized to include as many coordinates as are needed to describe the reacting system. Further, when the system can tunnel quantum mechanically through the potential energy barrier (the pass) separating the two valleys {stability energies}, ... the method of treating the passage across the transition state region needs, and has received, refinement. (The principle problem encountered here has been the lack of "dynamical separability" of the various motions in the transition state region.)" {JNR: In other words, transition state regions are loosely specified localizable bounds ... with a latitude of energy values that fluctuate in regard to multiple variables that can affect it. Besides the flux relative to center-of-mass separation between atoms, electron cloud responsiveness towards the molecule it initially resides in, etc., the wave-state of an electron that permits tunneling, et al, means that there are several paths ... and energy states of those paths ... through which electron transfer occurs. Each with its own particular relative-entropies gradients.}

However, " a somewhat different picture of the reaction is needed." After studying a 1952 symposium paper by Libby where Libby indicated that a particular reaction studied produced unexpected results because the "environment" was not one congenial to the reaction at hand because there was not enough time for the molecules to re-orient out of the way, Marcus continues: "I realized that fluctuations had to occur in the various nuclear coordinates {JNR: therefore, presenting a different EM "face"}, such as in the orientation coordinates of the individual solvent molecules and indeed in any other coordinates... . With such fluctuations, values of the coordinates could be reached which satisfy both Franck-Condon and energy conservation conditions...". {JNR: The reactants are simultaneously "participant", "environment" & "boundary".}

"The theory proceeded as follows. The potential energy Ur of the entire system, reactants plus solvent, is a function of the many hundreds of relevant coordinates of the system .... which include ... the position and orientation of the individual solvent molecules (and hence their dipole moments, for example), and the vibrational coordinates...particularly those in any inner coordination shell of the reacting ions. ... No longer were there just the two or so important coordinates that were dominant in a reaction." {JNR: Though an atom or molecule is ostensibly spatially designated by the predominant expression of the outer location of the electron cloud/shell, the behavior of an atom or molecule ... every "boundary definable" system ... is affected by information and energy transferences and transcriptions both internal to that bound and external to that bound.}

"Similarly, after the electron transfer, the reacting molecules have the ionic charges appropriate to the reaction products... . These two potential energy surfaces will intersect if the electronic coupling which leads to electron transfer is neglected. For
a system with N coordinates this intersection occurs on an (N-1)
{ JNR: ! } dimensional surface, which then constitutes
in our approximation the transition state of the reaction." { JNR: We currently discuss the universe as n-dimensional, whereas according to my premise to include a zero-th dimension (fluence), it ffunctions as n+1. This is a direct corollary of the above statement.}
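The dimensional counting behind that statement is worth making explicit (a sketch in generic symbols, U_r and U_p for the reactant and product surfaces, rather than Marcus's own notation): the intersection is the set of configurations satisfying the single scalar condition

    U_{r}(q_{1},\dots,q_{N}) \;=\; U_{p}(q_{1},\dots,q_{N}),

and one equation imposed on N coordinates generically leaves an (N-1)-dimensional hypersurface, just as one equation in three coordinates defines an ordinary two-dimensional surface.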

"...electronic coupling, electronic motion, nuclear motion" added in ... require "a rather different approach..."

"...what was then needed was a method of calculating the electrostatic free energy G of this system and its still unknown polarization function Pu(r). I obtained this free energy G by finding a reversible path for reaching this state of the system. ... I was able to find Pu(r) ... and ... G for the transition state. ... and the reaction rate calculated." { JNR: Marcus was able to specify a process rate and a process direction, based upon the differentials of existing quantum and ion values. This included "vibrational" components, l, and center-to-center separation distances.} There is a component Qh, "where Q is the charge transferred ... and h is the activation overpotential, namely, potential difference ... relative to the value it would have if the rate constants for the forward and reverse reactions were equal." .... "When |Qh|< l, most electrons go into or out of quantum states ... that are near the Fermi level." {JNR: There are variations which occur because of high exothermic activity.}

Marcus then introduced a linking assumption: "a linear response approximation in which any hypothetical change in charge of the reactants produces a proportional change in the dielectric polarization of the solvent." He thereby applied a central-limit-theorem argument that gives a better approximation than simple perturbation theory.

The results were quadratic functions, displayable as parabolic free energy plots, each reaction with its own distinct pair of curves. "It was important to use the free energy curves, instead of oversimplified potential energy profiles, because of the large entropy changes {!!!} which occur in many electron transfer cross-reactions..."
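In its simplest one-coordinate reduction (a standard textbook form of Marcus's result, stated here for reference rather than quoted from the lecture), the two parabolas and the barrier at their crossing are

    G_{r}(q) = \tfrac{1}{2}k\,(q - q_{r})^{2}, \qquad
    G_{p}(q) = \Delta G^{0} + \tfrac{1}{2}k\,(q - q_{p})^{2},

    \lambda = \tfrac{1}{2}k\,(q_{p} - q_{r})^{2}, \qquad
    \Delta G^{*} = \frac{(\lambda + \Delta G^{0})^{2}}{4\lambda},

where λ is the reorganization energy and ΔG⁰ the standard free energy of the reaction. The parabolic shape of the curves is what makes the barrier itself quadratic in the driving force.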

Several subtleties became apparent from Marcus's work. One was the "cross-relation", which predicts the rate of a cross-reaction from the two self-exchange rates and the equilibrium constant. Another is that the driving force ΔG⁰ can be either positive or negative, and the activation barrier ΔG* pivots around the point ΔG⁰ = -λ: it vanishes there, and an "inverted region" appears when -ΔG⁰ > λ. In the normal region, the net effect of increasing the driving force is that the free energy barrier ΔG* is decreased. {JNR: Such "free energy barriers" are therefore not fixed and firm. There is a variable latitude of energy levels which still constitute a path region ... but the region is very flexible and dependent upon the EM configurations that are presented by environmental molecules. The propensities for electron transfer are dictated by the internal-oriented configurations relative to external-oriented presentations. All effecting entropy changes and transferences ... in local regions ... and, in the overall, creating transaction paths describable as "entropy gradients".}
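A minimal numerical sketch of that inverted-region behavior (plain Python; the reorganization energy, temperature, and the omission of the pre-exponential factor are my illustrative choices, not values taken from Marcus):

    import math

    K_B = 8.617e-5    # Boltzmann constant, eV/K
    T = 298.0         # temperature, K
    LAM = 0.8         # reorganization energy lambda, eV (illustrative value)

    def marcus_barrier(dG0, lam=LAM):
        """Classical Marcus activation free energy, (lambda + dG0)^2 / (4*lambda)."""
        return (lam + dG0) ** 2 / (4.0 * lam)

    def relative_rate(dG0, lam=LAM, temp=T):
        """Rate relative to the barrierless case, exp(-dG*/kT); prefactor omitted."""
        return math.exp(-marcus_barrier(dG0, lam) / (K_B * temp))

    # Sweep the driving force: the rate rises until -dG0 equals lambda,
    # then falls again -- the "inverted region".
    for dG0 in (0.0, -0.4, -0.8, -1.2, -1.6):
        print(f"dG0 = {dG0:+.1f} eV   dG* = {marcus_barrier(dG0):.3f} eV   "
              f"relative rate = {relative_rate(dG0):.2e}")

The printed rates peak at ΔG⁰ = -λ (zero barrier) and fall off again for still more exothermic reactions, which is exactly the flexible, non-fixed barrier behavior commented on above.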

Marcus continues: "...there are now detailed experimental and theoretical studies in photosynthetic and other protein systems."!!!! He goes on with remarks about the nonpolar nature of protein environments, the forward reactions, the participation of inverted regions, long-range transfer, comparisons of back-transfer which give directional impetus to chained transfers, and small-step stability nodes.

The Integrity Paradigm looked at gross-system behaviors. It deduced that local entropy dynamics could be used to re-interpret the motivation and direction for those behaviors. It even intimated negentropic complexity formation as a consequence of regional, localized entropy. It was a noble and well thought out deduction. Marcus has given it a "presence". Right where it counts, on the atomic level. Right where I always felt it could best be "proved". Its applications elsewhere should be more than obvious now.

Rudolph Marcus's work shows that the universe is indeed an elegant place. Extraordinarily vast in subtlety and interconnectedness and complexity. And, comprehensible with the understandings at hand. "Autocatalysis" not required.

Autotrophic behaviors, however, are a way of evaluating and understanding complexity dynamics. In the case of botanical phototropic responses, light arriving on one side of a shoot drives a redistribution of the growth hormone auxin toward the shaded side. Plants in general use light as part of the photosynthetic process of food formation, but in this instance the light's role is not limited to extra food production in the cells closest to the light source: it also shifts where the growth signal acts. The net effect is that comparative growth on the far side, away from the light source, is higher, and the plant - as a holistic organism - moves or grows "toward" the light ... where the overall plant structure has better access to the light source.

What I am trying to indicate by this reference is that a "photosynthesis-predominated" analysis alone would imply greater growth strictly on the near side, closest to the light source. The growth response is not the result of a single dynamic, and its outcome therefore runs counter to simple intuitive expectation.

This is directly analogous to negentropic complexity development. The Complexity Dynamic is not the result of a singular function. It is the net-result of an opposite gradient ... entropy ... which expresses itself on the next higher level of organization.

What predominates in dynamic biological metabolism is the chain of increasing or diminishing energy levels. It has always been my personal contention that the Krebs citric acid cycle and the several variants of the cytochrome transport system utilized by all living organisms, plant and animal, are the clearest examples. Aided by the initial external input of solar energy, the innate entropy gradients of interacting atoms and molecules produce an ever-increasing flow-through of energy, stored in such molecules as ATP, by using local partial-differential entropy maxima to stabilize and maintain that storage situation.
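As a rough, hedged illustration of that flow-through (the redox potentials and ATP figure below are standard textbook approximations I am supplying, not numbers from this text), the free energy released when electrons from NADH finally reach oxygen can be compared with the cost of making ATP:

    # Rough bioenergetic bookkeeping for the respiratory electron-transport chain.
    # All numerical values are standard textbook approximations, used only to
    # illustrate the stepwise energy storage described above.
    F = 96485.0          # Faraday constant, C/mol
    E_NADH = -0.32       # standard redox potential of the NAD+/NADH couple, V
    E_O2 = +0.82         # standard redox potential of the O2/H2O couple, V
    N_ELECTRONS = 2      # electrons transferred per NADH oxidized

    delta_E = E_O2 - E_NADH                        # overall potential drop, ~1.14 V
    delta_G = -N_ELECTRONS * F * delta_E / 1000.0  # kJ per mol NADH, ~ -220 kJ/mol

    ATP_COST = 30.5      # approximate free energy needed per mol ATP, kJ/mol

    print(f"Free energy released per NADH: {delta_G:.0f} kJ/mol")
    print(f"Upper bound on ATP per NADH:   {abs(delta_G) / ATP_COST:.1f}")
    # The observed yield (roughly 2.5 ATP per NADH) is lower than this bound
    # because the energy is released in small steps along the cytochrome chain
    # and stored transiently as a proton gradient before ATP is made.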

 

The following are selections from Grolier's "MultiMedia Encyclopedia," Ver. 1.5 (1992), "Photosynthesis" {and related electron-transport topics}:

"Since Robert Hill's research in 1937, a tremendous amount of work has been done to elaborate on the concept of a light-activated electron-trans-porting photosynthetic chain. It is clear now that two light reactions in higher plant photosynthesis exist and act in series. Both involve chlorophyll in a reaction center. When the first of these reaction centers absorbs a light quantum, the chlorophyll becomes oxidized and is capable, in turn, of oxidizing water by removing hydrogen atoms and releasing oxygen. The electron removed from the chlorophyll during this first light reaction passes down a chain of electron-transport proteins to the second reaction center chlorophyll, at which site absorption of the second light quantum causes the electron to leave the chlorophyll and eventually reduce NADP to its energy rich form, NADPH. In addition, ATP is produced during the electron-transport process and is used, together with the NADPH, to drive a complex series of enzyme-catalyzed reactions that incorporate the carbon of carbon dioxide into various complex organic products.

Each electron, residing on the reduced primary acceptor after the light reaction, is passed along a chain of biological electron carriers and arrives at the reaction center chlorophyll of photosystem I.

It seems likely that at least three major components, plastoquinone, cytochrome (a c-type cytochrome), and plastocyanin (a blue copper-containing protein), are involved in the electron-transfer sequence. As electrons pass along these components, energy is released and is used to drive the formation of ATP from its precursors, ADP (adenosine diphosphate) and inorganic phosphate.

The process is termed photophosphorylation and is presumably similar to the process of oxidative phosphorylation, which occurs when ATP is produced during respiration in mitochondria. It seems that the electron transfer creates a gradient of hydrogen ions and a difference in potential across the membrane in which the components are located. These components subsequently are discharged and are considered speculatively to drive the formation of the ATP.

"On reaching the reaction center chlorophyll of photosystem I, a second quantum of light is absorbed and causes the electron to reduce another electron acceptor, ferredoxin, against a chemical potential gradient. Ferredoxin is a protein whose active site contains iron and sulfur. It is an extremely powerful reducing agent once it has accepted an electron, and it is used to reduce NADP to NADPH."

"An alternative fate of the electron on the ferredoxin is to return to the oxidized reaction center chlorophyll of photosystem I by way of a further chain of electron-transport molecules. The details of this chain of carriers is thought to involve a b-type cytochrome together cytochrome f and plastoquinone. Energy is also released as the electron passes along this chain and can be used to drive the formation of ATP from ADP and inorganic phosphate. This is termed cyclic phosphorylation because the electron passes around a continuous loop of carriers."

Corollary sequences and metabolic pathways exist in animal (vs. plant) tissue to create molecules that store energy via step-wise entropy gradients, on the analogy of water being raised through a series of canal locks. One of the funniest connections between flora and fauna tissue is that one of the more efficient groups of electron-transport carrier molecules (albeit slightly more primitive ones) is found in spinach and related plants. Popeye's creator was right on the mark. Spinach is a premier source of energy for muscular activity. {This author has a patent pending on concentrated forms of the bio-molecules as an energy source to promote human health, strength and stamina.}

The next thing for researchers to quantitatively approach is to translate "stored energy" into "information". The kinds and quantities of "information" can be clearly distinguished for each level of ffunction. -Apollonian holographic transcription of information means that energy encounters will retain data and react to data beyond the local Integrity structure. For example, a Marcus envisioned molecule now incorporates a range of behavioral flexibility. That flexibility is "information" retained in the qualities and abilities of the molecular structure to "respond" to a wide variety of molecular EM presentations from the environment. Consciousness crosses the threshold from "stimulus-response" to "controlled behavior". Very pertinent, very specific, for each molecule or grouping of gradient interlinked molecules.
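One quantitative anchor for that translation (not developed here, and not claimed as part of the Integrity Paradigm; it is Landauer's well-known bound from the thermodynamics of computation) is that fixing or erasing a single bit of information has a minimum thermodynamic cost:

    E_{\min} \;=\; k_{B}\,T \ln 2 \;\approx\; 3 \times 10^{-21}\ \text{J per bit at room temperature},

so any molecular structure that stably "retains data" in the sense described above pays at least that price per bit, and, conversely, the free energy held in local gradients bounds how much information they can register.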

Massive crosslinked pathways begin to accumulate extraordinary information quanta. And the extent of "conscious behavior" depends on the fitness, vigor and magnitude of integral feedback loops. It is a process of Synergy. Simple components combining to make responsive entities which are so much more than the simple sum of their parts. Old style reductionist thinking would say that this makes Life only a conglomeration of mundane mechanistic operations. Quite the contrary. The way we or any life forms experience "living" is exactly what our perceptions tell us it is. We are not randomly quivering blobs of protoplasm. We are thoughtful, conscious, spiritual, emotional, energetic, goal-oriented, striving sentient creatures. We seek to endure. We grow biologically and mentally. We do not need to consciously control every biochemical reaction that we are made of ... an antigen/antibody binding site does not need to fixate the location of every electron or nuclear particle within its whole molecular matrix - it only needs to work with the level of information it is capable of - the corresponding molecules it shares its space with. Ceptual models, languages, societies, physical health, economic welfare, art, creativity, hopes, desires and dreams - these and more are Real for us. In analog. In fact. Qualities that are equally as real and extant - existing as Potential - in each and every atom and sub-atomic particle.

 

[end Part 15]      2025 Copyrights ceptualinstitute.com