The Evolution of Complexity.

Introduction.

To simplify this brief discussion of a most complex theme, we adopt the philosophical doctrine of the excluded middle, proposing that the exposition is meaningful and truthful: sense-phenomenal physical descriptions and metaphysical logic explications cannot be simultaneously true and not true, because true, consistent, objective, coherent and falsifiable semantic beliefs are the core concept.

Thus one may ask: 1) who is the observer transferring the information? 2) do we all experience change? 3) is the change experienced linear in its progression? 4) do we all witness the same quotidian existential reality? 5) do we all consistently experience realities that resist being framed into symbols or sentences to ontologically describe or epistemologically explain their meanings? 6) do we all give priority to biological well-being before being concerned with psychic happiness or social acceptance?

Consistent with the doctrine of the excluded middle, we find it convenient to use a digital binary code, yes or no. Thus: 1) yes, the observer is always a human being, with all the ontological perceptual and/or epistemological conceptual (combinatorial) limitations the human brain consistently exhibits. 2) Yes, as consistently recorded history demonstrates; we conveniently conceptualized the epistemological 'time' function as a measure of change. 3) No, because we all experience recursive cycles in nature: night and day, the seasons, etc. Consequently the cycle is not circular but a vortex spiral, since each cycle generates a new one. 4) No, because existential reality is in the brain dynamics of the beholder, where the relative position of the observer with respect to the observed, together with the internal body-proper (neurohumoral) and external information input into the human brain, influences the language exposition (language structure). 5) Yes, as statistically evidenced in interviews of all humans; this experience generates beliefs. 6) No, because recorded history documents the lives of prophets behaving against self-interest, even though the biological imperative drive is in control in the overwhelming majority of the human species, as consistently observed. It is as if our species were reflexively and subconsciously driven to stay alive, as seen even during unconscious sleep states, when we track the exact position of a crawling insect on our face before slapping it. This documented behavior constitutes evidence of the joint participation of both genetically inherited and environmentally acquired controls. We share this behavioral feature with evolved subhuman species, where the biopsychosocial (BPS) drive is in control.

How do we then reconcile the multiple complex contents of interacting variables, such as the amount of data, code, text or transfinite radiation input into the dynamic human brain, where the information is stored, processed and linguistically or instrumentally transferred as adaptive motor responses to the varying contingencies at play? The short answer is that there can be no absolute certainty, but we can settle for probabilities of occurrence by adopting the convenient tools of a probability calculus refined with Bayesian considerations. Enter quantum theory, where we must consider the fundamental role played by the instruments used in the measurements, which may cause 'entanglement', 'simultaneous ("spooky") control activity at huge distances' and other wonders of creative human activity. Interestingly, in the process the human mind is arriving at a theory of everything (TOE) by adopting a hybrid epistemontological approach based on the mediation of a mathematical-physics vector-analysis language using the 'bra-ket' notation.
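The Bayesian refinement of a probability calculus mentioned above can be sketched in a few lines of Python. The numbers are purely illustrative assumptions, not drawn from this essay:

```python
# Minimal sketch of Bayesian updating: revise the probability of a
# hypothesis H after observing evidence E. All numbers are hypothetical.
def bayes_update(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) via Bayes' theorem."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1.0 - prior_h)
    return p_e_given_h * prior_h / p_e

# Example: prior belief 0.5; the evidence is twice as likely under H.
posterior = bayes_update(0.5, 0.8, 0.4)
print(round(posterior, 4))  # 0.6667
```

The posterior replaces the prior as new evidence arrives, which is the sense in which "probabilities of occurrence" are refined rather than fixed.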

Argumentation.

Surprisingly, the evolution of complexity can be equated with the evolution of a quantum theory based on probabilities, with its amazing predictive value. It has provided meaning, truth, objectivity and coherence to an otherwise supercomplex and individualized existential reality that evolves every instant.

The probability of gaining a deeper and meaningful insight into the workings of nature is predicated on the real possibility of accessing the information content of a closed system with boundaries. If the information source recedes into infinity, it becomes inaccessible. Consequently, we need to work with assumed closed systems, allowing the observer to specify the temporal course of specific objects/events. This is accomplished by substituting transfinite coordinates for such objects/events in place of the unreachable, ever-receding infinities in a vacuum; nature abhors the vacuum. We explicate the properties of a given system by assigning a projection operator for each relevant property in Hilbert space, or a subspace thereof. We also need to characterize the spatio-temporal course of events in nature as stochastic rather than random, so that their many probable behaviors can be assessed statistically. This way, instead of characterizing the unitary spatio-temporal evolution of a system as preceding its measurement and collapse, we now assign a Schrödinger function to each coexisting individualized event. For this conceptualization to work it is necessary to assume a linear, time-ordered sequence of events so that the projection operators on a tensor product can be mathematically transformed into additions. Moreover, only those objects/events that can be assigned the probabilities of 2^n possible outcomes can be given a physical interpretation supporting the required Boolean algebra. The particular state of a system in space-time, when each relevant variable is individualized, is conceptualized as the evolving trajectory of a point singularity in phase space, with independent and exclusive, albeit interactive, existence in a coarse-grained phase space containing varying subsets of arbitrarily sized cells.
Applying the binary code again, a given cell has value 1 when the singularity point is inside the cell in question and value 0 otherwise, where B_i and B_j index these cells such that the sum over i of B_i = 1 and the product B_i B_j = δ_ij B_j, together representing the spatio-temporal evolution of the complex system. The metaphysical logic analysis requires that this tensor product of the successive phase spaces be conceptualized as having a volume with an a priori probability of 1. However, quantum theory conveniently requires the use of Hilbert spaces instead of phase space in order to individualize each variable and evaluate their relevant interactions.
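The two conditions on the cell indicators (joint exhaustiveness, sum_i B_i = 1, and mutual exclusivity, B_i B_j = δ_ij B_j) can be checked numerically. The following NumPy sketch is purely illustrative; the partition into four cells and the location of the point singularity are arbitrary assumptions:

```python
import numpy as np

# A coarse-grained "phase space" of N points partitioned into mutually
# exclusive, jointly exhaustive cells. Each cell B_i is a diagonal 0/1
# indicator projector.
N = 12
labels = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 3, 3, 3])  # cell of each point
cells = [np.diag((labels == k).astype(float)) for k in range(4)]

# Completeness: the cells exhaust phase space (sum_i B_i = identity).
assert np.allclose(sum(cells), np.eye(N))

# Exclusivity: B_i B_j = delta_ij B_j (distinct cells are orthogonal).
for i, Bi in enumerate(cells):
    for j, Bj in enumerate(cells):
        expected = Bj if i == j else np.zeros((N, N))
        assert np.allclose(Bi @ Bj, expected)

# A point "singularity" at site 5 lies in cell 1 (value 1) and no other.
point = np.zeros(N); point[5] = 1.0
print([int(point @ B @ point) for B in cells])  # [0, 1, 0, 0]
```

Because the cells are exclusive and exhaustive, exactly one indicator fires for any point, which is what licenses the binary (Boolean) reading of cell membership.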

The spatio-temporal evolutionary time-arrow progression of a complex system from t1 to t2 can best be represented using Heisenberg orthogonal projection operators, P(t) = e^{iHt} P(t0) e^{-iHt} (with ħ = 1), the mathematical details of which are beyond the scope of this brief essay. Suffice it to say that the orthogonal representation allows for the assignment of probable relevance and weight to a calculation result. The weight of a linear progression between two points in space-time is represented as the tensor product of their phase-state histories, mathematically transformed into additions. It also allows, in our opinion, observations to be structured in terms of a time-independent, invariant reference state whose perception by an observer is influenced by the interactivity of the various variant relevant states.
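For readers who want the omitted mathematical detail made concrete, here is a minimal NumPy/SciPy sketch of the Heisenberg-picture conjugation P(t) = e^{iHt} P(t0) e^{-iHt}. The 2x2 Hamiltonian and initial projector are arbitrary illustrative choices, not taken from this essay:

```python
import numpy as np
from scipy.linalg import expm

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])            # an arbitrary Hermitian Hamiltonian
P0 = np.array([[1.0, 0.0],
               [0.0, 0.0]])            # projector onto the first basis state

def heisenberg(P, H, t):
    """Evolve a projector in the Heisenberg picture (hbar = 1)."""
    U = expm(1j * H * t)                # unitary since H is Hermitian
    return U @ P @ U.conj().T

Pt = heisenberg(P0, H, t=0.7)

# The conjugation is unitary, so P(t) remains an orthogonal projector
# (idempotent and Hermitian) with the same trace, i.e. the same "weight".
assert np.allclose(Pt @ Pt, Pt)
assert np.allclose(Pt, Pt.conj().T)
assert np.isclose(np.trace(Pt).real, 1.0)
```

The preserved trace is what allows a stable weight to be attached to the evolving property, even as the projector itself rotates in Hilbert space.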

But existential reality is not exclusively linear; it has a cyclical, recursive component that cannot be ignored. Therefore we need to assume that the human brain linearizes the information input so it can reach the appropriate processing neuronal networks. Unfortunately, a consistently reliable, credible fundamental biopsychosocial (BPS) model theory of human brain dynamics cannot depend exclusively on a 'measurement interpretation', and we need to assume the co-existence of real and virtual processes. We anticipate major problems in marketing our speculation on the existence of dark baryonic boson DNA/RNA receptors, because the phase space may be coarse-grained, and dividing it into a set of individualized cells of arbitrary size that are mutually exclusive and jointly exhaustive may constitute a mathematically intractable nightmare. Furthermore, our long-range goal of the BPS model of brain dynamics becoming a fundamental theory of everything (TOE) cannot rest exclusively on measurement-based interpretations that restrict its reach beyond the mathematical formalism of experimental physics. So much for the current fanciful attempts to cajole quotidian existential reality within the confines of the probabilities generated by artificial-intelligence 'algorithmic complexity models', i.e., robotizing human true reality while leaving out anything irreducible to language representations, such as ethical and moral considerations, affect, or the belief in transfinite sources of relevant information causally efficient in controlling human biopsychosocial (BPS) equilibrium. Hence our concern when calling for the joint presence of the psychotherapist and the theorist in the evaluation of the underlying brain pathologies as narrated in the expressed language structure of the patient. The truth content of an underlying pathology can be manipulated in any medium, whether expressed in symbols, sentential logic or a garden-variety language structure.

On the other hand, if one dreams of a BPS fundamental TOE, there are two important elements to take into consideration: the relation between a framework and quantum/Bayesian conditioning logic reasoning, and whether the framework rules are not only convenient but, at a minimum, also relevant and compatible. In our case, as briefly noted above, this requires the tensor products of the sets of 2-d subspaces projected into Hilbert space, as amended by Wheeler's 'delayed choice' in transactional exchanges, to accommodate the wave/particle superposition of states recorded by the grid interferometer detector results. Intuition is a poor resource in guiding quantum mechanics' experimental designs ("spooky actions at a distance"). The von Neumann approach takes the distinctive results of quantum measurements as its point of departure for developing the formalism 'after the fact', without taking into account the state of the system preceding the measurement. Needless to say, macro cosmology and micro sub-Planckian states both require a closed-system approach and need to be reconciled before quantum gravity, and the conjecture we make about a dark bosonic polynucleotide receptor, take hold as the putative link between a transfinite source and the human brain's neocortical premotor phase-space acceptor.

But the absence of information about the events preceding the measurement that collapses the wave function gives us the opportunity to model a fundamental BPS TOE approach, if we split the Schrödinger wave function into two inseparable parts: one representing an invariant system, and the other representing the variant, interacting environmental circumstances, R_(t), that may deceptively modify the perceptual evaluation of the invariant element. It should be noted that the Schrödinger function itself does not generate a binary code element and needs to be complemented with a density matrix element. This we do by adopting the Dirac analytic tool expressed in bra-ket notation, as published elsewhere. The metaverse and a thermal bath vat housing a living brain may both qualify as the 'environmental circumstance'. This approach is necessary to minimize the number of relevant and necessary variables; while insufficient to deduce such quasi-deterministic detailed laws from first principles, it minimizes the operational conundrum of handling so many variant, interactive complexities as they evolve from state to state in space-time. This way, the coarse-grained invariants in the macro domain of discourse with the largest probabilities are those
ascertainable with the classical equations of motion, making the goal of decoherence more probable and thereby minimizing deviations from the predictions based on their measurements. Operationally, it makes it easier to distinguish the individuality of the summed-over (tensor product) participating variables causing decoherent distortions (noise, interference, dissipation, etc.) in the phenomenological evaluation. By trading off with the appropriate analog equations of motion (hydrodynamic, energy, momentum or other conserved-quantity variables), we can couple better with the elusive features of the successful quantum-theoretical micro sub-Planckian domain until we achieve their reconciliation. Keeping in mind absolute truth content as the guiding goal of the analysis, the main barrier to reconciliation is that a mixture of interacting quantum states, modified in its expressions by the environmental circumstances, tags different probabilities to the participating components, while the actual features of the instrument conducting the measurement select only one of many possibilities, as expressed by the resulting signatures: tracks, ionization patterns, bubbles, frictional heat dissipation, etc., or, in the case of human language semantic-syntax structure, reports plagued with subconscious confabulations, lies and other distortions of the truthful etiology. Without a shared language there is no way to distinguish what is the case from what is thought to be the case. As it turns out, quantum theory's probabilistic calculus has provided the most reliable scientific methodology so far. Maybe this is the end of the line for the human mind's scrutiny of the relevant but invisible features of our existential reality.
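The system/environment split invoked above, where the Schrödinger function must be complemented by a density matrix element, can be illustrated with a standard textbook construction (a two-qubit example of our own choosing, not the author's published formalism): tracing out the 'environmental circumstance' leaves the system in a mixed state.

```python
import numpy as np

# Pure entangled state of system + environment: (|00> + |11>) / sqrt(2).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(bell, bell.conj())          # full density matrix (4x4)

# Partial trace over the environment (second qubit): reshape the 4x4
# matrix into indices (s, e, s', e') and sum over e = e'.
rho_sys = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.round(rho_sys, 3))                # maximally mixed: 0.5 * identity
purity = np.trace(rho_sys @ rho_sys).real
print(round(purity, 3))                    # purity 0.5 < 1, so a mixed state
```

No single wave function describes the reduced system on its own; only the density matrix does, which is why the split into invariant system and variant environment forces the density-matrix complement.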

Summary and conclusions.

This essay aims to provide arguments for the adoption of the epistemontological hybrid biopsychosocial (BPS) model of brain dynamics, as extended to satisfy the requirements to be considered a fundamental theory of everything (TOE) pertaining to the mesoscopic domain of human existential reality in general. Specifically, it emphasizes the importance, in the author's opinion, of understanding how the unavoidable evolution of complexity modifies our phenomenological perception of our own human existence in this 21st century. At the forefront comes the ontological aspect of our model, especially under the overwhelming pressure of the radical branch of the physicalist reductionist persuasion, which tries to restrict epistemological cognitive analysis to scientific methodological tools based exclusively on ontological valuations of truth content derived from allegedly 'objective' measurements, assigning quantitative values to the properties measured while deliberately excluding metaphysical ingredients of crucial relevance. This is not to be construed as an ab initio rejection of the von Neumann model of reality or the standard Copenhagen approach. On the contrary, in our hybrid model they constitute the most reliable interpretation of the macro world of deterministic classical physics, as compared to the quasi-deterministic nature of the quantum-theoretical domain of spooky conjectures and speculations, despite their impressive success in predicting the behavior of systems beyond the sensory threshold in the micro sub-Planckian and macro cosmological domains. Our focus is situated in between both extremes of the reality spectrum, at the mesoscopic level of organization, where assigning a value to a measured property means that the property possesses such value as falsifiably, consistently and coherently documented by all observers at all times. This pragmatic approach is well suited as a starting point for a more detailed investigation at all levels.
This strategy has been applied to the quantum interpretation of reality. But we should continue the search for the defining structure/function of relevant invisibilities beyond the audiovisual or instrumental threshold of detection.

Ultimately, the attribution of properties to a system is true if and only if, ontologically, their probabilities of occurrence correlate to objective and intrinsic properties of the physical systems being considered. What is truth? Are qualia real? If, as von Neumann suggests, the truth content depends on the measuring instrument used, how can anyone epistemologically represent a particular instrument? Absolute truth is in the brain of the beholder. Because of our mesoscopic focus, Tarski's characterization of the truth value of propositions as controlled by underlying subconscious mental states, especially when expressed in languages with different syntax structures, is routinely validated in the real world. The abstract human being does not exist independent of his individualized circumstance, which follows him like his shadow. Language codes for the circumstantial brain representations that guide behavior and the incorporation of knowledge about self, others and the environment.

A predominantly subconscious, routine lifestyle often impedes the distinction between false and true beliefs. This is especially so when, paradoxically, both may consciously coexist!

Finally, those members of the physicalist faith who exclusively and blindly rely on mathematical-physics reductions of reality should be reminded that their commitment to the truth of a scientific claim anchored in statistical surveys, a measurement or a metaphysical logic deduction implies tacit acceptance of all aspects of the underlying framework (data collection, adequacy of the measuring instrument, etc.), where many aspects are epistemologically irreducible or ontologically defective. So much for scientific-methodology skepticism. We suggest again an epistemontological hybrid approach integrating classical and quantum-theoretical perspectives, in harmony with the prevailing orthodox interpretation of real-time existential reality, such that the amount of data, code or text that is stored, transferred, received or modified by the relevant medium is accounted for in the final result.

References:

1) http://philsci-archive.pitt.edu/4549/2/FinalCH.pdf

2) http://plato.stanford.edu/entries/information/supplement.html

3) Blog site: https://angelldls.wordpress.com/

Dr. Angell O. de la Sierra, Esq.

In Deltona, Florida Winter 2013