In this essay I will attempt to approach Whitehead’s conception of Process, starting from a Substantialist paradigm and gradually replacing it with Whiteheadian notions. I will focus only on a basic formulation of concrescence, as the full paradigm requires a great deal more explanation than is warranted here.
One long-standing idea of physics is that subatomic particles of a given type are indistinguishable from others of their type except for their state. For example, any one electron is no different from any other electron. The state of a particle is the set of valuations for each of the particle’s attributes with respect to the rest of the universe. These include position, velocity, mass, spin, etc. Some of these are considered intrinsic, such as spin and charge, while others are contingent on the particle’s embedding, its place in the universe, which assumes a spatio-temporal apparatus to describe that embedding. These attributes are not all precisely and independently determinable because of the Heisenberg Uncertainty Principle. However, the formalisms of Quantum Mechanics account for the Uncertainty Principle, and thus the state of a particle can be expressed in a well-defined manner. The pairing of a particle’s state with its species would seem to be a complete description.
Whitehead would say, “not so, according to the error of simple location.” This mistake consists in treating something as holding a location in spacetime independent of everything else in the universe; for Whitehead, no such independent location is coherent. The error of simple location is an example of the Fallacy of Misplaced Concreteness, which consists in mistaking an abstraction for a concrete reality. Here it is the abstract coordinate system assumed in the equations of mechanics that is reified into a property ascribable to a particle-state. Whitehead expounds on this fallacy to eventually conclude that a particle can only be defined in terms of its relations to everything else in the universe.
There is, however, an internal incompatibility within QM that does not need to refer to Whitehead’s fallacies and yet arrives at the same conclusion. This incompatibility appears when we consider quantum entanglement. When two particles are not entangled, their states can be independently defined, and thus, in a sense, the species-state object could be placed in space-time independent of anything but a chosen frame of reference, notwithstanding Whitehead’s admonition. The states can “follow along” with their respective particles. However, if these particles are entangled, then their states are not separable and must be considered as components of a single system state to fully capture the physical outcome of subsequent interactions. But if the states cannot be attached to the particles independently, where exactly does the state itself reside? The formalism doesn’t allow for a broken lovers’ locket style of separability that could be rejoined when again they meet. Whitehead would give us a tut-tut and point to the physicist’s notebook, properly placing the map in the atlas rather than in the territory. But to the particles themselves, it would seem that they must somehow “remember” their previous tryst. Taken to the extreme, every particle must keep accurate records of every encounter it has ever made with every other particle. The substantialist ontology has no provision for such bookkeeping in situ, yet the relationship has been empirically demonstrated: Einstein’s “spooky action at a distance” in action. It appears as an apparition to Einstein precisely because the ontology he works from is inadequate to account for all observable phenomena. The metaphysics assumed by the Substantialist’s simple particle formalism does not follow from the model itself.
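The non-separability at issue can be made concrete with a small numerical sketch. The code below is illustrative, not part of the essay’s argument; it assumes the standard tensor-product formalism of QM. A product state of two particles factors into independent per-particle states, while an entangled Bell state does not, as its Schmidt rank reveals.

```python
import numpy as np

def schmidt_rank(state, dims=(2, 2)):
    """Count the nonzero singular values of the reshaped state vector.
    Rank 1 means the joint state factors into independent per-particle
    states; rank > 1 means no such factorization exists (entanglement)."""
    matrix = state.reshape(dims)
    singular_values = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# Separable: each particle's state can "follow along" with it.
product = np.kron(up, down)

# Entangled Bell state (|00> + |11>) / sqrt(2): the joint state cannot
# be split into two independent per-particle states.
bell = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)

print(schmidt_rank(product))  # 1
print(schmidt_rank(bell))     # 2
```

The rank-2 result is the formal counterpart of the “broken lovers’ locket” that cannot be split: there simply is no pair of single-particle states whose combination reproduces the Bell state.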
The substantialist reification of a ‘hairless’ particle is based on the success of the Cartesian/Newtonian/Maxwellian technique of describing particles in a simple system of coordinates and applying a set of field equations to predict outcomes. This is an instance of Whitehead’s admonition against narrowness in the selection of evidence, although in his usage he is referring to the excision of spiritual experience. It is also a consequence of the Fallacy of Misplaced Concreteness, in that there exists a cognitive dissonance between the Substantialist conception and observable phenomena, which gives rise to the durable meme of the quantum world’s weirdness.
It is clear from the entanglement observations that while the formalism of an extrinsic system state may be useful for calculations, the ontology must be usurped by one that can present the history of a putative particle when it comes time to resolve that history with other historical lines presented by other putative particles at the nexus of an event. Such an ontology is postulated by Whitehead’s actual occasions, with their complete concrescent history, perfectly embodying the necessary state information to properly account for the historical strains. Here we have an ontology of records that does not need a physicist on hand with pad and paper. For the time being however, we will continue to work from the Substantialist model until we arrive at a congruence to Whitehead’s model.
The term ‘event’ in physics has classically been used to denote a collision between objects or the absorption or emission of a photon. A simple extension of the Substantialist paradigm accounted for quantum uncertainty by reintroducing the notion of an ‘event’ as the collapse of the wave function. Without any apparent causal chain to explain why the wave collapses in the spacetime location that it does, physicists have stopped short of making any metaphysical claims beyond a stochastic explanation. All that is known about an event is the subsequent states of the interacting particles. The paradigm holds that the states have been completely determined during the event and that no further accounting needs to be done. The states are then a degenerate outcome, a collapse of all that fed into the event into the new states. But as stated before, this is insufficient to properly generate a metaphysics of entanglement. There is no ontological residence for all the necessary information to repose within.
If we accept that an event between two entities entails the reconciliation of their respective histories, we have to consider what it means for there to be unreconciled details, or, for that matter, what it even means for there to be a difference in the histories. If we start with an event that gives rise to a pair of entangled particles, we can rightly say that at the point of departure the particles have a common history. Going forward, the two particles will diverge in their encounters, and thus their respective chains of events will emboss different sets of histories from their differing social environments. When finally the particles reunite in a subsequent event, that event must reflect that these are not independent particles, but that they are constituted by histories that are partially identical and partially different. In the classic conception of a particle, the fact that they share a partially common history can be described using an entangled system state that is an adjunct to the pair of particles. But now we will consider that a particle can carry its historical record, and so no such extrinsic state overlay is needed. Going forth from the conciliatory event, the particles merely continue to carry that same portion of history that is common.
There is however that portion of the histories that are not shared. Here we must consider what it means for a particle to encounter another particle with a vastly different history. And in particular, we must consider that our two entangled particles may have both encountered a third particle, but in separate events. Somehow the nature of the third particle must be accounted for in the conciliatory event along with the common histories of our entangled pair. So too, a single particle may interact with a second particle numerous times, leading to numerous segments of common history being carried in the particle’s records. The only way this can happen using an intrinsic historical record is for a particle to ‘take on’ the histories of every particle it ever encounters. Our simple particle has gone from a few degenerate state variables to embodying a vast network of histories with complex common and divergent pathways.
At this point it may be useful to drop the conception of a particle entirely and focus on the events between them. Our original concept of a particle, with only a few intrinsic state variables involved in the workings of an event, made ontological sense. It travels from event to event, its state getting updated according to simple rules working on a short list of parameters. With our new conception of history-keeping, however, a particle looks more like a messenger carrying every bit of knowledge of the universe known at one event to feed into a new event. The shorthand of simple state can’t convey the subtlety of even a single entanglement, let alone 13.7 billion years of encounters. Our concept of a particle may still be valid, but it doesn’t add anything to our model except some notion of conveyance. Lost is the simple substance that made the concept so attractive, both as a practical means of calculation and as a metaphysical object. And as an approximation, a system of simple particles with adjunct system state tables is entirely useful for predictive modeling. It should be sobering to realize, however, that in the field of quantum computing, where the adjunct state table may span a mere few thousand entangled particles, it is not possible for a classic algorithmic computer to model the system, not even given billions of years and a galaxy’s worth of matter to build the computer.
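The infeasibility claim can be sized with simple arithmetic. This is a rough sketch under an assumed ideal pure-state simulation holding one complex amplitude (16 bytes) per basis state; the function name is my own, not a standard API.

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold the full state vector of n entangled qubits:
    2**n complex amplitudes, each stored as a 16-byte complex double."""
    return (2 ** n_qubits) * bytes_per_amplitude

# The adjunct state table doubles with every added entangled particle.
for n in (10, 50, 100, 300):
    print(n, state_vector_bytes(n))

# For comparison: ~10**80 atoms are commonly estimated for the observable
# universe, so by roughly 300 qubits the amplitude count alone exceeds
# the matter available to build any conceivable classical memory.
print(2 ** 300 > 10 ** 80)  # True
```

Ten qubits fit in 16 KiB; fifty already demand petabytes; the exponential doubling, not any engineering shortfall, is what puts a few thousand entangled particles beyond classical simulation.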
If however, we look at what it means to be an event, we can now posit that it is a unique combination of histories of the universe - a perspective on some of what has transpired so far. An event in its formation takes in other events - other histories - and combines them into a single history which is constituted by a multiplicity of ingressing histories. We might go so far as to describe an event as an experience of the universe, a universe composed entirely of other events. This last point is a metaphysical watershed. Just what is a universe of such events? If an event is an experience of a portion of the universe, and the universe is composed entirely of events, then the universe is the set of all experiences of itself.
The term ‘event’ however has the drawback of connoting something that is localized in space and time. In our conception of a web of interconnected events, we haven’t made any requirement as to such location, even though we are keeping in the back of our mind that events obey a sense of order that we have classically understood to be a consequence of temporality. Accordingly, it is useful to drop the term ‘event’ and use the term ‘experience’ instead. This allows us to focus on the metaphysical aspects of our concepts and defer the inevitable reconciliation with the substantialist understanding of a spatio-temporal ground of being. It also allows us to begin considering what it means for our human sense of experience to be a composite of trillions of subatomic event-experiences.
Returning to the notion of a universe that is composed of the set of all experiences of itself, we note that such a set as described does not comprise the set of all possible experiences, only those which have actually occurred. But we know that new experiences are occurring, at least as we see it in our temporal mode of existence, so the universal set of experiences is increasing. But this implies that the universe is drawing on a pool of possible, not yet actualized experiences, somehow converting them into real experiences. Our universe is thus more than just its history. It includes something of possibility, but a possibility limited by what has already transpired. There is a relationship between what has occurred and what is possible, that binds these into a coherent framework. Without such a relationship we could not experience anything of durability, just a random haze devoid of meaning.
It becomes useful at this point to define Reality as the set of actualized experiences, Potential as the set of possible, but as yet unrealized, experiences, and Process as the mechanism by which Potential is converted to Reality. So far, we can say the universe is composed of these three: Reality, Potential, and Process. Note that there is no expression of the Laws of Physics in this formulation, no Standard Model, no General Relativity, and no grounding spatio-temporal context with which we express those concepts. These will have to be accounted for in the explanation of Process, or at least demonstrated to be emergent phenomena of Process. However, the formulation so far is able to metaphysically accommodate what we know of entanglement in a way that the prevailing paradigm does not.
Our new notion of experience and how experiences flow one into the other is a rudimentary version of Process. This version has been derived from the substantialist notion of a particle that travels from one event to another, which in turn was framed in a context of space-time embedding. We would like to understand this in the reverse direction, starting from Process as ‘experience absorbing experience as a set of historical traces’ and arriving at the ‘event-particle-event within spacetime’ ontology. We do not as yet have anything in our new model that might do this, so we will add a dollop of dimensionality as a measure of the difference between two ingressing experience-histories. This is not to say that we are adding anything ontological, but rather that we can reduce the complex differential history to a degenerate form of simple parameters that are congruent to the familiar classical forms of spatio-temporal measure. Other classic measures such as momentum may also yield to this reductive abstraction. Again, the true metaphysical process is the ingression and reconciliation of a vast historical difference, which we can describe in the degenerate shorthand of Cartesian/Newtonian/Maxwellian/Einsteinian/Schrodingerian terms, as long as we are looking at “disjoint” substance. A simple analogy would be to consider the trace of a person walking around a city before stopping at a cafe on the corner. The event end-points of departure and arrival delineate an interval which can be described as a simple spatio-temporal vector of translation, while the true path is a long series of translations down one path after another. The person’s experience may include a great many facets of their journey such as people they met, but our observation of the endpoints sees only simple translation in space and time. 
We might be amazed that at some later date we discover that our person knows someone else that we know, but that is because we have elided all of their experience along their journey. Such is the way Substantialism elides the full experience of Process. Until the hard empirical results of entanglement were observed, the paradigm could push such radical empiricism aside to concentrate on events between particles.
However, in order for this reverse transform to be complete we need something of an atomic unit of spatio-temporal translation that can be ascribed as a feature of the experience-to-experience movement. We can find this in the Uncertainty Principle, which defines the Planck interval as a minimal spatio-temporal unit of measure. Anything is possible as long as it occurs within a space smaller than this interval. For example, a so-called virtual particle can pop into existence, interact with objects more macro than this interval, then pop back to the aether. The Planck interval is essentially a threshold between the Real and the Possible. The entire structure of Quantum Field Theory, the Standard Model, and the spacetime/information calculations of black holes operates with this threshold in mind. Precision measurements of fundamental constants demonstrate that whatever is possible beneath or within this threshold must be considered when predicting those constant values in the “real” world. Using this fundamental interval as the most atomic quantum of spacetime that can be carried with an experience-to-experience Process, we can imagine the macro-scale spatiotemporal ground of the substantialist paradigm as an emergent phenomenon of the reconciliation of experience histories.
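For scale, the Planck interval can be computed directly from the fundamental constants. This is a back-of-the-envelope sketch using CODATA values hard-coded below (an assumption of mine, not material from the essay): the Planck length and time fall out of combining ħ, G, and c.

```python
import math

# CODATA values (assumed here for illustration).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: the unique length constructible from hbar, G, and c.
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m

# Planck time: the time for light to cross one Planck length.
planck_time = planck_length / c             # ~5.4e-44 s

print(planck_length, planck_time)
```

These figures (~10⁻³⁵ m and ~10⁻⁴⁴ s) mark the threshold below which, as the paragraph above puts it, the distinction between the Real and the Possible dissolves.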
We now have a pair of transforms between Substance and Process that are useful for understanding how these systems relate at the atomic level. With the paradigm of an experience-history ontology we have gained an intrinsic property that accounts for quantum entanglement. By placing the Planck interval somewhere in the event to event process we have gained a mechanism for emergent spacetime that was not apparent in the substance paradigm. But we haven’t yet included all of the features of Whitehead’s concrescence model, in particular those that derive from having a Mental-Physical polarity in which concrescence occurs. Neither have we managed to depart from a deterministic flow of reality, having only replaced event-substance-event with experience-combination-experience. There is no intrinsic requirement for choice to enter into either paradigm to make it work. We now turn to how to add the component of free-will in a coherent way that will buy us a more secure justification for the inclusion of the Planck interval.
In both substance and process models an element of indeterminism is included. In the substance ontology this shows up in the Schrodinger Wave Equation. Here we find the Planck interval forcing an incomplete description of the outcome of a prediction. Squeeze one state parameter to an arbitrary surety and its complementary state will flow into an inversely proportional indeterminate extent. The Copenhagen interpretation of the wave equation’s spread is that it represents a probability density. This is experimentally verified through a large number of trials which, in aggregate, show a likelihood map of outcomes. However, the interpretation does not apply to a singleton event, which by definition has a 100% probability of occurring as it has, regardless of any prior field of unknown outcomes. In the usual application of wave mechanics, a world of possibility is opened up as a particle proceeds away from an event. The notion here is one of projection, one of a shape of potential determined by the past. There is only the Real of what has happened in the past and a tightly bound extension into the future which remains purely stochastic even as a contoured range of possibility. Most importantly, this is seen as a purely atomic-level phenomenon, not extensible to macro objects. Even a little complexity in a system of particles will lead to quantum decoherence, a process whereby the indeterminism is squeezed out by the bulk properties of the system. This can be directly traced to the reification of the degenerate state variables of simple particles. The more “random” interactions in a complex system, the less traceable are the myriad interconnected relationships between the particles. When we ignore those relationships, we can identify a state variable such as polarization as taking on a value from 0 to pi. Like our example person perambulating the city, we ignore their excursion with all its interconnections and find them either at home or at the cafe.
It is neither convenient nor practical to keep track of the radically interconnected experience of a particle so we create an abstraction to make it manageable.
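The contrast drawn above between the aggregate likelihood map and the definite singleton event can be mimicked with a toy simulation. This is a hypothetical two-outcome measurement with an assumed “up” probability of 0.3; nothing here comes from the essay itself.

```python
import random

def measure(p_up=0.3, trials=1):
    """Count 'up' outcomes over a number of simulated measurements,
    each drawn independently with probability p_up."""
    return sum(1 for _ in range(trials) if random.random() < p_up)

random.seed(0)  # fixed seed so the sketch is reproducible

# A singleton event: one definite outcome, 0 or 1, with no residue of
# the probability field that "preceded" it.
single = measure(trials=1)

# Only the ensemble exhibits the density that Copenhagen interprets:
# the observed frequency converges toward p_up.
n = 100_000
aggregate = measure(trials=n) / n

print(single, aggregate)
```

Each single trial simply happens as it does, with 100% certainty after the fact; the probability density is visible only across many trials, which is exactly the asymmetry the paragraph above points at.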
Similarly, when we attempt to find an abstract model based on this paradigm that could account for free will, and devise experiments to demonstrate its existence, we discover that an intensely complex system of equations would be needed to translate simple atomic state variables into a predictive framework that can be measured. The Substantialist habit of narrowness in the selection of evidence precludes the most obvious demonstration of our own agency. This habit is declared as metaphysically foundational for scientific epistemology. The strong abhorrence of Rationalism that arose as Scientism grew places empiricism prior to conception. But the empirical method of Scientism is firmly cemented in the Real and does not extend to the possible. The evidence of quantum indeterminism cannot support more than a stochastic interpretation because the empirical method is woefully inadequate for making determinations beyond such an interpretation. To be sure, it is difficult to imagine how an urge of free will would translate to subtle shifts in the projected probability wave of a simple event-particle ontology. That the intention, “I want to go there,” should somehow cause trillions of particle interactions and wave collapses within my body to coherently bend in just such a fashion as to effect my desired movement seems ludicrous. Instead, the paradigm asserts a hard causal determinism, implying that the intention is determined as well.
Turning to the Process model, we have similarly placed something akin to the Planck interval somewhere within it. Again, we consider that it represents some threshold between the Real and the Potential. In this case, our field of the possible contains experiences that have not been actualized. If we take a similar approach to the Substantialist model, we can consider something like a wave equation that projects forward from actual experiences into the field of possible experiences. Now however, we aren’t considering the stochastic selection of an isolated event with a limited effect upon the degenerate states of a pair of particles, we are considering the selection of an entirely new perspective on the universe - a new experience of the universe in self reflection. This is not the simple advance of an isolated event whose selection and effect seems of a vastly distant scale from the complex sense of agency that we experience as humans. Instead, the experience model forces a complete inversion of the complexity scale, wherein a single atomic experience is vastly more complex than the simple, isolated experience of a single human as we are accustomed to. This is another metaphysical watershed.
How is it that we don’t experience the vastness of the universe if we are composed of trillions of atomic experiences, each having access to the fullness of the universe? The answer lies in how all those atomic experiences are composed. First and foremost we must understand what it means for two experiences to be combined in a coherent way. It is not a simple sum, but rather a reinforcement of those parts of the histories that are common and a dissonance where they are not. In a sense, we can consider the common parts to be a “truth” in that all of the combining experiences are in agreement that this is the way the universe has unfolded so far. This sense of truth is not a binary determination but rather has a value representing the degree to which the histories agree. Histories that run deep form intense “habits” to use Whitehead’s terminology. These are very hard to steer away from because so much of our composite experience is based upon them. At the other end of the spectrum are those histories that have run divergent paths and are not so ingrained in our composite experience. In fact they may be entirely at odds with each other, representing a composite path that cannot remain in dissonance for very long before the tide swings one way or the other. This is perhaps the source from which the idea of Schrodinger’s Cat arises.
Considering what I experience of the universe, I can see that I have a localized perspective, that my “view” is occulted by objects that intervene between myself and objects beyond them, and I have a sense of durability and continuity. Yet I also have a sense of freedom to move within the constraints of physicality. Looking at how all of the atomic experiences flow into my own local society of experiences, I can consider my current experience and that of an intervening object as they come together. The object’s experience contains not only its experience of me and all the experiences that lie between us, it also contains its experiences of occluded objects along with how the experience of those objects are transformed by the intervening object’s full experience of being in between myself and the occluded objects. I get a measure of the distant object along with the intervening object’s measure. Combined, these must cancel in a manner that occludes my experience of the distant objects, yet still maintain something of history so that the distant objects yet exist in my universe. The key here is that these are not really objects as we normally think of them in a Substantialist way. Rather they are unique experiences of the universe in full complexity, some parts fully coherent and some incompatible, some reinforcing and some canceling. It is from this complex combinatorial process that our experience becomes localized.
We now return to where in the realms of Real and Potential these dissonant strains of experiences reside. It doesn’t take much thought to realize that they should be placed within a spectrum along an axis between the Real and the Potential, or what Whitehead calls the Physical and Mental poles. To the degree that the experiences are coherent, they are located closer to the Real pole. To the degree that they are incoherent and therefore undecided, they are deeper into the Potential pole. This brings up the curious observation that there is a space in which the Potential gets sorted out into the Real, and this space is what we have brought in as the Process version of the Planck interval. In the Substantialist model we saw that in order to make accurate predictions we must consider all the possible events that can arise behind Planck’s kimono. In the Process model, we can similarly sort through all the possible experiences to find what is most coherent, and when these coalesce into hardening experiences closer to the Real pole, they carry along with them a morsel of the Planck interval within which that sorting occurred. Behind Planck’s kimono is a timeless and spaceless realm of possible experiences, each of which contains a unique perspective on the entire universe. That is quite an extensive abstract field that is radically more infinite than the residual, hardened Reality we believe we exist within.
The process of selection of a future is the essence of free will. Our experience of what it means to formulate an intention and see it through maps perfectly onto the Process model which posits possible future experiences and charts a path through them. This mapping is far more intuitive than imagining how an extremely complex substance ontology could support a similar ease of congruence. In addition, the possible futures allowed at the atomic level in the substance model have nothing to do with meaning as we understand it. This has caused no end of consternation in philosophical discourse which has attempted to bond wave collapse, which is the only element of the paradigm which is non-deterministic, with free will.
I have attempted to replicate the foundations of Whitehead’s process by working from a deficiency in the metaphysics implied by quantum mechanics. By merely adding the requirement that quantum entanglement be properly ensconced in ontological entities rather than kept track of in an extrinsic formalism, I have shown that something akin to Whitehead’s conception naturally falls out. I have not covered much that is needed to fully flesh out Whitehead’s model, such as a proper analysis of how light cones or occlusions emerge, or the effect of embedding experience within a protecting organism, or eternal objects, and ultimately the role and primal motive of God.
One long standing idea of physics is that subatomic particles of a given type are indistinguishable from others of their type except for their state. For example, any one electron is no different than any other electron. The state of a particle is the set of valuations for each of the particle’s attributes with respect to the rest of the universe. These include position, velocity, mass, spin, etc.. Some of these are considered intrinsic, such as spin and charge, while others are contingent on the particle’s embedding, its place in the universe, which assumes a spatio-temporal apparatus to describe that embedding. These attributes are not all precisely and independently determinable because of the Heisenberg Uncertainty Principle. However, the formalisms of Quantum Mechanics account for the Uncertainty Principle and thus the state of a particle can be expressed in a well defined manner. The pairing of a particle’s state with its species would seem to be a complete description.
Whitehead would say, “not so, according to the error of simple location.” This mistake holds that it is not coherent to consider anything to hold a location in spacetime as independent of everything else in the universe. The error of simple location is an example of the Fallacy of Misplaced Concreteness which proscribes making a reality of an abstraction. Here it is the abstract use of a coordinate system assumed in the equations of mechanics being reified to become a property ascribable to a particle-state. Whitehead continues to expound on this fallacy to eventually conclude that a particle can only be defined in terms of its relations to everything else in the universe.
There is however an internal incompatibility of QM that does not need to refer to Whitehead’s fallacies and yet achieves the same conclusion. This incompatibility occurs when we consider quantum entanglement. When two particles are not entangled, their states can be independently defined and thus, in a sense, the species-state object could be placed in space-time independent of anything but a chosen frame of reference, notwithstanding Whitehead’s admonition. The states can “follow along” with their respective particles. However, if these particles are entangled, then their states are not separable and must be considered as components of a singleton system state to fully capture the physical outcome of subsequent interactions. But if the states cannot be somehow attached to the particles independently, where exactly does the state itself reside? The formalism doesn’t allow for a broken lovers’ locket style of separability that could be rejoined when again they meet. Whitehead would give us a tut-tut and point to the physicist’s notebook, properly placing the map in the atlas rather than in the territory. But to the particles themselves, it would seem that they must somehow “remember” their previous tryst. Taken to the extreme, every particle must keep accurate records of every encounter it has ever made with every other particle. The substantialist ontology has no provision for such a bookkeeping in situ, yet the relationship has been empirically demonstrated as Einstein’s “spooky action at a distance” in action. The only reason it appears as an apparition to Einstein is precisely because the ontology he works from is inadequate to account for all observable phenomena. The metaphysics assumed by the Substantialist’s simple particle formalism does not follow from the model itself.
The substantialist reification of a ‘hairless’ particle is based on the success of the Cartesian/Newtonian/Maxwellian technique of describing particles in a simple system of coordinates and applying a set of field equations to predict outcomes. This is an example of Whitehead’s admonition against the narrowness in the selection of evidence, although in his use he is referring to the excision of spiritual experience. This is an example consequence of the Fallacy of Misplaced Concreteness in that there exists a cognitive dissonance between the Substantialist conception and observable phenomena which gives rise to the durable meme of the quantum world’s weirdness.
It is clear from the entanglement observations that while the formalism of an extrinsic system state may be useful for calculations, the ontology must be supplanted by one that can present the history of a putative particle when it comes time to reconcile that history with the historical lines presented by other putative particles at the nexus of an event. Such an ontology is postulated by Whitehead’s actual occasions, whose complete concrescent histories perfectly embody the state information needed to properly account for the historical strains. Here we have an ontology of records that does not need a physicist on hand with pad and paper. For the time being, however, we will continue to work from the Substantialist model until we arrive at a congruence with Whitehead’s model.
The term ‘event’ in physics has classically been used to denote a collision between objects or the absorption or emission of a photon. A simple extension of the Substantialist paradigm accounted for quantum uncertainty by recasting the notion of an ‘event’ as the collapse of the wave function. Without any apparent causal chain to explain why the wave collapses in the spacetime location that it does, physicists have stopped short of making any metaphysical claims beyond a stochastic explanation. All that is known about an event is the subsequent states of the interacting particles. The paradigm holds that the states have been completely determined during the event and that no further accounting needs to be done. The states are then a degenerate outcome, a collapse of all that fed into the event into the new states. But as stated before, this is insufficient to properly ground a metaphysics of entanglement. There is no ontological residence in which all the necessary information can repose.
If we accept that an event between two entities entails the reconciliation of their respective histories, we have to consider what it means for there to be unreconciled details, or, for that matter, what it even means for there to be a difference in the histories. If we start with an event that gives rise to a pair of entangled particles, we can rightly say that at the point of departure the particles have a common history. Going forward, the two particles will diverge in their encounters and thus their respective chains of events will emboss different sets of histories from their differing social environments. When finally the particles reunite in a subsequent event, that event must reflect that these are not independent particles, but that they are constituted by histories that are partially identical and partially different. In the classic conception of a particle, the fact that they share a partially common history can be described using an entangled system state that is an adjunct to the pair of particles. But now we will consider that a particle can carry its historical record, and so no such extrinsic state overlay is needed. Going forth from the conciliatory event, the particles merely continue to carry that same portion of history that is common.
There is, however, the portion of the histories that is not shared. Here we must consider what it means for a particle to encounter another particle with a vastly different history. And in particular, we must consider that our two entangled particles may have both encountered a third particle, but in separate events. Somehow the nature of the third particle must be accounted for in the conciliatory event along with the common histories of our entangled pair. So too, a single particle may interact with a second particle numerous times, leading to numerous segments of common history being carried in the particle’s records. The only way this can happen using an intrinsic historical record is for a particle to ‘take on’ the histories of every particle it ever encounters. Our simple particle has gone from a few degenerate state variables to embodying a vast network of histories with complex common and divergent pathways.
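The bookkeeping described above can be made concrete with a toy model (illustrative only; the class and names are my own, not Whitehead’s or physics’): each particle carries a set of event records, and every interaction merges the records of both participants, so histories propagate transitively through the web of encounters.

```python
# A toy model of particles that carry their own historical records.
# Each interaction mints a fresh event id; both participants absorb
# the union of each other's histories plus the new event.
import itertools

_event_ids = itertools.count()

class Particle:
    def __init__(self):
        self.history = set()  # event ids this particle has "lived through"

    def interact(self, other):
        event = next(_event_ids)
        merged = self.history | other.history | {event}
        self.history = set(merged)
        other.history = set(merged)
        return event

p, q, r = Particle(), Particle(), Particle()
p.interact(q)  # p and q now share a common history
q.interact(r)  # q takes on r's encounter...
p.interact(q)  # ...and the conciliatory event hands it on to p
print(sorted(p.history))  # p now carries a trace of r, though they never met
```

Even in this crude sketch the records grow monotonically with every encounter, which is precisely the explosion the paragraph above describes.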
At this point it may be useful to drop the conception of a particle entirely and focus on the events between them. Our original concept of a particle, with only a few intrinsic state variables involved in the workings of an event, made ontological sense. It travels from event to event, its state getting updated according to simple rules working on a short list of parameters. With our new conception of history-keeping however, a particle looks more like a messenger carrying every bit of knowledge of the universe known at one event to feed into a new event. The shorthand of simple state can’t convey the subtlety of even a single entanglement, let alone 13.7 billion years of encounters. Our concept of a particle may still be valid, but it doesn’t add anything to our model except some notion of conveyance. Lost is the simple substance that made the concept so attractive, both as a practical means of calculation and as a metaphysical object. And as an approximation, a system of simple particles with adjunct system state tables is entirely useful for predictive modeling. It should be sobering to realize, however, that in the field of quantum computing, where the adjunct state table may span a mere thousands of entangled particles, it is not possible for a classical algorithmic computer to model the system, not given billions of years and a galaxy’s worth of matter to build the computer.
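The scaling claim above is simple arithmetic: a system of n entangled qubits requires 2**n complex amplitudes in its state table. A back-of-envelope check, with deliberately generous (and hypothetical) assumptions about classical storage:

```python
# n entangled qubits need 2**n amplitudes; even absurdly generous
# classical resources fall short long before n = 1000.
n = 1000
amplitudes = 2 ** n            # entries in the adjunct state table (~10**301)

atoms_in_galaxy = 10 ** 68     # rough order-of-magnitude estimate
bits_per_atom = 10 ** 6        # wildly optimistic storage density
storage_bits = atoms_in_galaxy * bits_per_atom

print(amplitudes > storage_bits)  # True: the table cannot even be written down
```

The gap is not one of engineering but of hundreds of orders of magnitude, which is why the extrinsic state table stops being a harmless formalism at this scale.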
If however, we look at what it means to be an event, we can now posit that it is a unique combination of histories of the universe - a perspective on some of what has transpired so far. An event in its formation takes in other events - other histories - and combines them into a single history which is constituted by a multiplicity of ingressing histories. We might go so far as to describe an event as an experience of the universe, a universe composed entirely of other events. This last point is a metaphysical watershed. Just what is a universe of such events? If an event is an experience of a portion of the universe, and the universe is composed entirely of events, then the universe is the set of all experiences of itself.
The term ‘event’ however has the drawback of connoting something that is localized in space and time. In our conception of a web of interconnected events, we haven’t made any requirement as to such location, even though we are keeping in the back of our mind that events obey a sense of order that we have classically understood to be a consequence of temporality. Accordingly, it is useful to drop the term ‘event’ and use the term ‘experience’ instead. This allows us to focus on the metaphysical aspects of our concepts and defer the inevitable reconciliation with the substantialist understanding of a spatio-temporal ground of being. It also allows us to begin considering what it means for our human sense of experience to be a composite of trillions of subatomic event-experiences.
Returning to the notion of a universe that is composed of the set of all experiences of itself, we note that such a set as described does not comprise the set of all possible experiences, only those which have actually occurred. But we know that new experiences are occurring, at least as we see it in our temporal mode of existence, so the universal set of experiences is increasing. But this implies that the universe is drawing on a pool of possible, not yet actualized experiences, somehow converting them into real experiences. Our universe is thus more than just its history. It includes something of possibility, but a possibility limited by what has already transpired. There is a relationship between what has occurred and what is possible that binds these into a coherent framework. Without such a relationship we could not experience anything of durability, just a random haze devoid of meaning.
It becomes useful at this point to define Reality as the set of actualized experiences, Potential as the set of possible, but as yet unrealized experiences, and Process as the mechanism by which Potential is converted to Reality. So far, we can say the universe is composed of these three: Reality, Potential, and Process. Note that there is no expression of the Laws of Physics in this formulation, no Standard Model, no General Relativity, and no grounding spatio-temporal context with which we express those concepts. These will have to be accounted for in the explanation of Process, or at least demonstrated to be emergent phenomena of Process. However, the formulation so far is able to metaphysically accommodate what we know of entanglement in a way that the prevailing paradigm does not.
Our new notion of experience and how experiences flow one into the other is a rudimentary version of Process. This version has been derived from the substantialist notion of a particle that travels from one event to another, which in turn was framed in a context of space-time embedding. We would like to understand this in the reverse direction, starting from Process as ‘experience absorbing experience as a set of historical traces’ and arriving at the ‘event-particle-event within spacetime’ ontology. We do not as yet have anything in our new model that might do this, so we will add a dollop of dimensionality as a measure of the difference between two ingressing experience-histories. This is not to say that we are adding anything ontological, but rather that we can reduce the complex differential history to a degenerate form of simple parameters that are congruent to the familiar classical forms of spatio-temporal measure. Other classic measures such as momentum may also yield to this reductive abstraction. Again, the true metaphysical process is the ingression and reconciliation of a vast historical difference, which we can describe in the degenerate shorthand of Cartesian/Newtonian/Maxwellian/Einsteinian/Schrodingerian terms, as long as we are looking at “disjoint” substance. A simple analogy would be to consider the trace of a person walking around a city before stopping at a cafe on the corner. The event end-points of departure and arrival delineate an interval which can be described as a simple spatio-temporal vector of translation, while the true path is a long series of translations down one path after another. The person’s experience may include a great many facets of their journey such as people they met, but our observation of the endpoints sees only simple translation in space and time. 
We might be amazed that at some later date we discover that our person knows someone else that we know, but that is because we have elided all of their experience along their journey. Such is the way Substantialism elides the full experience of Process. Until the hard empirical results of entanglement were observed, the paradigm could push such radical empiricism aside to concentrate on events between particles.
However, in order for this reverse transform to be complete we need something of an atomic unit of spatio-temporal translation that can be ascribed as a feature of the experience to experience movement. We can find this in the Planck interval, a minimal spatio-temporal unit of measure constructed from the fundamental constants, which the Uncertainty Principle renders a threshold of indeterminacy. Anything is possible as long as it occurs within a space smaller than this interval. For example, a so-called virtual particle can pop into existence, interact with objects more macro than this interval, then pop back to the aether. The Planck interval is essentially a threshold between the Real and the Possible. The entire structure of Quantum Field Theory, the Standard Model, and the spacetime/information calculations of black holes operates with this threshold in mind. Precision measurements of fundamental constants demonstrate that whatever is possible beneath or within this threshold must be considered when predicting those constant values in the “real” world. Using this fundamental interval as the most atomic quantum of spacetime that can be carried with an experience to experience Process, we can imagine the macro scale spatiotemporal ground of the substantialist paradigm as an emergent phenomenon of the reconciliation of experience histories.
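For the record, the Planck interval invoked above is fixed entirely by the three fundamental constants, with no free parameters. A short derivation, using standard CODATA-style values:

```python
# Deriving the Planck length and Planck time from the fundamental
# constants; these set the scale of the "Planck interval" in the text.
from math import sqrt

hbar = 1.054571817e-34  # reduced Planck constant, J*s
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8     # speed of light, m/s

planck_length = sqrt(hbar * G / c**3)  # ~1.616e-35 m
planck_time   = planck_length / c      # ~5.39e-44 s
print(planck_length, planck_time)
```

That the interval is built from gravity (G), quantum action (hbar), and relativity (c) at once is why it plausibly marks the floor beneath which the substantialist spatio-temporal apparatus loses its grip.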
We now have a pair of transforms between Substance and Process that are useful for understanding how these systems relate at the atomic level. With the paradigm of an experience-history ontology we have gained an intrinsic property that accounts for quantum entanglement. By placing the Planck interval somewhere in the event to event process we have gained a mechanism for emergent spacetime that was not apparent in the substance paradigm. But we haven’t yet included all of the features of Whitehead’s concrescence model, in particular those that derive from having a Mental-Physical polarity in which concrescence occurs. Neither have we managed to depart from a deterministic flow of reality, having only replaced event-substance-event with experience-combination-experience. There is no intrinsic requirement for choice to enter into either paradigm to make it work. We now turn to how to add the component of free will in a coherent way that will buy us a more secure justification for the inclusion of the Planck interval.
In both substance and process models an element of indeterminism is included. In the substance ontology this shows up in the Schrodinger Wave Equation. Here we find the Planck interval forcing an incomplete description of the outcome of a prediction. Squeeze one state parameter to an arbitrary certainty and its complementary state will flow into an inversely proportional indeterminate extent. The Copenhagen interpretation of the wave equation’s spread is that it represents a probability density. This is experimentally verified through a large number of trials which, in aggregate, show a likelihood map of outcomes. However, the interpretation does not apply to a singleton event, which by definition has a 100% probability of occurring as it has, regardless of any prior field of unknown outcomes. In the usual application of wave mechanics, a world of possibility is opened up as a particle proceeds away from an event. The notion here is one of projection, one of a shape of potential determined by the past. There is only the Real of what has happened in the past and a tightly bound extension into the future which remains purely stochastic even as a contoured range of possibility. Most importantly, this is seen as a purely atomic level phenomenon, not extensible to macro objects. It takes very little complexity in a system of particles to bring about quantum decoherence, a process whereby the indeterminism is squeezed out by the bulk properties of the system. This can be directly traced to the reification of the degenerate state variables of simple particles. The more “random” interactions in a complex system, the less traceable are the myriad interconnected relationships between the particles. When we ignore those relationships, we can identify a state variable such as polarization as taking on a value from 0 to pi. Like our example person perambulating the city, we ignore their excursion with all its interconnections and find them either at home or at the cafe.
It is neither convenient nor practical to keep track of the radically interconnected experience of a particle so we create an abstraction to make it manageable.
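The Copenhagen point two paragraphs back, that the probability density only shows itself in aggregate while a single trial is simply one definite outcome, can be sketched in a few lines (the 0.2 amplitude-squared value is an arbitrary choice for illustration):

```python
# A qubit whose |0> outcome has Born probability 0.2: many trials trace
# out the density; any single trial is just one definite result.
import random
random.seed(42)  # fixed seed so the sketch is reproducible

p0 = 0.2  # |amplitude_0|^2 under the Born rule
trials = 100_000
hits = sum(1 for _ in range(trials) if random.random() < p0)

print(hits / trials)            # converges toward 0.2 in aggregate
single = random.random() < p0   # one trial: a bare fact, not a density
print(single)
```

Nothing in the individual draw carries the distribution with it; the likelihood map exists only across the ensemble, which is exactly the gap the essay is pressing on.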
Similarly, when we attempt to find an abstract model based on this paradigm that could account for free will and devise experiments to demonstrate its existence, we discover that an intensely complex system of equations would be needed to translate simple atomic state variables into a predictive framework that can be measured. The Substantialist habit of narrowness in the selection of evidence precludes the most obvious demonstration of our own agency. This habit is declared as metaphysically foundational for scientific epistemology. The strong abhorrence of Rationalism, born as Scientism grew, places empiricism prior to conception. But the empirical method of Scientism is firmly cemented in the Real and does not extend to the Possible. The evidence of quantum indeterminism cannot support more than a stochastic interpretation because the empirical method is woefully inadequate for making determinations beyond such an interpretation. To be sure, it is difficult to imagine how an urge of free will would translate to subtle shifts in the projected probability wave of a simple event-particle ontology. That the intention, “I want to go there” should somehow cause trillions of particle interactions and wave collapses within my body to coherently bend in just such a fashion as to effect my desired movement seems ludicrous. Instead, the paradigm asserts a hard causal determinism, implying that the intention is determined as well.
Turning to the Process model, we have similarly placed something akin to the Planck interval somewhere within it. Again, we consider that it represents some threshold between the Real and the Potential. In this case, our field of the possible contains experiences that have not been actualized. If we take a similar approach to the Substantialist model, we can consider something like a wave equation that projects forward from actual experiences into the field of possible experiences. Now however, we aren’t considering the stochastic selection of an isolated event with a limited effect upon the degenerate states of a pair of particles, we are considering the selection of an entirely new perspective on the universe - a new experience of the universe in self-reflection. This is not the simple advance of an isolated event whose selection and effect seem of a vastly distant scale from the complex sense of agency that we experience as humans. Instead, the experience model forces a complete inversion of the complexity scale, wherein a single atomic experience is vastly more complex than the simple, isolated experience of a single human as we are accustomed to. This is another metaphysical watershed.
How is it that we don’t experience the vastness of the universe if we are composed of trillions of atomic experiences, each having access to the fullness of the universe? The answer lies in how all those atomic experiences are composed. First and foremost we must understand what it means for two experiences to be combined in a coherent way. It is not a simple sum, but rather a reinforcement of those parts of the histories that are common and a dissonance where they are not. In a sense, we can consider the common parts to be a “truth” in that all of the combining experiences are in agreement that this is the way the universe has unfolded so far. This sense of truth is not a binary determination but rather has a value representing the degree to which the histories agree. Histories that run deep form intense “habits” to use Whitehead’s terminology. These are very hard to steer away from because so much of our composite experience is based upon them. At the other end of the spectrum are those histories that have run divergent paths and are not so ingrained in our composite experience. In fact they may be entirely at odds with each other, representing a composite path that cannot remain in dissonance for very long before the tide swings one way or the other. This is perhaps the source from which the idea of Schrodinger’s Cat arises.
Considering what I experience of the universe, I can see that I have a localized perspective, that my “view” is occluded by objects that intervene between myself and objects beyond them, and I have a sense of durability and continuity. Yet I also have a sense of freedom to move within the constraints of physicality. Looking at how all of the atomic experiences flow into my own local society of experiences, I can consider my current experience and that of an intervening object as they come together. The object’s experience contains not only its experience of me and all the experiences that lie between us, but also its experiences of occluded objects, along with how the experience of those objects is transformed by the intervening object’s full experience of being in between myself and the occluded objects. I get a measure of the distant object along with the intervening object’s measure. Combined, these must cancel in a manner that occludes my experience of the distant objects, yet still maintain something of history so that the distant objects yet exist in my universe. The key here is that these are not really objects as we normally think of them in a Substantialist way. Rather they are unique experiences of the universe in full complexity, some parts fully coherent and some incompatible, some reinforcing and some canceling. It is from this complex combinatorial process that our experience becomes localized.
We now return to where in the realms of Real and Potential these dissonant strains of experiences reside. It doesn’t take much thought to realize that they should be placed within a spectrum along an axis between the Real and the Potential, or what Whitehead calls the Physical and Mental poles. To the degree that the experiences are coherent, they are located closer to the Real pole. To the degree that they are incoherent and therefore undecided, they are deeper into the Potential pole. This brings up the curious observation that there is a space in which the Potential gets sorted out into the Real, and this space is what we have brought in as the Process version of the Planck interval. In the Substantialist model we saw that in order to make accurate predictions we must consider all the possible events that can arise behind Planck’s kimono. In the Process model, we can similarly sort through all the possible experiences to find what is most coherent, and when these coalesce into hardening experiences closer to the Real pole, they carry along with them a morsel of the Planck interval within which that sorting occurred. Behind Planck’s kimono is a timeless and spaceless realm of possible experiences, each of which contains a unique perspective on the entire universe. That is quite an extensive abstract field, radically vaster than the residual, hardened Reality we believe we exist within.
The process of selection of a future is the essence of free will. Our experience of what it means to formulate an intention and see it through maps perfectly onto the Process model, which posits possible future experiences and charts a path through them. This mapping is far more intuitive than imagining how an extremely complex substance ontology could support a similar ease of congruence. In addition, the possible futures allowed at the atomic level in the substance model have nothing to do with meaning as we understand it. This has caused no end of consternation in philosophical discourse, which has attempted to bond free will to wave collapse, the only non-deterministic element of the paradigm.
I have attempted to replicate the foundations of Whitehead’s process by working from a deficiency in the metaphysics implied by quantum mechanics. By merely adding the requirement that quantum entanglement be properly ensconced in ontological entities rather than kept track of in an extrinsic formalism, I have shown that something akin to Whitehead’s conception naturally falls out. I have not covered much that is needed to fully flesh out Whitehead’s model, such as a proper analysis of how light cones or occlusions emerge, or the effect of embedding experience within a protecting organism, or eternal objects, and ultimately the role and primal motive of God.