Whitehead’s Radically Empirical Theory of General Relativity

“The doctrine of relativity affects every branch of natural science, not excluding the biological sciences. . . . Relativity, in the form of novel formulae relating time and space, first developed in connection with electromagnetism. . . . Einstein then proceeded to show its bearing on the formulae for gravitation. It so happens therefore that owing to the circumstances of its origin a very general doctrine is linked with two special applications.”
–Whitehead (The Principle of Relativity, 3).

One of the biggest surprises for me upon reading Auxier and Herstein’s book The Quantum of Explanation was learning that Whitehead’s theory of extension (or “mereotopology” as it has come to be called) has been taken up by computer scientists working in the field of robotic vision (see for example the work of Ian Pratt-Hartmann).

“It is a widely acknowledged fact in this sub-discipline that Alfred North Whitehead’s work on extension is foundational for their enterprise. Our experience has been that Whitehead scholars are simply astounded to learn of this fact. Yet we should have expected and even predicted such a connection” (QE 90).

Guilty as charged. While I think I got things mostly right in section 3.2 of my dissertation (“From Geometric Conditions of Possibility to Genetic Conditions of Actuality”), the promising application of Whitehead’s topological scheme to robotic vision certainly brings this aspect of his project into sharper focus for me. As a radical empiricist, Whitehead was searching for a formal account of our concrete experience of projectively related extensa. We are finite creatures with limited sensory organs and processing capacity. We do not experience the world of spatial relations in terms of infinitesimal points or the geometrical schemes built up from such points. Rather, what we encounter in our immediate experiential field are the intuitive whole-part relational structures formalized by non-metrical projective geometry.
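
To make the connection to robotic vision concrete, here is a minimal sketch (my own illustration, not drawn from QE or from Pratt-Hartmann's work) of the kind of mereotopology that descends from Whitehead's theory of extension: regions rather than points are taken as primitive, and part-whole and contact relations are derived from a single relation of "connection," much as in the Region Connection Calculus used in qualitative spatial reasoning.

```python
# Toy Whitehead-style mereotopology: regions are finite sets of grid cells,
# and every spatial relation is derived from one primitive, "connection"
# (cells coinciding or touching), rather than from dimensionless points.

def connected(x, y):
    """C(x, y): regions connect if any of their cells coincide or are adjacent."""
    return any(abs(a[0] - b[0]) <= 1 and abs(a[1] - b[1]) <= 1
               for a in x for b in y)

def overlaps(x, y):
    """O(x, y): regions overlap if they share a common cell (a common part)."""
    return bool(x & y)

def part_of(x, y):
    """P(x, y): simplified here to cell containment, standing in for the usual
    definition 'everything connected to x is also connected to y'."""
    return x <= y

def externally_connected(x, y):
    """EC(x, y): in contact but sharing no common part (touching, not overlapping)."""
    return connected(x, y) and not overlaps(x, y)

# Two "rooms" that share a wall: in contact, but neither is part of the other.
room_a = {(0, 0), (0, 1), (1, 0), (1, 1)}
room_b = {(2, 0), (2, 1), (3, 0), (3, 1)}
print(externally_connected(room_a, room_b))  # True
print(part_of({(0, 0)}, room_a))             # True
```

The point of the toy is the order of definition: whole-part structure is recovered from the connection of extended regions rather than assembled out of extensionless points, which is precisely the reversal Whitehead was after.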

Following Einstein’s articulation of the special and general theories of relativity (in 1905 and 1916, respectively), and his problematic “mono-metric” identification of a 4-D geometrical model with physical space-time*, Whitehead pursued his theory of extension with renewed urgency. Somehow, the uniformity of spatial geometry had to be preserved, else scientific measurement would become impossible. Einstein did not appear to realize that allowing the contingent warping of space by massive objects undermined the fundamental logical requirement of measurement: that space have a necessary and universal structure (or, as Auxier and Herstein put it, “we must have a standard unit of spatial comparison for conjugacy…and standard(s) of spatial projection” so as to bring this unit into comparison with whatever we are trying to measure [QE 102]). By collapsing the difference between physical space and his favored geometrical scheme, Einstein made the structure of spatial geometry contingent upon randomly arrayed masses.

“We must know the complete distribution of matter and energy in the universe prior to knowing its geometry. But we must have a comprehensive grasp of this geometry in order to discover this distribution. As Whitehead pointed out, with General Relativity as our theory of space and gravity, we are saddled with a situation where we must first know everything before we can know anything” (QE 104).  

Einstein’s “mono-metric” model has been one of the most successful in the history of science. But because of unexpected observations of the rotational velocities of galaxies and of the accelerating expansion of the cosmos, its theoretical supremacy has begun to be seriously questioned. Some astrophysicists have attempted to save the theory by inventing “dark matter” and “dark energy” to supply the missing mass and energy that would bring observations back into agreement with Einstein’s theory. Auxier and Herstein refer to these inventions as “an especially unhappy piece of nonsense” (QE 20). I’m sympathetic, but I wouldn’t go quite that far. To my mind, these invented entities are akin to the epicycles of Ptolemaic astronomy. In other words, these exotic and invisible forms of mass/energy (which supposedly compose ~96% of the universe) are postulated ad hoc in an attempt to “save the appearances” (as ancient astronomers used to say). Ancient astronomers were tasked by Plato with explaining the seemingly erratic motion of the planets in terms of a theoretical model composed only of uniform circular motions. When new planetary observations conflicted with the model, more circles were added (epicycles) to bring the model back into alignment with appearances. One view of science is that it is just about refining existing theoretical presuppositions to fit new observations, gradually approaching a perfect identity between model and reality. In this sense, the addition of epicycles to match observations could continue indefinitely. After all, Ptolemy’s geocentric model was more accurate than Copernicus’ heliocentric model (which itself still required epicycles until Kepler replaced its circles with ellipses and Newton supplied the underlying dynamics). The geocentric model is still accurate enough that modern planetarium projectors (invented in the 1920s by a company in Jena, Germany) continue to utilize it, reproducing Ptolemy’s deferents and epicycles with their internal gears and motors.
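
To see why this patching can go on forever (a gloss of mine, not Auxier and Herstein's): in modern terms a deferent with epicycles is just a sum of uniformly rotating circles, i.e., a truncated Fourier series for the planet's apparent position,

$$ z(t) \;=\; R\,e^{i\omega_0 t} + r_1\,e^{i\omega_1 t} + r_2\,e^{i\omega_2 t} + \cdots $$

and any reasonably well-behaved periodic motion can be approximated to arbitrary accuracy by adding further terms. The model can always be tuned to fit new observations; predictive accuracy alone never forces a change of paradigm.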


But as Karl Popper taught us, scientific theories must be subject to empirical falsification. The eternal circular orbits of Ptolemy’s model fall out of phase with the long-term evolution of planetary orbits, while the (updated) heliocentric model accommodates this evolution well. As Thomas Kuhn, another great philosopher of science, taught us, the history of science is not just about the gradual refinement of old theories to fit new observations in an asymptotic convergence of model to reality; rather, this history is also characterized by periods of revolutionary crisis as aging paradigms are supplanted by deeper, wider, more elegant and inclusive explanatory perspectives. Einstein’s genius was to bring the reigning Newtonian theory of gravity into alignment with Maxwell’s theory of electromagnetism. A deeper theory of space was born. But in a sense, despite many other successful observational predictions, empirical falsification is exactly what happened to Einstein’s gravitational theory when it failed to accurately predict the observed rotational velocity of galaxies. However, because this darling model had made a number of other accurate predictions, and because no widely accepted alternative paradigm was on hand, astrophysicists decided to fudge the numbers by inventing new free parameters, new epicycles, to bring the theory back into alignment with observations. Appearances were thereby saved, but at the cost of conjuring into existence an entire universe (or 96% of one, at least) of cold and dark, that is, unobservable, matter/energy.
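
To put the anomaly in rough quantitative terms (my gloss, not the book's): in the weak-field limit relevant to galactic dynamics, a star on a circular orbit of radius r should move at roughly

$$ v(r) \;\approx\; \sqrt{\frac{G\,M(r)}{r}}, $$

where M(r) is the mass enclosed within the orbit. Beyond the luminous disk, M(r) should level off and v should fall off as $1/\sqrt{r}$; instead, measured rotation curves stay roughly flat out to large radii, which on this formula requires M(r) to keep growing with r. "Dark matter" names that unseen remainder.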

Even though he did formulate a “bimetric” alternative in 1922 (QE 109), Whitehead’s problem is not with Einstein’s model. This isn’t a “scientists have been wrong before, so why should we trust them now?” argument. Science is about modeling. In some sense, scientific models are always wrong. That’s the name of the game, after all: build a model and throw it against reality until it breaks. Then study why it broke until you find a new model that doesn’t break as quickly. Gradually, more robust, inclusive models emerge. Rather, Whitehead’s problem is with the philosophically naive “model-centrism” that leads scientists to equate their favored model with reality in a dogmatically literalistic way. We should never assume the reigning physical models of the universe offer a final account of the way things are (especially when today’s two most successful models, relativity and quantum theory, remain irreconcilable). Science is not ontology: science is a method of inquiry involving the making and breaking of toy models.

The dogmatic equation of a favored geometrical model with physical reality not only undermined the logical basis of measurement; it also led Einstein to dismiss our concrete experience of an irreversible flow of time as nothing more than a “stubbornly persistent illusion.” This is Whitehead’s famous “fallacy of misplaced concreteness” writ large. Einstein’s unquestioned commitment to the classical “spectator theory of knowledge” prevented him from grasping the profoundly relational implications of his new theory of space. He upheld the old Galilean-Cartesian view of a bifurcated Nature, construing our consciousness as somehow external to a cosmos that we can only ever confusedly experience. Whitehead offers an alternative, fully relational epistemology and ontology that re-embeds experience in the cosmos: we are creative participants in a cosmogenetic relational nexus.

Instead of rushing to eliminate experience from our understanding of a relativistic (or relational) reality, Whitehead carefully examined the hidden epistemic presuppositions and metaphysical requirements of Einstein’s more specific application of relativity to the physics of light and gravitation. The result of his examination was eventually assembled in Process and Reality as the fourth category of explanation, a truly general principle of relativity: “it belongs to the nature of a ‘being’ that it is a potential for every ‘becoming’” (PR 22). Obviously, the importance of Whitehead’s fourth category of explanation (one of twenty-seven such categories) can only be understood within the total gestalt of his categoreal scheme (which includes the category of the ultimate: Creativity; eight categories of existence, among which the most important are eternal objects and actual occasions; and nine categories of obligation). Whitehead’s categoreal scheme is laid out in Part I of Process and Reality as something like an opening credit roll listing the conceptual dramatis personae who, in Part II, will take the stage to exemplify their adequacy. But I’m not going to run through the whole dress rehearsal right now (for a helpful exegesis of Whitehead’s first four categories of explanation, see pgs. 108-110 of QE). Suffice it to say that Whitehead’s principle of relativity expresses the truth that everything co-exists in a web of relatedness, whether actually or potentially.

Auxier and Herstein:

“This is the principle that Einstein and his devotees have abandoned: not the mathematical expression of their physical model; that model is itself only an application of what has become the standard dogma of orthodox cosmology, with its narrowly defined approach to the interpretation of a truncated representation of experience. Rather, physical cosmology has left behind the full principle of relativity and its unqualified commitment to the incurable relatedness of the real. That abandonment comes in the truncation of experience at the root of their largely unexpressed theory of experience [i.e., the theory of the bifurcation of Nature]. For one cannot have a universal principle of relativity—applicable to all that is real—unless one takes experience in its real, relational totality. Experience—both actual and potential—is exactly the kind of reality that falls under the principle of relativity. One cannot take the metaphysical principle of relativity seriously unless one is a radical empiricist” (QE 110).

In The Quantum of Explanation, Auxier and Herstein have brilliantly succeeded in elucidating the features of a radically empirical cosmology. As Whitehead reminds us early and often in Process and Reality, the purpose of philosophy is not to explain away the existence of the concrete by reduction to the abstract, but to explain the emergence of abstraction from concretion. The proper questions are: how does concrete fact participate in general form and how are general forms exemplified in concrete facts?

For a longer discussion of Whitehead’s radical empiricism a.k.a. relational realism, see my essay “Retrieving Realism: A Whiteheadian Wager.”


*It has been brought to my attention that the matter of whether Einstein thought the physics of gravitation is reducible to the geometry of space-time is not so clear cut. See for example: “Why Einstein did not believe that general relativity geometrizes gravity” by Lehmkuhl. The research continues… 

“Retrieving Realism: A Whiteheadian Wager” published in IJTS

Retrieving Realism: A Whiteheadian Wager (PDF)

Published in International Journal of Transpersonal Studies, Volume 36, Issue 1 (2017)

Abstract: This essay argues that the organic realism of Alfred North Whitehead (1861-1947) provides a viable alternative to anti-realist tendencies in modern and postmodern philosophy since Descartes. The metaphysical merits of Whitehead’s philosophy of organism are unpacked in conversation with Hubert Dreyfus and Charles Taylor’s recent book Retrieving Realism (2015). Like Dreyfus and Taylor’s, Whitehead’s philosophical project was motivated by a desire to heal the modern epistemic wound separating soul from world in order to put human consciousness back into meaningful contact with reality. While Dreyfus and Taylor’s book succeeds in articulating the problem cogently, its still-too-phenomenological answer remains ontologically unsatisfying. Whitehead’s process-relational approach invites philosophy to move closer to a real solution.

Pluralism as the Choreography of Coexistence, with William James and Co.

There’s been quite an uproar recently across the philosophy blogosphere regarding the possibility of a pluralist ontology (see Critical Animal’s recap of this cross-blog event). The multitude of angles being offered got me thinking, and eventually sent me back to William James’ A Pluralistic Universe, from which I quote below (lecture 1):

The theological machinery that spoke so livingly to our ancestors, with its finite age of the world, its creation out of nothing, its juridical morality and eschatology, its relish for rewards and punishments, its treatment of God as an external contriver, an ‘intelligent and moral governor,’ sounds as odd to most of us as if it were some outlandish savage religion. The vaster vistas which scientific evolutionism has opened, and the rising tide of social democratic ideals, have changed the type of our imagination, and the older monarchical theism is obsolete or obsolescent. The place of the divine in the world must be more organic and intimate. An external creator and his institutions may still be verbally confessed at Church in formulas that linger by their mere inertia, but the life is out of them, we avoid dwelling on them, the sincere heart of us is elsewhere. I shall leave cynical materialism entirely out of our discussion as not calling for treatment before this present audience, and I shall ignore old-fashioned dualistic theism for the same reason. Our contemporary mind having once for all grasped the possibility of a more intimate Weltanschauung, the only opinions quite worthy of arresting our attention will fall within the general scope of what may roughly be called the pantheistic field of vision, the vision of God as the indwelling divine rather than the external creator, and of human life as part and parcel of that deep reality.

As we have found that spiritualism in general breaks into a more intimate and a less intimate species, so the more intimate species itself breaks into two subspecies, of which the one is more monistic, the other more pluralistic in form. I say in form, for our vocabulary gets unmanageable if we don’t distinguish between form and substance here. The inner life of things must be substantially akin anyhow to the tenderer parts of man’s nature in any spiritualistic philosophy. The word ‘intimacy’ probably covers the essential difference. Materialism holds the foreign in things to be more primary and lasting, it sends us to a lonely corner with our intimacy. The brutal aspects overlap and outwear; refinement has the feebler and more ephemeral hold on reality.

From a pragmatic point of view the difference between living against a background of foreignness and one of intimacy means the difference between a general habit of wariness and one of trust. One might call it a social difference, for after all, the common socius of us all is the great universe whose children we are. If materialistic, we must be suspicious of this socius, cautious, tense, on guard. If spiritualistic, we may give way, embrace, and keep no ultimate fear.

The contrast is rough enough, and can be cut across by all sorts of other divisions, drawn from other points of view than that of foreignness and intimacy. We have so many different businesses with nature that no one of them yields us an all-embracing clasp. The philosophic attempt to define nature so that no one’s business is left out, so that no one lies outside the door saying ‘Where do I come in?’ is sure in advance to fail. The most a philosophy can hope for is not to lock out any interest forever. No matter what doors it closes, it must leave other doors open for the interests which it neglects.

I must admit that a similar Jamesian existential need for intimacy is the common source of my panexperiential ontology, my aesthetic ethics, and my process theology. My enactive epistemology follows from a commitment to the sort of precursive trust that makes it possible to learn from my transactions with reality (=other beings). This means it is possible to be mistaken: to be mistaken is to fail to learn from a transaction with others. To learn from my transactions is to be in right epistemic relation with others. Learning becomes knowing as alliances between vastly different beings are built and maintained. The possibility of learning implies that my knowledge and the others I am trying to get to know remain always incomplete one to the other. I acknowledge from the get go that my knowledge of you could only ever be partial. So long as you do the same, we can continue to grow together, to learn from each other. But as soon as I pretend to know you entirely, in your true reality, learning ceases. I disown you, I steal your otherness and make it mine.

We do not come upon nature (=other organisms) as complete in itself, with a duty to unveil its truth. Nature loves to hide, to wear masks. She plays with us. Getting to know her requires more than just taking her at face value. We’ve got to play along to understand how she works, since who are we but more of her masks? Human knowing is not individual minds accessing a pre-given truth about reality; human knowing is composing a common world with others, most of whom are not human. Our knowing and our being is a “choreography of coexistence,” as Francisco Varela called it.

Check out these other recent posts on realist pluralism: Critical Animal on James, Agent Swarm on Latour, and Struggle Forever on political ontology.

…..

Below is a video I recorded 2.5 years ago while reading Isabelle Stengers’ Thinking With Whitehead. 

I dwell in particular on her reactivation of the Jamesian notion of precursive trust. I also discuss enactivist epistemology, which may help clarify my remarks above.

Here is the essay on Stengers and Whitehead I refer to at the end of the video: Thinking Etho-Ecology with Stengers and Whitehead.

Life in the Pluriverse: Towards a Realistic Pluralism

Levi Bryant recently called for a cross-blog discussion concerning what he perceives to be the problematic relationship between ethnographic pluralism and ontological realism. His call was instigated by Jeremy Trombley’s post on the so-called “ontological turn” in contemporary anthropology and ethnography. Trombley articulated what might be described as an ontology of the concept, wherein concepts are not representational frames that mirror (or fail to mirror) the world, but participatory interventions that dis- and/or re-assemble our thoughts and practices. Trombley writes:

“a concept or conceptual assemblage – ontology, feminism, queer theory, post-colonial theory, etc. – enables us to understand differently, and in understanding differently, it enables us to also be differently… What the ontological turn does is…[allow] us to reflect not only on the way we represent, but on the way that we exist and the kinds of relations we compose through our practices.”

Before I get into what such an anti-representationalist ontology of concepts does to our understanding of Truth (hint: Truth is not pre-given but enacted), I should mention a few other bloggers who have already jumped into the conversation. Phillip of the blog Circling Squares (which I need to explore more!) responded to Bryant’s original post by pointing out that thinkers like Latour and Stengers (and Whitehead before them) have been articulating a rather robust form of pluralistic realism for some time now (i.e., cosmopolitics). Terence Blake of Agent Swarm also chimed in, arguing that Bryant’s “realism” seems to be no more than old-school scientism, so it shouldn’t come as a surprise that it is so difficult to square with pluralism.

Bryant believes that the social constructionist turn of the 90s was politically valuable in that it improved the social standing of many oppressed minorities. But he rejects what he perceives to be the extension of such constructionism beyond politics into ontology. Bryant writes:

“In arguing that everything is a social construction, the pluralist undermines the possibility of public deliberation about truth. Everything becomes an optional narrative or story about the world, an optional picture of reality, where we are free to choose among the various options that most suit our taste.  It’s not a surprise that so much of the philosophy during the 90s in both phenomenology and post-structuralism culminated in a theological turn.  For where everything, including science, is just a narrative or story about what being is, why not just go ahead and take a leap of faith?”

I’m not sure if Bryant intends to include cosmopolitical thinkers like Latour and Stengers in his punching bag category “social constructionist.” I don’t understand how he could. If he does insist on labeling them as such (which seems to me to just obscure their true positions–but if he insists…), then, building on Whitehead’s categoreal scheme, I’d retort that “society” for these cosmopolitical thinkers has to be understood in the most general sense as an ontological category, not simply a human “construct.” The human organism is already a society of cells, each of which is itself a society of organelles, each of which is a society of molecules, each of which is a society of atoms, each of which is a society of protons, neutrons, and electrons, and so on… Realities are decomposed and recomposed by associations between and among actual occasions–occasions which are never simple unities but are always multiple and so always “in the making.” Which brings me to the concept of “construction”: if we are working within a process ontology, construction also needs to be ontologized. Biological evolution is a gradual process of construction wherein what begins as psychological desire later becomes physiological reality (to take the example of evolution by sexual selection). The physical world is itself continually constructed by what physicists are now calling “geometrogenesis.” This is not to say that the physical world is a human construct, mind you. The picture that is beginning to become clear as a result of contemporary physical cosmology is that space and time are the co-emergent products of the real activity of pure energy, something both non-human and pre-physical/pre-extended (Whitehead called it Creativity; physicists call it the quantum vacuum). If the physical world (as described by contemporary physics) is a network of relations always “in the making,” and not some collection of pre-given particles obeying eternal laws, then a “true” understanding of it must also always remain open-ended. There is no Science or Universal Reason that might once and for all pronounce upon the nature of the Real. There are many sciences, many methods, many rationalities. Science as it is actually practiced now and in the past has always already been a pluralistic enterprise. As Latour showed in Science in Action, what ends up being called “Nature” is always a consequence of some more or less temporary settlement of controversies. Every new generation of scientists stirs up new controversies about what the aging generation thought was settled.

The cosmopolitical perspective that I’d want to defend certainly does not “undermine the possibility of public deliberation about truth”–it is (once we accept an enactivist account of truth) the condition of its possibility! It is Bryant’s position that rules out such public deliberation by insisting on declaring war on all those human societies that reject materialism. Latour has plenty to say about the vacuity of the notion of “matter,” which I’ve discussed elsewhere and won’t get into here. Accepting a cosmopolitical form of ontological pluralism doesn’t at all require that we think of all beliefs and belief-systems as created equal. Nor does it imply that social groups “freely choose” their beliefs simply as a matter of “taste.” The ontological commitments of any given society typically emerge out of long multi-generational processes of historical development. They aren’t just made up on a whim by individual members. Further, the world view of a social group is as integral to their livelihood and well-being as their food, shelter, and water, not simply an optional aesthetic veneer. As Trombley suggested, belief-systems enact ways of being and are not just representations.

Ontological pluralism is a commitment to multiple realities, many of which overlap, but some of which remain (at least for now) irreconcilable. It is not a commitment to tolerance of multiple perspectives on a single reality. This latter option, as Bryant points out, would be a rather trivial form of pluralism. It is also a rather colonialist and scientistic take on the Real. Anyone trying to argue that contemporary science has somehow provided us with a unified account of an objective reality that holds true for all people in all places and times has their work cut out for them. Several hundred years of “modern” science has only succeeded in making the world stranger, more dangerous, and more multifarious than it was for ancient and medieval peoples.

Am I saying that an ayahuasca shaman’s encounter with the spirit of the jaguar is just as real as the particle physicist’s encounter with the Higgs boson? Yes, most definitely. In fact, the shaman’s encounter is way more concrete and direct than the physicist’s, since the latter has to wait for a world-wide network of supercomputers to process the information for him, which only after many repeated trials, journal publications, and so on becomes what most (but not every!) physicist will agree is something like a Higgs boson. Even after all this painstakingly detailed mediation (“science in the making”), the Higgs boson remains now and forever a theoretical construct. The ayahuasqueros’ encounter with the jaguar spirit is anything but. Sure, a cognitive neuroscientist might claim to be able to explain the shaman’s experience as a “brain malfunction” brought on by the ingestion of a psychedelic plant brew. But this remains a reductive etic description and not a complete explanation. The neuroscientist should participate in an ayahuasca ceremony for himself before he goes declaring war on the shaman. At least, this is what a pluralist ethics would entail. Such shamanic practices have functioned quite well in their own tribal context for thousands of years. Instead of assuming from the get go that anyone who doesn’t describe the world in your favored language is deluded, try to get to know them, to understand not only what their world is like, but how their world is brought forth. Follow the injunctions through which they enact their world. Then, once you’ve explored it from the inside, by all means judge their enactment, contest it, translate its features into other terms to show why it is unethical, dangerous, or misguided.

I’ll leave you with an excerpt from an essay of mine on the ethical implications of enactivism and the need for a pluralistic planetary mythos (Logos of a Living Earth):

One consequence of the enactive approach is that the Cartesian quest for epistemological certainty becomes but the expression of a particular cognitive domain made possible by the abstract languages of mathematics, precise measurements of machine technologies, and controlled laboratory environment. If the nervous system is operationally closed, its function cannot be to modestly mirror an external, objective reality, even if the modest witnesses are highly trained scientists allied with powerful instruments that extend their sensory reach. The operational closure of the nervous system forestalls a representational account of its activity, as its role is maintaining coherence, rather than correspondence, between organism and environment. New techniques may open up previously hidden worlds, as when Galileo first turned a telescope to the sky and revealed the moons of Jupiter in 1610, or Hooke first recognized cells through a microscope in 1665, but one cannot speak of finally discovering the real as if it existed independently of our bodily and inter-bodily experience of its meaning.

As Haraway has suggested (p. 199, 1997), “…objectivity is less about realism than about intersubjectivity.” She yearns for us to come to see objectivity as a way of “forming ties across wide distances” (ibid.), instead of as the privileged and modest perspective of self-invisible European men who remain somehow unpolluted by their ambiguously situated bodies (p. 23-32, ibid.). If science can claim relative epistemological privilege, it is not the result of transcending culture, but of the ever-accelerating, ever-expanding mobility and combinability of the traces scientists and their cyborg surrogates have constructed within their networks. Outside of these special networks of labs, machines, shared languages, and centrally controlled policy initiatives, scientific facts have little relevance. As Latour put it, “we might compare scientific facts to frozen fish: the cold chain that keeps them fresh must not be interrupted, however briefly” (p. 119, Latour, 1993).