I’m grateful to Tevin Naidu for getting Deacon and Levin together. They only had 90 minutes but still managed to cover a lot of territory, including where they overlap and where some tensions may exist. I first met Deacon back in 2011 during a lecture he gave on his then new book Incomplete Nature. Regular readers may not be surprised to learn that I asked him about Whitehead’s eternal objects and how his “absential” account of formal causality compared (yes, I’ve been pimping eternal objects for a long time). After reading his book, I went on to critically engage his emergentist explanation of the origins of form and aim in Physics of the World-Soul (2021).
Science and Philosophy
Their conversation begins with a shared tribute to Daniel Dennett (who passed away in April 2024). Both celebrated Dennett’s refusal to allow philosophical speculation to stray too far from the empirical data of science. As he famously put it:
“There is no such thing as philosophy-free science; there is only science whose philosophical baggage is taken on board without examination.”
I agree! I always appreciated Dennett’s books for the challenge they represented, though, like Deacon, what I found most generative was the way Dennett’s clarity helped me refine my own alternative positions. Philosophy and science should always remain in intimate dialogue with one another, but I may be more cautious than Levin when he affirms an “unflinching fusion of the philosophy and the data.” This is because, as Whitehead reminds us, “The history of thought shows that false interpretations of observed facts enter into the records of their observation. Thus both theory, and received notions as to fact, are in doubt” (PR 9). In other words, the data do not simply speak for themselves, as if the right philosophical theory could be read off of empirical observations. Determining what counts as relevant data depends upon the theoretical lens being used. It is thus important to avoid any simple collapse of theory and observation, lest we mistake the currently paradigmatic hypothesis and its way of interpreting the bloom and buzz of concrete experience for the final word on the way the world is. I know Levin agrees, and I don’t mean to nitpick about a quick turn of phrase, but this is an important point about how philosophy and science ought to collaborate. Dennett, while adept at integrating his favored paradigms within neuroscience, psychology, and evolutionary biology, was not in my opinion particularly good at integrating the findings of the special sciences with the broader data of philosophy, which includes spiritual, aesthetic, and existential experience. Deacon seems more attuned to these broader questions than Dennett was.
The Place of Purpose in Nature
In exploring the origin of purpose, Deacon realized that even bacteria are already too complex to treat as starting points for answering this question. In Incomplete Nature, he constructed a more idealized system, trying to show how form, memory, and repair could arise from the tangled flows of thermodynamics. Drawing on Schrödinger’s intuition that life is a special type of informational system, Deacon sought to understand how thermodynamic processes could become about something. How do blind energy flows become capable not just of information processing, but of interpretation, evaluation, and ultimately sentience? I remain skeptical that you can get meaning from mere matter, no matter how loopy the matter gets, but Deacon’s attempt is one of the most sophisticated I’ve come across. He rejects the idea that life can be adequately understood as a list of traits and instead tries to understand living purposes as emergent from the combinatorial novelty of self-maintaining thermodynamic loops that become about themselves. His approach is inspired by C. S. Peirce’s semiotic ontology, where “representation” is understood as a triadic relationship between signs, objects, and interpretants, rather than the usual computationalist understanding of representation, which remains binary (symbol-referent), leaving no room for an interpretant (or subjective prehension in Whitehead’s sense). For more on Peirce’s triadic mode of thought, here’s my introduction.

But Deacon resists what he dismisses as “homuncular” explanations of subjectivity in nature, like Whitehead’s panexperientialism. He feels these approaches just assume what needs to be explained. For my part, I think a satisfying explanation is always going to be relative to the metaphysical background assumptions we are making (eg, whether any such thing as “matter” can be thought to exist independent of its polar complement, “mind”). Aim, value, subjectivity are, to my mind, not explicable as products of evolution, but are rather intrinsic features of reality, the necessary conditions of intelligibility for an evolutionary cosmology.
Levin approaches the problem of purpose in nature differently. He is skeptical of dichotomies like living vs. non-living, or purposeless vs. purposeful. Instead, he looks for general models of transformation and scaling. In his work on polycomputing, he shows how biological systems operate simultaneously across multiple domains—morphological, electrical, metabolic—solving problems at many levels. Levin emphasizes that any system capable of error is by implication already normatively driven. In his research, even models of simple gene regulatory networks can learn and adapt, and thus display cognitive behavior. From Levin’s perspective, “goals go all the way down”: purpose is not a sudden emergence but a gradual scaling up from simpler adaptive behaviors.
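To make the claim about learning in simple networks a bit more concrete, here is a toy sketch in Python (my own illustration, not one of Levin’s actual gene regulatory network models) of how even a single adjustable connection can display associative conditioning: pairing a previously neutral input with a triggering input strengthens the connection until the neutral input alone evokes the response. The function name and parameter values are hypothetical choices made for the sketch.

```python
# Toy associative conditioning: not Levin's GRN models, just an illustration
# of how a minimal adaptive system can "learn" a pairing.

def condition(trials, w_cs=0.0, w_us=1.0, lr=0.2, threshold=0.5):
    """Run pairing trials of (conditioned, unconditioned) inputs; return the learned CS weight."""
    for cs, us in trials:
        response = w_cs * cs + w_us * us      # simple weighted activation
        if response > threshold and cs:       # Hebbian-style update: strengthen the CS link
            w_cs += lr * (1.0 - w_cs)         # whenever it co-occurs with a response
    return w_cs

w_cs = condition([(1, 1)] * 10)               # pair the two inputs ten times
print(round(w_cs, 2))                         # the learned weight approaches 1.0
print(w_cs * 1 > 0.5)                         # the neutral input alone now crosses threshold
```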
Deacon here introduces the term “normative chemistry”—meaning chemistry where some outcomes are better or worse for a self-organizing system. Levin likens this to (bio)chemical processes that can be trained via reward and punishment. But Deacon also asks, very importantly: who is the beneficiary? Levin notes that even a paramecium can be trained, but Deacon emphasizes that normativity only becomes meaningful when there is a beneficiary, someone for whom things can go better or worse.
Deacon then makes the very provocative claim, stemming from his study of Peirce, that in living systems, semiotic activity generates and maintains its own physicality. Meaning, in other words, is not some epiphenomenal ghost floating atop material processes: instead, at least in living systems, meaning makes matter. He notes that when researchers simultaneously measure EEG signals (neural electrical activity) and fMRI signals (blood flow and metabolic activity) during cognitive tasks, they find something surprising: metabolic changes often occur before there is measurable electrical activity in some brain regions. This is a challenge to the conventional view in computational neuroscience that neurons first generate action potentials that transmit information, and that metabolism follows afterwards as the energetic support that replenishes what was spent. From Deacon’s point of view, metabolism is itself already semiotically active, playing a sign-generating and -interpreting role. It does not merely support information processing at the neural level, but participates in and even anticipates it. Meaning (semiotic differentiation) and matter (energetic flows) are not two separate layers; they are co-constructed. These findings present a severe challenge to computational models of the brain that treat neurons like logic gates.
Both Deacon and Levin affirm that absence can be causal—that possibilities shape physical outcomes. Levin discusses how cognitive systems can offload computation to their surroundings, letting environmental structures inform problem solving. Evolution ingresses form by exploiting “free lunches”: lawful relationships found in geometry, arithmetic, and computation structure biological development without needing to be genetically encoded or invented via natural selection.
Deacon expands this into his “lazy gene” theory (which I take to be a playful inversion of Dawkins’ “selfish gene”): evolution often exploits existing mathematical sources of “order for free” (Kauffman) rather than painstakingly inventing every new trait. Deacon gives the well-known example of the Fibonacci spiral found in plants, which is not computed by genes (“the genes aren’t doing the math”)—the geometry is already there, and biological systems simply tap into it.
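As a small illustration of this point (mine, not Deacon’s), here is a sketch in Python of the standard phyllotaxis construction: the spiral packing seen in sunflower heads and pinecones falls out of iterating a single angular constant, the golden angle, with no trait-by-trait computation required. The function name is my own choice for the sketch.

```python
import math

GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))   # roughly 137.5 degrees

def phyllotaxis(n_points):
    """Place points by rotating each new one by the golden angle and nudging it outward."""
    points = []
    for k in range(n_points):
        theta = k * GOLDEN_ANGLE              # the same fixed rotation at every step
        r = math.sqrt(k)                      # the radius grows to keep the packing even
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

seeds = phyllotaxis(200)                      # traces the familiar sunflower-like spirals
print(seeds[:3])
```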
Decompressing Evolution
Deacon next unpacked the relevance of Shannon’s two ways of measuring information in terms of channel entropy and message entropy, respectively. Channel entropy measures the capacity of a physical medium to transmit information. It is a thermodynamic concept meant to quantify how many different states a system can be in, how much variation it can carry without noise overwhelming the message. Message entropy, in contrast, measures the informational content or meaning of a message transmitted through that channel. It is not about the raw capacity, but about how that capacity is constrained and semiotically shaped. For Deacon, the critical point is that meaning arises from constraint—not just the number of possibilities, but how they are constrained into coherent patterns that refer beyond themselves. Deacon connects Shannon’s two entropies to the way biological systems handle information both ontogenetically (via development) and phylogenetically (via evolution). First, in ontogeny, the organism has to work from a highly compressed description—the genome—which it decompresses into a fully differentiated living body. The genome does not and cannot specify every detail of development. Rather, development relies on environmental affordances, physical constraints, and emergent interactions among cells (eg, Levin’s bioelectric fields). Ontogenesis is thus best understood as a semiotic decompression process. DNA is not a program but a compressed narrative awaiting the interpretive community provided by environmentally-embedded embryogenesis to find living expression.
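To make the entropy talk a bit more concrete, here is a minimal sketch of Shannon’s formula, H = -Σ p·log2(p). The uniform distribution below stands in for what Deacon calls channel entropy (raw capacity: how many states the medium can occupy), while the skewed distribution stands in for a constrained message; the drop in entropy is the signature of constraint. The “channel” and “message” labels follow Deacon’s framing as reported above; the arithmetic itself is standard Shannon.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]        # an unconstrained four-state source
constrained = [0.70, 0.20, 0.05, 0.05]    # the same four states under heavy constraint

print(entropy(uniform))       # 2.0 bits: the raw capacity of the four-state "channel"
print(entropy(constrained))   # ~1.26 bits: constraint has carved that capacity into a pattern
```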
Over evolutionary history, successful developmental trajectories—those embryological decompression processes that yield viable, adaptive organisms—are compressed into the genome. Whereas development acts as the decompressor, evolution acts as the compressor, distilling the memory of successful semiotic transactions (developmental pathways that “worked”) into heritable information. Thus, evolution is not primarily about selection of random genetic mutations. It is primarily about the accumulation and compression of developmental wisdom, etching successful semiosis into durable nucleic acid form.
In short, for Deacon, phylogenetic evolution follows from the compression of successful ontogenetic development. Evolution compresses and transmits the semiotic solutions discovered by generations of decompression experiments. Thus, compression comes second: life begins not with some RNA-world scenario, but with dynamic decompression processes generative of semiotic self-organization before stable genetic templating emerges.
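A loose analogy (my own, not Deacon’s) for this compression/decompression relation is a Lindenmayer system: a few rewrite rules, iterated in context, “decompress” into a structure vastly larger than anything written in the rules themselves. The axiom and rules below are hypothetical choices in the style of a branching L-system.

```python
# A loose analogy for developmental decompression (not Deacon's model):
# a tiny rule set expands, by iterative rewriting, into a structure far
# larger than the rules themselves.
AXIOM = "X"
RULES = {"X": "F[+X][-X]FX", "F": "FF"}   # hypothetical branching rules

def decompress(axiom, rules, generations):
    """Iteratively rewrite the axiom according to the rules."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

body = decompress(AXIOM, RULES, 5)
print(len(AXIOM), "->", len(body))        # one symbol expands into well over a thousand
```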
Memory Materialized
When trying to understand memory, Levin claims that we have access only to the material traces or engrams of the past as they are inscribed in the present, not the past itself. But I think this neglects the deeper ontology of possibility. Following Whitehead and Bergson, the entire past is still present in potentia. It does not need to be stored materially; it is virtually active, woven into the present moment by virtue of its very absence. Memory is not material storage of traces, whether in genes or synaptic pathways, but our participation in a living continuum of possibility.
Levin’s account remains too tied to material engrams. Deacon edges closer to a process view, but even he could push further toward seeing memory as an intrinsic feature of becoming, not a passive record. In Matter and Memory (1896), Bergson lays out a post-materialist alternative to the still residually reductionist readings of Levin and Deacon. The engram approach spatializes time, in Bergson’s sense, flattening memory into a sequence of effects in extended matter. Rather than seeing the brain as a local storage device, Bergson inverts the common sense of materialist reductionism: the brain becomes a device for forgetting, for filtering out the plenum of the past. He takes the phenomenology of memory seriously, recognizing that consciousness is not frozen in an instantaneous present moment but lives in a durational field thick with past potentials.
I quote Bergson at length from the first pages of Creative Evolution (1907):
“…our duration is not merely one instant replacing another; if it were, there would never be anything but the present–no prolonging of the past into the actual, no evolution, no concrete duration. Duration is the continuous progress of the past which gnaws into the future and which swells as it advances. And as the past grows without ceasing, so also there is no limit to its preservation. Memory, as we have tried to prove, is not a faculty of putting away recollections in a drawer, or of inscribing them in a register. There is no register, no drawer; there is not even, properly speaking, a faculty, for a faculty works intermittently, when it will or when it can, whilst the piling up of the past upon the past goes on without relaxation. In reality, the past is preserved by itself, automatically. In its entirety, probably, it follows us at every instant; all that we have felt, thought and willed from our earliest infancy is there, leaning over the present which is about to join it, pressing against the portals of consciousness that would fain leave it outside. The cerebral mechanism is arranged just so as to drive back into the unconscious almost the whole of this past, and to admit beyond the threshold only that which can cast light on the present situation or further the action now being prepared–in short, only that which can give useful work. At the most, a few superfluous recollections may succeed in smuggling themselves through the half-open door. These memories, messengers from the unconscious, remind us of what we are dragging behind us unawares. But, even though we may have no distinct idea of it, we feel vaguely that our past remains present to us. What are we, in fact, what is our character, if not the condensation of the history that we have lived from our birth–nay, even before our birth, since we bring with us prenatal dispositions? Doubtless we think with only a small part of our past, but it is with our entire past, including the original bent of our soul, that we desire, will and act. Our past, then, as a whole, is made manifest to us in its impulse; it is felt in the form of tendency, although a small part of it only is known in the form of idea.
From this survival of the past it follows that consciousness cannot go through the same state twice. The circumstances may still be the same, but they will act no longer on the same person, since they find him at a new moment of his history. Our personality, which is being built up each instant with its accumulated experience, changes without ceasing. By changing, it prevents any state, although superficially identical with another, from ever repeating it in its very depth. That is why our duration is irreversible. We could not live over again a single moment, for we should have to begin by effacing the memory of all that had followed. Even could we erase this memory from our intellect, we could not from our will.”

Bias from the Bottom-up
Deacon made the important observation that hypotheses always begin with bias. There is no pure, bias-free reasoning. Every act of inquiry arises out of pre-existing patterns of expectation, or interpretive habits. Bayesian reasoning formalizes this. Bias is not a flaw in cognition; it is a necessary condition for cognition.
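Here is a minimal sketch of that Bayesian point (the numbers are arbitrary, chosen only for illustration): updating on evidence is only defined relative to a prior, so two inquirers who see the same data but start from different biases arrive at different posteriors.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after observing the evidence."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# Same evidence, different starting biases, different conclusions.
print(bayes_update(prior=0.5, p_evidence_if_true=0.8, p_evidence_if_false=0.3))  # ~0.73
print(bayes_update(prior=0.1, p_evidence_if_true=0.8, p_evidence_if_false=0.3))  # ~0.23
```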
But this raises a deeper metaphysical question that Deacon did not fully pursue, though he gestured toward it: are these biases merely the products of historical accidents—fossils of contingency frozen into our nervous systems and societies—or are they signs of something deeper? Might they also reveal the presence of eternal objects and primordial aims, cosmic tendencies latent in the rhythm of becoming? I do not doubt that biases update in the course of development and evolution. But can these biological processes explain the origin of bias (or “graduated intensive relevance” in Whitehead’s sense) as such? Might biology be building on something more ontologically basic?
Levin also hints at this deeper possibility. His work shows that even in very simple biological networks, such as gene regulatory circuits or electrical fields in multicellular organisms, there is an inherent tendency toward coherence, toward repair, toward the generation of ordered wholes. Under conditions of perturbation unlikely to have been experienced by genetic ancestors, living systems do not merely fall apart: they strive to re-interpret and re-negotiate, often finding new paths to adaptivity. This does not seem accidental. It suggests that the universe is not simply a random walk through chemical space, but a field of potentiae curved by teloi—a landscape always already enfolded with possible attractors toward complexity and valuation.
If this is true, then bias is not simply historical inertia. Bias, in this deeper sense, would be the trace of the eternal within the temporal—a local expression of and iteration upon cosmic aims at work since the beginning of everything. Hypotheses would then be acts of participation in a universe that is already striving, from the bottom up, to realize higher orders of meaning.
Semiotic Negotiations
The semiotic dynamism of life becomes vividly clear in considering Levin’s description of his bioelectric manipulation experiments. When researchers in his lab alter the bioelectric gradients of developing embryos, they can induce radical transformations: frogs can be made to grow fully-formed eyes on their bellies. But these transformations are not inevitable. Sometimes, the surrounding cells resist. They seem to engage in a kind of debate, contesting whether the proposed new structure should be accepted or rejected.
In some cases, the manipulated cells succeed in recruiting their neighbors, and the ectopic eye forms. In others, the natural plan reasserts itself, the anomalous researcher-induced pattern is erased, and normal development proceeds. These outcomes are not random failures. They are signs of an ongoing, collective negotiation. The cells are not passive receivers of genetic or electrical instructions; they are active interpreters, weighing multiple semiotic cues, collaborating or resisting depending on their shared assessments.
Development is obviously not a linear or mechanical engineering project, where docile proteins and cells slavishly follow a genetic program. Rather, it is a collective semiotic negotiation, an emergent outcome of interpretive processes operative at multiple scales. The organism is not constructed like a building from a blueprint. It is the expression of an ongoing cellular conversation.
Deacon expressed concern that Levin’s emphasis on pre-patterned bioelectric fields risked sliding into a kind of preformationism—the idea that form is already fully determined from the start and merely unfolds deterministically. He even drew a passing critical comparison to Rupert Sheldrake’s speculative (but testable!) notion of morphic fields: subtle non-local fields of form that influence physiology and behavior.

I’m not sure this is a fair criticism of Levin or of Sheldrake, even if I agree with Deacon that we must beware the temptation to a one-sided preformationism. For my part, I think we need to find a middle path between what used to be called epigenesis and preformation, since both positions address crucial aspects of the puzzle of living organization.
Levin responded with a somewhat subtler portrait of his approach. The bioelectric maps revealed in his experiments are not rigid blueprints. They are dynamic set-points, attractor states toward which development tends to move, but which remain flexible, revisable, and contestable. These set-points bias development toward certain outcomes without determining them completely. Bioelectric fields act as semiotic constraints: they shape the landscape of developmental possibilities without eliminating the organism’s interpretive freedom. The embryo is not a miniature adult awaiting inflation. It is an interpreter, navigating a landscape of affordances, holding the tension between stability and transformation.
…
There is much food for thought here! I continue to digest it all in an effort to articulate a philosophy of nature that is responsive to cutting-edge scientific findings without losing sight of the metaphysical presuppositions of science itself. My guiding thread continues to be the pursuit of a general theory of evolution that does justice to the fact that self-conscious agents exist to engage in such an inquiry. Said otherwise, I am after a participatory onto-epistemic account of the evolutionary process. We cannot explain evolution as a primarily material process, treating our own minds as an afterthought. We are part—or, perhaps, the microcosmic whole—of what we are trying to understand.

What do you think?