Thursday, June 9, 2022

Organisms vs Machines

This post is Part 3 of my discussion of Terrence Deacon’s Incomplete Nature (here are Part 1 and Part 2). In this last section of the tome, I was hoping for some illumination, but while a few of the cobwebs in my mind have been cleared, the questions of the origin of life and the origin of consciousness remain enigmatic.

 

Let’s begin where I left off – with “Work”. Deacon first defines it as “a spontaneous change inducing a non-spontaneous change to occur”. That’s not unreasonable from a thermodynamic view. Deacon then uses the example of Brownian motion to argue that “even at equilibrium there is constant molecular collision, and thus constant work occurring at the molecular level”. But there’s a problem. You can only get macroscopic work (our colloquial view of getting something done) if microscopic work “is distributed in a very asymmetric way throughout the system”, i.e., when the system is not at (thermal) equilibrium. Otherwise, the symmetric system gives you nothing. Deacon concludes: “Microscopic work is a necessary but not sufficient condition for macroscopic work… [which] is a consequence of the distributional features of the incessant micro work, not the energy of the component collisions, which as a whole can increase, decrease or remain unchanged.”
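
To make the distributional point concrete, here is a toy simulation of my own (not from the book): a particle is kicked left or right many times, and a made-up bias parameter skews how the kicks are distributed. Only the skew, not the number or energy of the kicks, produces a net displacement.

```python
# Toy illustration (mine, not Deacon's): constant microscopic motion only adds up
# to something macroscopic when it is distributed asymmetrically across the system.
import random

random.seed(0)
N = 100_000  # number of molecular-scale "kicks" acting on one particle

def net_drift(bias: float) -> int:
    """Sum N unit kicks; `bias` skews the probability of a rightward kick."""
    return sum(1 if random.random() < 0.5 + bias else -1 for _ in range(N))

symmetric = net_drift(bias=0.0)    # equilibrium-like: kicks cancel on average
asymmetric = net_drift(bias=0.01)  # slight asymmetry: a net displacement emerges

print(f"symmetric kicks  -> net displacement ~ {symmetric}")
print(f"asymmetric kicks -> net displacement ~ {asymmetric}")
# The symmetric case hovers near zero (order sqrt(N), about 300 here),
# while the biased case drifts by roughly 2 * bias * N, about 2000 units.
```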

 

Deacon then proposes a more general definition of work: “the production of contragrade change” and “contragrade processes arise from the interaction of non-identical orthograde processes”. (From Part 2, a contragrade process is defined as going against the flow, while an orthograde process goes with the flow.) When I first introduce thermodynamics in my chemistry classes, we discuss the second law: when you put a hot object next to a cold object, heat spontaneously flows from the hot one to the cold one. Before the objects were put together, the microscopic particles in each separate system had symmetric distributions. When brought together, the combined single system has an asymmetric distribution (heat-wise) and is no longer at equilibrium – it proceeds toward a new equilibrium via heat flow until symmetry is re-achieved. This is all orthograde.
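
Here is a minimal sketch of my own of that orthograde process (not from the book; the equal heat capacities and the linear exchange rate are arbitrary assumptions):

```python
# Two objects exchange heat until they share a common temperature (a toy model).

def equilibrate(T_hot: float, T_cold: float, C_hot: float = 1.0, C_cold: float = 1.0,
                k: float = 0.05, steps: int = 200) -> tuple[float, float]:
    """Step heat from the hotter object to the colder one in proportion to the temperature gap."""
    for _ in range(steps):
        q = k * (T_hot - T_cold)  # heat flows down the temperature gradient
        T_hot -= q / C_hot
        T_cold += q / C_cold
    return T_hot, T_cold

T1, T2 = equilibrate(T_hot=373.0, T_cold=273.0)
print(f"final temperatures: {T1:.1f} K and {T2:.1f} K")
# With equal heat capacities both approach 323 K; once the asymmetry is gone,
# no further macroscopic change occurs.
```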

 

But this is all still in the realm of equilibrium thermodynamics. How do living systems keep themselves away from it? As in Part 2, Deacon invokes climbing up the dynamics ladder. Orthograde thermodynamic processes that “oppose” each other can lead to contragrade morphodynamic work. And after a morphodynamic system has been established, orthograde morphodynamic processes can lead to contragrade teleodynamic work. The establishment of appropriate constraints in all this is crucial. But such situations may be rare, which may be why no scientist has yet succeeded in creating life from non-life using these principles. It’s unclear how this works outside of the abstract. Even though Deacon provides what he calls practical examples, I have trouble seeing it, and I feel like I’m stumbling through a cobweb-filled cave. Deacon acknowledges this later in the chapter: “If you have read to this point, you have probably… struggled without success to make sense of some claim or unclear description.” I agree.

 

The next chapter is about Information. All I will say is that at least Deacon reminds the reader not to conflate Shannon entropy and Boltzmann entropy. The next several chapters discuss cybernetics, evolution, and the notion of self. I kept plowing through until I got to the following nugget in the chapter on “Sentience” that relates to the title of today’s post: organisms versus machines. I think it’s worth quoting Deacon in full here (italicized paragraphs below) – he does a good job discussing the distinction. He begins with the notion of computation.

 

Whether described in terms of machine code, neural nets, or symbol processing, computation is an idealized physical process in the sense that the thermodynamic details of the process can be treated as irrelevant. In most cases, these physical details must be kept from interfering with the state-to-state transitions being interpreted as computations. And because it is an otherwise inanimate mechanism, there must also be a steady supply of energy to keep the computational process going. Any microscopic fluctuations that might otherwise blur the distinction between different states assigned a representational value must also be kept below some critical threshold. This insulation from thermodynamic unpredictability is a fundamental design principle for all forms of mechanism, not just computing devices… we construct our [machines] in such a way that they can only assume a certain restricted number of macro states, and we use inflexible… regularized structures… and numerous thresholds for interactions, in order to ensure that incidental thermodynamic effects are minimized. In this way, only changes of state described as functional can be favored.

 

… Although living processes also must be maintained within quite narrow operating conditions, the role that thermodynamic factors play in this process is basically the inverse of its role in the design of tools and the mechanisms we use for computation. The constituents of organisms are largely malleable, only semi-regular, and are constantly changing, breaking down, and being replaced. More important, the regularity achieved… is not so much the result of using materials that intrinsically resist modification, or using component interactions that are largely insensitive to thermodynamic fluctuation, but rather due to using thermodynamic processes to generate regularities…

 

… In machines, the critical constraints are imposed extrinsically, from the top down, so to speak, to defend against the influence of lower-level thermodynamic effects. In life, the critical constraints are generated intrinsically and maintained by taking advantage of the amplification of lower-level thermodynamic effects. The teleological features of machine functions are imposed from outside, a product of human intentionality. The teleodynamic features of living processes emerge intrinsically and autonomously.

 

And to connect it to the mind and consciousness (which is Deacon’s last chapter):

 

…computation only transfers extrinsically imposed constraints from substrate to substrate, while cognition (semiosis) generates intrinsic constraints that have a capacity to propagate and self-organize.

 

Essentially, human brains are meaty, sloppy computing devices, and that sloppiness makes all the difference between mind and computation, or between organism and machine. That’s my takeaway from Deacon’s tome.
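
To make the “machine” half of that contrast concrete, here is a toy sketch of my own of the thresholding Deacon describes (the voltages, noise level, and threshold are invented for illustration): fluctuations below the threshold never change the functional state.

```python
# Toy sketch (mine, not Deacon's): thresholds keep small thermodynamic fluctuations
# from changing a machine's functional state.
import random

random.seed(1)

def read_bit(ideal_voltage: float, noise: float = 0.2, threshold: float = 2.5) -> int:
    """Interpret a noisy voltage as a discrete logical state via a fixed threshold."""
    measured = ideal_voltage + random.gauss(0.0, noise)
    return 1 if measured > threshold else 0

# A "0" stored near 0 V and a "1" stored near 5 V survive repeated noisy readings:
print([read_bit(0.0) for _ in range(10)])  # all 0s despite the fluctuations
print([read_bit(5.0) for _ in range(10)])  # all 1s despite the fluctuations
# The machine works by suppressing fluctuations below its thresholds; Deacon's point
# is that organisms instead recruit and amplify such fluctuations to generate regularities.
```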

 

Bonus track: In his epilogue, Deacon has a short section titled “the calculus of intentionality” in which he uses taking a derivative at a tangent point (to compute an instantaneous velocity) as an analogy for how telos shows up unannounced. (Integrals are also mentioned, less convincingly, in passing.) This reminded me of another such analogy.
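
For concreteness, here is a quick numerical sketch of my own of the derivative-as-a-limit idea Deacon leans on (free fall is just an arbitrary example):

```python
# The instantaneous velocity appears as the limit of ever-shorter average velocities.

def position(t: float) -> float:
    """Position of an object in free fall (g = 9.8 m/s^2), an arbitrary example."""
    return 0.5 * 9.8 * t**2

t = 2.0
for dt in (1.0, 0.1, 0.01, 0.001):
    avg_velocity = (position(t + dt) - position(t)) / dt
    print(f"dt = {dt:>6}: average velocity = {avg_velocity:.3f} m/s")
# The averages approach 9.8 * t = 19.6 m/s, a quantity that only "shows up" in the limit;
# that is the flavor of Deacon's analogy for how telos appears unannounced.
```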
