Thursday, February 13, 2025

Animals as Gardeners

Yesterday, in my General Chemistry class, we discussed using bond energies to calculate the change in enthalpy of a chemical reaction. Breaking bonds is endothermic and requires energy input into the system. Conversely, making bonds is exothermic and releases energy from the system. Chemical reactions almost always involve both the making and breaking of bonds. Therefore, whether the overall reaction is endothermic or exothermic depends on whether the bonds being broken are stronger or weaker than the bonds being formed.
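For concreteness, here is the bookkeeping I have in mind, as a minimal worked example using typical tabulated average bond energies (approximate values that vary a little from table to table):

```latex
% Bond-energy estimate of a reaction enthalpy
\Delta H_{\text{rxn}} \approx \sum D(\text{bonds broken}) - \sum D(\text{bonds formed})

% Example: H2(g) + Cl2(g) -> 2 HCl(g), with approximate average bond energies
% D(H-H) ~ 436, D(Cl-Cl) ~ 243, D(H-Cl) ~ 431 kJ/mol
\Delta H_{\text{rxn}} \approx (436 + 243) - 2(431) \approx -183~\text{kJ/mol}
\quad \text{(exothermic: the bonds formed are stronger than the bonds broken)}
```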

 

One example I showed was ATP hydrolysis. The reaction is marginally exothermic. Even though the same types of bonds are being made and broken, the bond energies are slightly different in different chemical structures. That’s the beauty of chemistry – a subtle interplay between structure and energetics! The purpose of this example was to counter the conceptually wrong mind-worm some students acquire, which leads them to tell me that “breaking bonds releases energy”. This usually comes from a simplified misunderstanding of something they hear in a biology class.
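To make the bookkeeping explicit (my own schematic, not the class calculation): in the hydrolysis ATP + H2O → ADP + Pi, one P–O (phosphoanhydride) bond and one O–H bond of water are broken, while one P–O bond and one O–H bond are formed in the products. The same bond types appear on both sides, so the net enthalpy change is small; it comes out slightly negative because the same bond types are a little stronger in the product structures.

```latex
% ATP hydrolysis (schematic bond bookkeeping; protonation states omitted)
\Delta H \approx \big[ D(\text{P--O})_{\text{anhydride}} + D(\text{O--H})_{\text{water}} \big]
             - \big[ D(\text{P--O})_{\text{phosphate}} + D(\text{O--H})_{\text{product}} \big]
\;\approx\; \text{small and negative}
```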

 

Towards the end of class, I couldn’t resist connecting bond energies to the origin of carbon-based life on Earth. The students had previously worked a problem on the strength of the O–H bond in water and the corresponding wavelength of a photon that matched the bond energy. Referring to the solar spectrum and ultraviolet light, I speculated about how adenine may have been important as a photon absorber prior to its role in the universal energy transduction of living systems. I mused about water-splitting, the invention of photosynthesis, and suitable molecular pigments (conjugated pi-systems!) that may have arisen through chemical evolution. I didn’t say anything about such pigments dissipating absorbed light as thermal energy and seemingly “wasting” it.
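The in-class estimate runs roughly as follows (taking the O–H bond energy to be about 460 kJ/mol, an approximate textbook value); the answer lands in the UV-C, right where adenine happens to absorb (near 260 nm):

```latex
% Photon wavelength matching an O-H bond energy of ~460 kJ/mol
E_{\text{per bond}} \approx \frac{4.6\times10^{5}~\text{J/mol}}{6.022\times10^{23}~\text{mol}^{-1}} \approx 7.6\times10^{-19}~\text{J}

\lambda = \frac{hc}{E} \approx \frac{(6.626\times10^{-34}~\text{J s})(3.00\times10^{8}~\text{m/s})}{7.6\times10^{-19}~\text{J}}
\approx 2.6\times10^{-7}~\text{m} \approx 260~\text{nm} \quad \text{(UV-C)}
```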

 

This brings us to today’s question: Why do animals exist on Planet Earth?

 

This morning, I went down a rabbit-hole reading several articles by Karo Michaelian. It all started with “The Pigment World: Life’s Origins as Photo-Dissipating Pigments” (Life 2024, 14, 912). He makes the provocative claim that animals essentially “provide a specialized gardening service to the plants and cyanobacteria, catalyzing their absorption and dissipation of sunlight in the presence of water, promoting photon dissipation, the water cycle, and entropy production.” That’s a mouthful. We’ll break it down momentarily, but essentially the claim is that animals help to move molecules around, spreading them far and wide so that more and more photons can be absorbed and that energy dissipated. It’s the second law of thermodynamics in action at the level of the biosphere. And what’s the stuff we’re moving around? Pigment molecules!

 

It’s an interesting argument. He begins with the observation that many leaves absorb photons across the ultraviolet and visible range before their absorption drops off significantly at the infrared boundary. Leaves look green to us because red and blue light are absorbed more than green. Photosynthesis, however, only makes use of a narrow band of red light, yet leaves strongly absorb in the ultraviolet and in the (blue) visible range. Plants evolved to absorb photons at wavelengths that are hardly absorbed by water, and apparently “fill even small photon niches left by water over all incident wavelengths”. That’s rather curious. Also, the albedo of life-rich ecosystems (jungles and forests) is considerably lower than that of sandy deserts, which reflect much more of the incident light. Additionally, “the albedo of water bodies is also reduced by a concentrated surface microlayer of cyanobacteria”. What happens to this absorbed energy? It is converted to heat – essentially chopping up a small number of high-energy photons into a large number of low-energy photons. It’s the second law of thermodynamics: energy is being dissipated and entropy increases mightily!
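A crude estimate of just how mightily (my own back-of-the-envelope, ignoring the finer points of radiation thermodynamics): take a green photon absorbed from sunlight (radiated at roughly 5800 K) whose energy is eventually re-emitted as thermal radiation at roughly 300 K.

```latex
% Entropy produced per 500 nm photon absorbed and degraded to ~300 K heat (rough estimate)
E_{500~\text{nm}} = \frac{hc}{\lambda} \approx 4.0\times10^{-19}~\text{J}

\Delta S \approx E\left(\frac{1}{T_{\text{Earth}}} - \frac{1}{T_{\text{Sun}}}\right)
        \approx 4.0\times10^{-19}\left(\frac{1}{300} - \frac{1}{5800}\right)
        \approx 1.3\times10^{-21}~\text{J/K} \approx 90\,k_B
% i.e., each absorbed visible photon ends up as dozens of low-energy infrared photons.
```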

 

The evolution of these absorbing pigments in plants may have been primarily to increase transpiration. Photosynthesis is a secondary player in this regard. That’s a shocker to me. I’ve always considered the oxygenation of the atmosphere via photosynthesis to be a driver for the complexity of life – which it is – but I hadn’t thought of photosynthesis as a byproduct of a process whose main effect is to increase heat dissipation via transpiration. In the first week of class, I told students about water’s high heat capacity and its suitability for calorimetry. In a couple of weeks, after we get through entropy, we’ll be looking at the enthalpy and entropy of vaporization as liquid water turns into gas. Water is an excellent dissipator that helps drive the second law of thermodynamics, but it does so more effectively when there’s more of it in Earth’s water cycle. Transpiration puts more water into the cycle!
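The numbers we will get to in class make the point (standard approximate values for water at its normal boiling point):

```latex
% Vaporization of water at the normal boiling point (approximate textbook values)
\Delta H_{\text{vap}} \approx 40.7~\text{kJ/mol}, \qquad T_b = 373~\text{K}

\Delta S_{\text{vap}} = \frac{\Delta H_{\text{vap}}}{T_b} \approx \frac{40700~\text{J/mol}}{373~\text{K}}
\approx 109~\text{J mol}^{-1}\,\text{K}^{-1}
% a hefty entropy gain for every mole of water transpired into the vapor phase
```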

 

What do animals do? They help disperse the pollen and seeds of plants. They help bring nutrients to plants through poo or death. As heterotrophs disperse organic matter, they disperse the pigment molecules. More opportunities for absorbing photons. More dissipation into high-entropy heat. We animals are the gardeners, helping the second law roll along. Humans in particular have come up with alternative ways to tap photons with inorganic materials, but that’s a recent phenomenon. The organic pigments have been at it far longer than we have. All this makes me wonder if the reason photosynthesis is so inefficient is that life isn’t optimizing for capturing energy from photons in that way; rather, it is optimizing for seemingly wasteful heat dissipation. The second law rules!

 

The tropics are rife with life. Is it because they receive the most photons? Why are there so many insects there? They’re a key part of the gardening crew. Why are there larger animals farther from the equator? That part of the gardening crew is mostly about seed dispersal, and larger creatures roam far and wide to stay alive in a less energy-rich environment. Michaelian argues that his proposal cuts to the heart of the source of evolution – the second law, a physical imperative. It cuts through the Gordian knot of biological relativity. It gets around the problem of extending the ecosystem to include more and more of its environment until it becomes an organism of sample size one, for which Darwinian evolution becomes nonsense. It’s an intriguing argument.

 

A linchpin of the argument is the chemical evolution of pigment molecules that absorb well in the UV-C range, eventually transforming into the “broadband pigment world of today”. A specific detailed example looks at the oligomerization of HCN into adenine (C5H5N5) and relies on physics-based arguments about the dissipative process after a UV-C photon is absorbed. In particular, it hinges on the photoexcited pigment rapidly reaching the conical intersection that shunts it towards a particular product. There is some hand-waving about how this opens the way to producing a broader spectrum of molecules capable of absorbing a larger range of wavelengths in the UV-vis range. Analogies are made to how thermal convection cells arise as forces come into “balance”. Stationary states, autocatalytic cycles, and other such features are invoked. And finally, once the ozone layer built up, access to UV-C was much reduced, and therefore we’re unlikely to see life originate again from scratch on our planet. (Also, the heterotrophs will chomp up anything they can!)
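A small aside on the stoichiometry, which is part of what makes the HCN story attractive: adenine is formally just five HCN units stitched together.

```latex
5\,\mathrm{HCN} \;\longrightarrow\; \mathrm{C_5H_5N_5}\ \text{(adenine)}
```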

 

The final kicker? If UV-C is crucial to the origin and evolution of carbon-based life, then you’re unlikely to see life evolve in systems powered by M-type red dwarf stars. That’s not good news for astrobiologists, who have become increasingly interested in such systems as suitable cradles for life. UV-C, primarily thought of as destroyer, now also takes on the role of creator – Brahma and Shiva, two-in-one, with Vishnu in between as preserver while the photon flux from our sun lasts.

 

The bottom line of how the second law and chemistry intersect? In Michaelian’s words (from a different article): “All material will dissipatively structure, depending on the strength of the atomic bonding and appropriate wavelength region.” Funny how a first-day class exercise connecting bond energies to photon energies might turn out to be the foundation of everything we see in our solar system, be it on our living planet or our seemingly dead neighbors. I haven’t yet wrapped my head around all of this. In the meantime, I’ll just keep on being a gardener and cultivator of my students’ understanding of chemistry and its wonders. And I won’t look at a plant the same way again!


Monday, January 27, 2025

Arc of Invention

Who invented the airplane, the steam engine, and the printing press? I would have answered: the Wright brothers, Watt, and Gutenberg. Now I’m not so sure after reading How Invention Begins by John Lienhard. His approach is to look much more broadly at the ecosystem surrounding such technologies, both backwards and forwards in time. And while the aforementioned names are the most famous, Lienhard brings to light many other names and their contributions to the process. Nor is there one type of airplane, steam engine, or printing press. Rather, there’s a rich variety of such technologies, even if the famous names are associated with iconic versions of each.

 


Why did these inventions come about, and were they inevitable? Lienhard wants us to figure out the broader motivations: the desire for flight, the desire for energy sources to do work, and the desire to get your ideas out! And as incremental improvements built up along the road to each technology, there was an inevitability that a flying machine, an engine, and a method for mass-producing reading material would show up. Perhaps not the iconic Wright, Watt, or Gutenberg version we encounter in museums or history books, but some version would have been invented, then widely used, and then possibly surpassed.

 

Today’s blog post focuses on one chapter in the book, “From Steam Engine to Thermodynamics”. It’s particularly relevant for my G-Chem and P-Chem classes this semester, where thermodynamics is a sizable chunk of the course material. Here’s the broad sweep before we get to the nineteenth century. Humans have long known that boiling water turns it into steam. It’s obvious that the gaseous state can be powerful when you encounter strong winds. But how would you harness its power? As far back as antiquity, there was one Hero of Alexandria who made mini steam-powered turbines. The Egyptian alchemist Zosimos writes of one Maria the Jewess who invented the double boiler, figured out how to make silver sulfide for metalwork, and essentially founded a school of chemistry. By medieval times, windmills had shown up, and in the seventeenth century the behavior of gases was being investigated and vacuum pumps were introduced.

 

I’m skipping over the details of the steam engine, in which Watt played an important role amidst a constellation of many others; Lienhard, as a mechanical engineer, discusses this in detail. He also does a great job condensing and lucidly explaining how the ideas and terminology of phlogiston and caloric came about, even though these theories have now been superseded by the atomic theory. Joseph Black (a contemporary and friend of James Watt), William Cleghorn, Joseph Priestley, Antoine Lavoisier, and Carl Scheele all show up as they puzzled over the nature of heat (which still confuses us today), and it took a while before the mechanical theory of thermal energy began to take precedence. Even today, we still talk about heat as if it were a fluid, a holdover from caloric theory.

 

In the nineteenth century, while atomic theory was still fighting for recognition, another constellation of folks built the foundations of classical thermodynamics – no atoms needed! Carnot, Mayer, Joule, Clausius, and Tyndall show up for their turn in the spotlight. While I had heard all these names and learned the streamlined version of the history, I appreciated Lienhard’s wading into what was confusing at the time. Carnot accepted caloric theory even as he formulated his now famous “ideal” engine model. Mayer, who was trained as a doctor, made the observation that venous blood was redder when he was in Indonesia (then called the Dutch East Indies). Turns out it’s because in the tropics you don’t need to “burn” as much food (there’s a little more oxygen left in the venous hemoglobin). And that got him thinking about energy transformation. Joule connects work and heat through his famous experiment, now a mainstay of textbook figures. By the time Clausius puts it all together, you have the introduction of the new term entropy, and a way to quantitatively discuss the efficiency of an engine.
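The payoff is compact enough to state here. For an idealized (Carnot) engine running between a hot and a cold reservoir, the efficiency depends only on the two temperatures; the numbers below are illustrative, not the specs of any historical engine:

```latex
% Carnot (ideal) efficiency between reservoirs at T_h and T_c
\eta_{\text{Carnot}} = 1 - \frac{T_c}{T_h}

% e.g., steam at 373 K rejecting heat to surroundings at 298 K
\eta \approx 1 - \frac{298}{373} \approx 0.20 \quad \text{(an upper bound no real engine attains)}
```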

 

I always find it a balancing act when teaching thermodynamics. Much of the language and terminology we have inherited isn’t intuitive, and students easily get confused. The equations we use are built on models from the nineteenth century, when calorimetry was an important technique for trying to figure out energy changes in chemical reactions. Chemists use the word enthalpy to describe these changes, further confusing students when it is used interchangeably with heat, even when no temperature change is taking place. Knowing the models and their limitations helps us think about thermodynamics, but they’re a little strange and were defined for an age now past. In P-Chem, my treatment of thermodynamics is heavily statistical, and I try to show students how this leads to what they first encountered in G-Chem. I try to include some of the history for context, but I’m not sure the students quite appreciate it. I certainly didn’t when I was an undergraduate.

 

One thing in Lienhard’s book that I’m still pondering is that we can trace the broad arc of invention in hindsight. But it’s very hard to see where something is headed when you’re in the midst of what might be a technological revolution. Right now the buzzword is A.I., most familiarly in the form of large language models that guzzle energy resources. How Invention Begins was published almost twenty years ago, as the Internet was becoming ascendant. After discussing the printing press, the explosion of literature, and then the opening up of tertiary education opportunities with the G.I. Bill, Lienhard wonders where education is headed in the age of the Internet. We haven’t quite figured that out, and now we’re grappling with A.I., with numerous pundits championing it and others decrying it. There is an arc, and we should ask the broader question of what humankind is aspiring to, but I’m not sure it’s a thirst for knowledge per se, at least in the way an educator like me envisions it.

 

Will we always crave novelty? I think we’re wired to do so. Do we want labor-saving devices? Yes, most likely. But we also want nebulous things like meaning and fulfilment in life, and it’s less clear how the technological arc will lead us in that direction. If we’re not careful, we can end up becoming slaves to a small oligarchy satisfying its desires for novelty, labor-saving, and fulfilment, overriding, at least for a time, what the majority would like. But within such a complex system, with nonlinearities that we cannot easily predict, at some point a phase change may take place. A revolution. An evolution. It will likely be messy and painful because of globalism and interconnectivity.

Wednesday, January 22, 2025

The Predictive Brain

What is our brain for? Making predictions. Why? Because that’s one way for a living organism to survive and possibly thrive in an environment that’s constantly changing. In the words of Andy Clark, author of Surfing Uncertainty, the brain is “an action-oriented engagement machine, adept at finding efficient embodied solutions that make the most of body and world.” I’m glad Clark provided that pithy summary at the end of his book. Because I’m not a neuroscientist, it took me a while to work my way through his argument. But I’m glad I did, because it made me think a lot about how humans learn and about my origin-of-life research; both are central topics in my professional life.

 


I haven’t fully digested his argument, which essentially uses a model he calls Predictive Processing (PP) to explain what the brain does and why. Many open questions remain, and Clark acknowledges early on that the specific details of his model may turn out to be wrong, but he maintains the overarching idea: top-down predictive processes, coupled with bottom-up error-signaling processes, work together in concert to home in on a best guess of any encountered situation. But this isn’t an isolated brain in a jar. Embodied action is a critical part of honing the process. I will quote parts that really struck me and muse about them briefly in a meandering way. Like a surfer, perhaps. This may be fitting, given the title of his book.

 

More than a decade ago, when I first encountered the notion of System 1 and System 2 thinking (made famous by Daniel Kahneman’s Thinking, Fast and Slow), I was enamored with the idea. But over time I’ve found the separation a little too clean. Clark argues they are one multi-faceted system. We might “use some quick-and-dirty heuristic strategy to identify a context in which to use a richer one, or use intensive model-exploring strategies to identify a context in which a simpler one will do. The most efficient strategy is simply the (active) inference that minimizes overall complexity costs… system 1 and system 2… are now just convenient labels for different admixtures of resource and influence, each of which is recruited as circumstances dictate.” I have a feeling Clark is correct and that his emphasis on multi-timescale processes is a key part of how organisms do what they do. I don’t quite understand how the longer-timescale ‘higher-level’ brain processes couple to shorter-timescale sensory signals, but I suspect the dynamic coupling of such processes is the beating heart of life.

 

Thermodynamic terms show up in Clark’s treatise. There’s free energy minimization when the brain tries to be efficient and make a prediction at the lowest cost. It’s why we continue to make mistakes (and learn from them) as we encounter new situations or variations of what we thought we knew. Entropy is defined in terms of surprisal; when a prediction goes awry and we have an oops moment, this allows us to recalibrate. As a chemist, I define these terms differently, but I see a kinship between how I think about thermodynamics and what Clark is trying to do with them. However, having seen thermodynamic principles invoked in so many areas, I worry that loose usage muddies thinking and may introduce more confusion than clarity.
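For readers who want the bare definitions behind that usage (standard information-theory notation, not Clark’s own symbols): the surprisal of an outcome is the negative log of its probability, and entropy is the expected surprisal; the “free energy” being minimized is a statistical bound on surprisal, related only by analogy to the chemist’s Gibbs free energy.

```latex
% Surprisal of an outcome x, and entropy as expected surprisal
I(x) = -\log p(x), \qquad H(p) = \mathbb{E}_{x\sim p}\big[-\log p(x)\big]
```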

 

I very much appreciated Clark’s emphasis on perception and action being inseparable. He writes that they are “two sides of a single computational coin. Rooted in multilevel prediction-error minimizing routines, perception and action are locked in a complex circular causal flow… Percepts and action-recipes here co-emerge, combining motor prescriptions with continuous effort at understanding our world.” While I mostly thought of sensory signals as exteroception, I appreciated Clark’s reminder that proprioception and interoception are just as important, and our brain needs to make sense of all three incoming channels. This made me ponder how to include all three in origin-of-life modeling, and also how to structure the seeming digital-analog divide. Information is efficiently stored digitally, but the action of life is analog. I’m sure that different timescales are important here, but I haven’t figured out how these could or should be modeled.

 

In Chapter 6, “Beyond Fantasy”, Clark delves into the idea that “perception is controlled hallucination”. He thinks we should be circumspect about the notion that our brains and thoughts are akin to virtual reality. Action on our part is important to continuously update the “probabilistic prediction-driven learning… able to see past superficial noise and ambiguity in the sensory signal, revealing the shape of the distal realm itself.” But our brain has also evolved to be an efficient computing machine, and this means pruning out or ignoring a lot of the sensory stimuli to focus on what is salient. I’m reminded of the mystery of learning, especially when it comes to the nonintuitive subject of chemistry. When the aha moment occurs, it’s a gestalt experience. After that, I can’t unsee what I now know. It also blinds me as a teacher through the curse of knowledge. It reminds me that I constantly have to work hard at teaching, because things obvious to me are not obvious to students encountering them for the first time. I can provide helpful scaffolding, but how one actually learns is still mysterious. And my learning needs to be continuously updated. I’m sure I have erroneous notions I’m still passing along to students, but they’re in my blind spot – and I won’t know until I’m surprised by them.

 

Uncertainty surfaces when you least expect it. Perhaps that’s the moral of the story.

Sunday, January 19, 2025

Space Sucks

When The Expanse begins, a Mars colony already exists and is known for its military prowess. Earth’s moon is an established rendezvous port that avoids the energy-costly gravity well. And the asteroid belt is busy with space stations and ore mining. When you hear enthusiasts discuss the near future of space exploration, this is what they imagine is within reach. But is it really? In their tongue-in-cheek book A City on Mars, Kelly and Zach Weinersmith, authors of Soonish, suggest it will be very, very difficult.


 

Why? Because, to put it bluntly, outer space sucks for us humans, used as we are to the natural resources and the gravity well of Planet Earth. We evolved to live at one atmosphere of pressure, with abundant water and oxygen, surrounded by carbon-based food sources. These niceties will not be easily available on the moon, on Mars, or anywhere else in outer space. Behind the humor of their presentation is well-researched science. You would not be surprised by the basic physiological problems that outer space poses for travel and long-term survival. We simply don’t know enough, and what we do know suggests there will be numerous obstacles. There’s even a section titled “Actually, the Whole Universe Wants You Dead” on radiation poisoning outside our home planet.

 

I already knew about the bone loss and muscle atrophy that come from not living at 1 g. But I hadn’t thought about fluid shifts. Apparently, there’s something called “Puffy-Face Bird-Leg syndrome”. The fluid shift also causes vision problems. They provide the statistics: in short (< 2 week) missions, 23% of astronauts reported problems with close-up vision. In longer missions involving a stint on the International Space Station, it’s 50%. That’s ghastly. Apparently, the “best guess is that the upward fluid shift increases the pressure in your head, altering the shape of your eyeballs and the blood vessels that feed them.” Other questions asked include whether you can have sex in space and whether you can give birth in space; we’ll need to know this for long-term settlement. The answer: we don’t know. But one speculated theoretical solution involves a strange donut-shaped environment.

 

That was only a smidgen of Part I. Part II goes through the pros and cons of different possible locations. Quoting the chapter titles conveys this nicely:

·      The Moon: Great Location, Bit of a Fixer-Upper

·      Mars: Landscapes of Poison and Toxic Skies, but What an Opportunity!

·      Giant Rotating Space Wheels: Not Literally the Worst Option

·      Worse Options

All I can say is that it’s interesting to think about the ecology (or lack thereof) of the different locations. You’ve really got to worry about the environmental issues, which are all there to kill you! But if you did manage to overcome these, there’s Part III, which worries about how to create and live in a terrarium. The authors argue that we should have spent a lot more money creating many different Biosphere-like experiments here on Earth instead of blowing money on prestige-inducing projects involving launching stuff out of our gravity well. We need some of the latter, but we need much more of the former. I found this section interesting because it made me think carefully about the inputs and outputs needed to sustain a complex system. That’s what a living organism is!

 

Until I read Part IV, I had never considered the legal and sociocultural issues related to space settlements. This part was fascinating and new to me. The authors explain the current loose framework we have now and its problems. They also discuss in detail two other frameworks: the Antarctica agreements and the Law of the Sea (UNCLOS). The history and ongoing negotiation of these treaties are a lesson we should learn. These community rules kinda, sorta work, but there’s always the lurking problem of “might makes right”. At several points, the authors caution us that the scramble for outer space real estate could well lead to more conflict here on our planet’s surface. I’m inclined to agree with them.

 

What I really liked about the book, besides the humorous and engaging approach, is that the question of outer space exploration and settlement is inherently very interdisciplinary. I could see a series of interesting linked college-level courses across the humanities, social sciences, and natural sciences that could prompt students to think deeply and learn a lot! However, the bottom line still stands. Space sucks. But it’s fascinating to think about.

Monday, January 6, 2025

Stone Age Mismatch

Growing older means I’m steadily losing muscle mass. Thus, I need to exercise regularly and increase my protein intake. I’m not on any special diet, nor do I plan to adopt one, but I have read about the paleo diet and I do know a little biochemistry. I’ve heard arguments that our hunter-gatherer-adapted human bodies are mismatched to what’s available foodwise in the modern era. There is some truth to this: larger-scale evolutionary changes can take a long time; many people have had very sedentary lifestyles for maybe a couple of generations; and access to high-energy, low-fiber processed foods is unprecedented. Worse, the cheapest foods are the worst nutritionally (except for sheer calories), leading to a larger divide between socioeconomic classes.

 

But have paleo enthusiasts overstated the case? That’s the topic of Paleofantasy by Marlene Zuk, a professor of ecology, evolution and behavior. The subtitle: “What evolution really tells us about sex, diet and how we live.” Zuk makes three main points: (1) evolution is always happening, (2) the rate of evolution can change (sometimes drastically) depending on the environment and selection pressure, and (3) evolution is not goal-directed – there’s no pinnacle of perfection that organisms are marching towards. She tackles the topics of diet, exercise, sexual habits, and communal living. She talks about the effects of genes and the environment. And there’s even a tiny section about the two kinds of earwax and who might have which type.

 


I’d previously mused about what one can learn from the bone structures of hominids and their close cousins after reading Daniel Lieberman’s The Story of the Human Body. It discussed dentition and what foods hominids might have been adapted to eating. Zuk’s book complements this by examining genetics and biochemistry. Today’s post will mainly muse on the diet aspects of Paleofantasy. While there are many variations of paleo diets, the typical argument made by enthusiasts runs like this: Large-scale agriculture has only been practiced for 5000+ years. Hunter-gatherer hominids have been around for a few million. Evolution takes a long time. We’ve had millions of years to adapt to the hunter-gatherer lifestyle, so we should be eating much more meat, some fruits and nuts, and avoiding grains and dairy.

 

Zuk spends an entire chapter on consuming milk and lactose tolerance. All mammals (yes, we’re named after our milk-drinking infancy!) have enzymes that help digest and break down the sugars in milk. After weaning, we lose that ability. In humans, the enzyme lactase is active in our very early years, and then the majority of us stop producing it. But some people continue to do so, and thus they have lactase persistence (more commonly referred to as lactose tolerance). Humans have been cattle-breeding for a while, 7000 or more years. Since there are 4-5 generations per hundred years, this adds up to 280-350 generations. Could lactase persistence have evolved within this period? Quite possibly, and Zuk provides some evidence to support her argument. What is particularly interesting is that the genes contributing to lactase persistence are different in lactose-tolerant Europeans and Africans (with additional differences among different herding African groups). In some cases these are genes in our own cells; in other cases they relate to gut microbiota.

 

If you are following a paleo diet today, it is actually very difficult to eat like a Stone Age hunter-gatherer would. The meat you get is different, and you’re certainly not hunting it down fresh (nor are the animals we eat eating the same things they did in the Stone Age). The fruit available to Stone Agers was likely much less sweet, more fibrous, and required much more chewing. Even the few hunter-gatherer societies living in very remote areas today do not have the same diet as their ancestors hundreds of generations ago. Zuk also discusses evidence that grains and tubers were eaten in the Stone Age, although they aren’t the same as today’s grains and tubers. And in early hominid times, before the invention of the bow and arrow, meat was not as easy to obtain in large quantities. There was likely more gathering than hunting.

 

A Stone Age hunter-gatherer was, on average, a hungry person. Yes, there would be times of plenty and feasting, but more often than not, there would be very lean times. You might say that we are adapted to munch energy-dense foods whenever we can get them, and for many millennia this was not easy for most of the population. It’s still true of our ape cousins in the wild, unlike those in zoos, which have the same problem as us – easy access to calories and not enough exercise. Were our paleolithic ancestors well-adapted to their foods? It’s hard to say. Zuk argues that “the notion that humans got to a point in evolutionary history when their bodies were somehow in sync with the environment” is a fantasy. There’s never a match, so I suppose there was always a mismatch – but this may not be the appropriate comparison. The Stone Age diet was likely very varied – you eat what you can get, and it would change over time and place.

 

I found Zuk’s discussion of amylase, an enzyme that breaks down starch, interesting. Scientists studied the distribution of the number of copies of the amylase gene in different populations. Turns out that in populations that have been eating starch as a mainstay, “70 percent of the people had at least six copies of the amylase gene”; in populations that have not, it was less than 40 percent. The populations studied included present-day hunter-gatherer groups (both pastoralists and tuber-eating groups) and modern society groups in each category. I haven’t gotten myself tested to know how many copies I have, but I don’t recall having problems with eating rice, my mainstay carb. That being said, I have over the years moved to mixing more brown rice in with my white. There’s also an interesting short discussion about the NAT2 gene and folate availability, but I need to read up a bit more on its biochemistry.

 

Reading about differential rates of evolution, genetic drift, and how a harsher environment can accelerate evolutionary changes made me think about my origin-of-life research. Today, we know that prokaryotes and viruses can evolve and adapt quickly because their generation time is short and they can be subject to significant environmental pressure. For eukaryotes and multicellular organisms, the body system provides more of a buffer against the vagaries of the environment, so changes do not occur as quickly. But you might still see noticeable changes in as little as tens or hundreds of generations. For a proto-metabolic system, this might also be the case. Thus, I can potentially build kinetic models to explore how a proto-cellular chemistry might evolve under selection pressure. Figuring out what these kinetic parameters are, and for that matter which protometabolic systems might be self-sustaining, is my present challenge. What was the Stone Age for the first organisms? I don’t know, but I would surmise that some mismatch may always have existed because there was never really a match in the first place. In life, good enough to survive is good enough.
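As a placeholder for what I have in mind, here is a minimal sketch, not a model from my research and with made-up rate constants, of an autocatalytic proto-metabolite growing on a food supply in a fed, diluted compartment; the dilution rate stands in for selection pressure.

```python
# Minimal sketch of a proto-metabolic kinetic model: an autocatalyst X grows on
# food F in a fed, diluted compartment. Rate constants are illustrative, not measured.
from scipy.integrate import solve_ivp

k_auto = 1.0   # autocatalysis, F + X -> 2X (arbitrary units)
k_side = 0.05  # uncatalyzed loss of food, F -> waste
d = 0.2        # dilution rate: the crude "selection pressure" knob
F_in = 1.0     # food concentration in the inflow

def rates(t, y):
    F, X = y
    dF = d * (F_in - F) - k_auto * F * X - k_side * F  # food: fed in, consumed, lost
    dX = k_auto * F * X - d * X                        # autocatalyst: grows, washed out
    return [dF, dX]

sol = solve_ivp(rates, (0.0, 100.0), [1.0, 1e-3])
F_end, X_end = sol.y[:, -1]
print(f"steady state: F ~ {F_end:.3f}, X ~ {X_end:.3f}")
# X persists only if it can outgrow dilution (roughly k_auto * F > d); sweeping d or
# k_auto mimics how a harsher environment changes which chemistries survive.
```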

Thursday, January 2, 2025

Flavor: Aroma and Taste

The Xmas-to-New-Year break was excellent timing-wise for reading Flavorama. With holiday foods, tasty treats, and wonderful smells from the oven, I found myself paying attention to aromas, tastes, textures, and more. I also learned that my vocabulary for describing such sensory delights was very limited – but this can be remedied by paying careful attention and lots of practice. I haven’t done so systematically, but I believe the words of Arielle Johnson (the author), who has her PhD in analytical chemistry and did research at food labs and restaurants. She has trained many volunteers to help smell and taste experimental food products, and almost everyone can do so with practice.

 


Flavorama is subtitled “a guide to unlocking the art and science of flavor”. You don’t need to be a scientist to understand the science part of it. Johnson is an excellent writer with a sense of humor and knows how to explain chemistry to a non-expert. The hand-drawn figures are both excellent and informative. However, let’s not forget the art. While there are tried-and-true techniques that Johnson shares, she also provides examples of how one might explore the vast and subtle landscape of flavors – through some trial and error tempered with understanding the general principles of flavor creation and manipulation.

 

Did you know that your senses of taste and smell are the only ones that are sensitive enough to distinguish nanoscale individual molecules? Our olfactory apparatus and our tongue are amazing molecular-level sensors. In her introduction Johnson provides the four laws of flavor:

1.     Flavor is taste and smell.

2.     Flavor follows predictable patterns.

3.     Flavor can be concentrated, extracted and infused.

4.     Flavor can be created and transformed.

These divide her book into its four major sections. I was reminded of the laws of thermodynamics, where before the numbered laws comes a Zeroth Law that underpins everything. According to Johnson, and rightly so,

0.     Flavor is molecules.

The chemist in me rejoices!

 

The First Law distinguishes taste and smell. The two are quite different. Tastes are almost monolithic. They’re simple. They’re baseline. The taste receptors on our tongue each have a dedicated line to tell our brain when we’re tasting one of the five: sweet, salty, sour, umami, and bitter. Sugar molecules bind to the sweet receptor. Inorganic ions (sodium in particular) activate the salty receptor. Acids are detected by the sour receptor. A particular amino acid, glutamate, tickles the umami receptor. Turns out there are twenty-ish different bitter receptors, good for detecting potential poisons, but they all send their signal to the same line that tells you “bitter”.

 

Smell, on the other hand, is complex. I learned that the olfactory bulb has a direct route to the brain that bypasses the brain stem, the gate for most other neuronal signals. Unlike the tongue with its one-to-one taste-to-channel setup, smell is multi-faceted. Johnson describes it this way: “The signal the brain gets from a taste molecule is like hitting one key on a piano: it activates its own distinct indicator that’s easy to tell from the others. The signal that a smell molecule activates is more like a QR code: a two-dimensional pattern of many unique indicators… It’s impossible to pin down single smell elements in isolation… smells have multiple sensory qualities compounded together. The way we perceive them is more like seeing a face than tasting a taste… This is frustrating for simple categorization but, in its limitless variety, very fun and delicious for flavor.”

 

In chemistry, the organizing principle of the different elements is the Periodic Table. You can do something similar with flavors. That’s the Second Law. That’s where developing a vocabulary is useful: you could have broad categories such as fruity or floral or woody or vegetal. You could then drill down a little: something fruity could be citrusy. And that citrus could remind you of an orange, a lemon, or a grapefruit. For each of the five tastes, Johnson provides a set of elementary rules: what they are and what they are used for. I loved the chapter on salts: while I knew there were various types, I had mostly only cooked with simple table salt and soy sauce; but Johnson reminded me that you can get salty tastes from other salty ingredients: anchovies, cured meat, miso, and more. In the section on umami, Johnson explains how the signal sent by the receptor intensifies when glutamate and inosine monophosphate (IMP) are both present. What else did I learn? Bitter flavors can be effectively chaperoned by the other four tastes. Spicy hot and menthol cold come from touch receptors on our tongue. And I like foods that release various sulfur molecules!

 

The Third Law, on extracting and infusing flavor, focuses on techniques. It reminded me that I can draw on flavorful examples in my chemistry classes when I’m discussing solubility in water versus oil, distillation, diffusion, colligative properties, and intermolecular forces. And the Fourth Law is all about chemistry: using heat to transform raw food molecules into tasty and aromatic ones, or co-opting fermenting micro-organisms that specialize in creating alcohols, acids, and other assorted volatiles that give us rich and complex flavors. (I haven’t developed the vocabulary to describe these yet!) Reading these two sections made me think about how the chemistry I study in my research intersects with the joy of cooking. I investigate the interactions of aldehydes and amines; that’s the Maillard reaction that browns meat.

 

All this made me excited about potentially teaching a class on the Flavors of Chemistry. I can add it to the list of things that excite me, though whether I’ll actually get a chance to teach them as part of the standard curriculum in my department is another matter. One of the best motivators for me to learn more is to force myself to teach something new! And if anyone wants to delve into the subject, Flavorama is a great place to start!