Monday, May 13, 2024

Before Boltzmann

On the last day of my P-Chem class on Statistical Thermodynamics and Kinetics, I touched on some big-picture things. But after class, while sitting in my office, I started to ponder the conceptual strangeness of entropy before Ludwig Boltzmann’s statistical interpretation.

Sadi Carnot laid the foundations for the theory of heat. At the time, heat was considered a ‘weightless fluid’ called caloric, which spontaneously flowed from hot to cold, down a temperature gradient (temperature itself being the quantity the Zeroth Law of Thermodynamics puts on firm footing). Carnot devised the heat-engine model and, by examining how heat was conserved over the engine’s ideal cyclic process, laid groundwork for what became the conservation of energy (the First Law of Thermodynamics). The caloric theory of heat turned out to be wrong, and Robert Mayer was instrumental in figuring out that energy (a hard-to-define word) is what’s conserved. Mayer actually called it the conservation of force; it was William Thomson (Lord Kelvin) who gave us the term ‘energy’. It’s not surprising that all this was confusing to the scientists of the day.

It was Rudolf Clausius who extended Carnot’s early ideas, but to do so he needed to introduce ‘entropy’ as a partner to energy. Arguments about the conservation of heat (energy) applied only to reversible processes, yet we observe many physical and chemical processes that seem to go in one direction only. Entropy is the crucial quantity here: when a process proceeds irreversibly, entropy increases, at least in an isolated system (the Second Law of Thermodynamics); how fast the process proceeds, though, is governed by kinetics. Clausius connected entropy to heat and temperature mathematically, but he could not take the step that Boltzmann did, partly because in the mid-nineteenth century many scientists did not believe in the existence of tiny molecules that no one could see.
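
For concreteness, here is the connection Clausius made, written in modern notation (my paraphrase in today’s symbols, not Clausius’s own): the entropy change is the reversibly exchanged heat divided by the absolute temperature, and in an isolated system entropy can never decrease.

\[
  dS = \frac{\delta q_{\mathrm{rev}}}{T}
  \qquad\text{and, for an isolated system,}\qquad
  \Delta S \ge 0
\]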

Why the Second Law is obeyed was rather mysterious. It was as if some hidden variable in nature commanded that entropy must increase for anything to proceed over time rather than get stuck at equilibrium. Not that scientists of the day shied away from strange abstract ideas, a weightless fluid, the luminiferous ether of space. But invoking entropy felt like invoking magic: something must be transferring, something must be facilitating movement, yet no one had any idea what that something was, or whether it was a thing at all. There was a mathematical framework for calculating entropy, but no one knew what entropy actually was. We have a better idea now, but like any large cross-cutting concept, it resists a succinct definition, and we have to rely on multiple examples to illustrate it. One idea is that entropy says something about the quality of energy.
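
One way to make ‘quality of energy’ concrete (my illustration, borrowing the Carnot engine from above) is the Carnot efficiency: the maximum fraction of heat an engine can convert to work depends only on the two temperatures it operates between.

\[
  \eta_{\mathrm{Carnot}} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}
\]

Heat delivered at a high temperature is high-quality energy because a large fraction of it can be turned into work; the same amount of heat delivered near the cold-reservoir temperature can do almost nothing.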

By embracing the molecular hypothesis, Boltzmann was able to make a powerful argument that brought the statistical viewpoint into thermodynamics. It’s how I approach the teaching of thermodynamics in my chemistry classes. Why does the Second Law do what it does? Sheer probability. When there are six-gazillions (or moles) of molecules, the most probable distributions far outweigh any seemingly ‘ordered’ macroscopic arrangements. That’s why entropy is often associated with disorder, a helpful if occasionally misleading analogy. We need Mack and Mike to help us think about what’s actually going on, and we have Boltzmann to thank for that point of view.
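
To make ‘sheer probability’ concrete, here is a minimal sketch in Python (a toy example of my own, not anything from the course materials). It counts microstates for N coins standing in for molecules in the left or right half of a box: the number of ways W to realize the 50/50 macrostate dwarfs the single perfectly ‘ordered’ arrangement, and Boltzmann’s S = k_B ln W turns that count into an entropy.

from math import comb, log

# N two-state "molecules" (heads/tails, or left/right half of a box).
N = 100

# Number of microstates W for two macrostates:
W_ordered = comb(N, 0)           # all on one side: exactly 1 way
W_even_split = comb(N, N // 2)   # the 50/50 macrostate

print(f"W(all one side) = {W_ordered}")
print(f"W(50/50 split)  = {W_even_split:.3e}")   # ~1.01e29 for N = 100

# Boltzmann's relation S = k_B ln W (k_B in J/K):
k_B = 1.380649e-23
S_even = k_B * log(W_even_split)
print(f"S(50/50 split)  = {S_even:.3e} J/K")

For N = 100 the even split already commands about 10^29 microstates to the ordered state’s one; at a mole of molecules the imbalance is beyond astronomical, which is exactly Boltzmann’s point.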
