Wednesday, December 26, 2018

Magic 101


I’m mulling a potential project involving magic and chemistry for my G-Chem 2 class in the upcoming semester. G-Chem 2 is mainly about thermodynamics and equilibria. Smaller but significant topics include kinetics and electrochemistry. In previous years, I have started the class thinking about energy, and this remains a strong theme throughout the semester. Last semester in my G-Chem 1 class we had one class discussion on the energy required to cast spells. Based on that discussion, it seemed we could group energy use into three categories.

Category #1: Moving matter. Provided you don’t have to move too many particles, this has the lowest energy cost. To lift a stationary solid 10 kg stone to a height of 1 m requires overcoming a potential energy of 100 Joules. To move water vapor molecules in the air to create a tablespoon’s worth of water in your hand might require less energy. Since gas molecules are already moving with plenty of their own energy, nudging them in a particular direction might not be too strenuous.
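
To put a number on this first category, here’s a quick back-of-the-envelope sketch in Python, using the 10 kg and 1 m figures from the example above and g ≈ 9.8 m/s²:

```python
# Back-of-the-envelope estimate for Category #1: lifting a stone.
# Uses g = 9.8 m/s^2; the 10 kg and 1 m figures come from the example above.

g = 9.8          # gravitational acceleration, m/s^2
mass = 10.0      # kg
height = 1.0     # m

potential_energy = mass * g * height   # E = m*g*h, in joules
print(f"Energy to lift the stone: {potential_energy:.0f} J")   # ~98 J, i.e. ~100 J
```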

Category #2: Chemical reactions involving making and breaking chemical bonds. Turning a cup-full of liquid water into gas would require putting in sufficient energy to break the hydrogen bonds between molecules. We’re now approaching the 10-100 kiloJoule range per mole. Stronger covalent, ionic or metallic bonds are typically in the 200-800 kiloJoule range per mole of bonds broken. This could be substantial; however, for many chemical reactions to proceed, one does not need to fully break the chemical bonds of the reactants. Partially breaking them is sufficient; barriers to chemical reactions are typically a quarter to a third of the strength of the chemical bonds. All this is to say that a spell requiring chemical reactions to take place is likely to cost more energy. A fireball requires starting a combustion reaction.
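
For a sense of scale on the cup-of-water example, here’s a rough sketch assuming a 250 mL cup and the standard enthalpy of vaporization of water (about 40.7 kJ/mol):

```python
# Rough estimate for Category #2: boiling away a cup of liquid water.
# Assumes a 250 mL cup, water density ~1 g/mL, molar mass 18 g/mol,
# and an enthalpy of vaporization of ~40.7 kJ/mol.

volume_mL = 250.0
molar_mass = 18.0        # g/mol
delta_H_vap = 40.7       # kJ/mol

moles = volume_mL * 1.0 / molar_mass          # grams -> moles (density ~1 g/mL)
energy_kJ = moles * delta_H_vap
print(f"{moles:.1f} mol of water, ~{energy_kJ:.0f} kJ to vaporize")  # ~14 mol, ~565 kJ
```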

Category #3: Changing the identity of atoms. It might not be as difficult to transfigure sand into glass and vice versa since both are primarily made up of silica (SiO2). However, turning copper into chalk (calcium carbonate) might not be so easy. Transmutation, changing one element into another element, requires changes to the nucleus. The energies required are likely to be in the MegaJoule range or higher – nuclear binding energies run to millions of electron-volts per nucleus, which works out to hundreds of gigaJoules per mole.

There is, however, a fourth category that can be related to energy, although it isn’t exactly an energetic quantity. This is the thermodynamic property known as entropy. Overcoming entropy would be a feature of many magical spells. Seeing a broken egg reassemble itself would seem like magic was at work. Creating a vacuum above an object to levitate it might be a subtler version of overcoming entropy. Thanks to the science of thermodynamics, we can calculate the energetic cost to overcome entropy for different physical and chemical manipulations.

Assuming that the energy involved in magic can be transmitted via electromagnetic (EM) radiation, we have a range of choices for different manipulations. Visible and ultraviolet light are well-suited to breaking chemical bonds. Microwaves are well suited to heating and therefore moving molecules around. Higher energy or lower energy EM waves may be called upon for different tasks. The advantage of EM radiation as an energy source is that it can be tuned specifically to the energy requirements of the task. Photochemistry is very specific with a laser-like focus (pun intended). Thermal energy, on the other hand, is undisciplined, inefficient and leads to plenty of waste. But it has its uses, especially if you’re trying to warm up.
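
As a rough illustration of why the choice of EM region matters, this sketch converts a few representative wavelengths (my picks, not canonical band edges) into energies per mole of photons and compares them with typical bond strengths:

```python
# Compare photon energies (per mole) across the EM spectrum with typical bond strengths.
# E = N_A * h * c / lambda. The wavelengths are representative examples.

h = 6.626e-34      # Planck constant, J s
c = 2.998e8        # speed of light, m/s
N_A = 6.022e23     # Avogadro's number

regions = {
    "UV (250 nm)": 250e-9,
    "visible (500 nm)": 500e-9,
    "infrared (10 um)": 10e-6,
    "microwave (1 cm)": 1e-2,
}

for name, wavelength in regions.items():
    energy_kJ_per_mol = N_A * h * c / wavelength / 1000.0
    print(f"{name:>18}: {energy_kJ_per_mol:8.2f} kJ/mol")

# Covalent bonds run roughly 200-800 kJ/mol, so UV/visible photons can break them,
# while microwaves (~0.01 kJ/mol) can only nudge molecules around (heating).
```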

I could come up with a variety of magical challenges and have the students figure out how to achieve their goals with the minimum amount of energy required. For example, what would it take to cast aguamenti to create a cup of water? (It’s a magical spell that creates water in the Harry Potter series.) It might depend on the humidity in the air, or how close you are to a water source. How might you actually cast fuego (the fireball spell in the Dresden Files)? What burnable material is around? How much energy would be required to combust it in an atmosphere of 21% oxygen?
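
Here’s one way a student might start on the aguamenti estimate. It’s a minimal sketch, assuming (purely for illustration) about 10 g of water vapor per cubic meter of air, which is roughly 50% relative humidity at room temperature:

```python
# How much air would aguamenti have to sweep through to gather a cup of water vapor?
# Assumes ~10 g of water vapor per m^3 of air (about 50% relative humidity at room
# temperature); the cup size and humidity figure are illustrative assumptions.

cup_mass_g = 250.0          # a cup of water, ~250 g
absolute_humidity = 10.0    # g of water vapor per m^3 of air (assumed)

air_volume = cup_mass_g / absolute_humidity
print(f"~{air_volume:.0f} cubic meters of air")   # ~25 m^3, roughly a small room

# Condensing that vapor also releases latent heat (~44 kJ/mol at room temperature):
moles = cup_mass_g / 18.0
heat_released_kJ = moles * 44.0
print(f"~{heat_released_kJ:.0f} kJ of heat released on condensation")  # ~610 kJ
```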

This sort of thinking tickles my brain. I wonder if the students would find it interesting or merely irrelevant. I suppose I won’t know unless we try it.

P.S. I did make it past a hundred, and since this is #101 for the year, I’ve aptly titled this blog post.

Friday, December 21, 2018

Brakebills Magic


Since the Harry Potter phenomenon, there have been many other books featuring youngsters who discover they are magically talented and attend a magical school hidden from the eyes of non-magical folks. One of these magical colleges is Brakebills, located in the Hudson Valley; and the protagonist is one high-school misfit named Quentin Coldwater. Brakebills is the fictional creation of Lev Grossman in his Magicians trilogy. As a regular reader of Time magazine, I recognized Grossman’s name because he wrote many of its tech articles. The first book in his trilogy is aptly titled The Magicians. Although it was written back in 2009, I’ve only just gotten around to reading it this winter break.


I typically read mostly non-fiction, but after starting this blog and musing about the theory of magic, I’ve started to explore several other books that might provide interesting insights in addition to fodder for my blog. Most recently this has been the Dresden Files, but I’ve also covered the Kingkiller Chronicles, The Strange Case of the Alchemist’s Daughter, and of course re-reading the Harry Potter series. I also watched Once Upon a Time, the TV series. So here’s what I’ve learned so far from the first half of The Magicians.

The overall picture is articulated by Professor March in Quentin’s first lecture class at Brakebills: “The study of magic is not a science, it is not an art, and it is not a religion. Magic is a craft. When we do magic… we rely upon our will and our knowledge and our skill to make a specific change to the world. This is not to say that we understand magic, in the sense that physicists understand why subatomic particles do whatever it is that they do… In any case, we do not and cannot understand what magic is, or where it comes from, any more than a carpenter understands why a tree grows. He doesn’t have to. He works with what he has. With the caveat that it is much more difficult and much more dangerous and much more interesting to be a magician than it is to be a carpenter.”

As Quentin starts to learn magic in his first year, he discovers that it is very tedious, very finicky, and that “much of spellcasting… consisted of very precise hand gestures accompanied by incantations to be spoken or chanted or whispered or yelled or sung. Any slight error in the movement or in the incantation would weaken, or negate, or pervert the spell.” Worse, one had to learn all manner of variations because the environment influenced how a spell worked. “Magic was a lot wonkier than Quentin thought it would be.” He also studies the history of magic and learns that “magic-users had always lived in mainstream society, but apart from it and largely unknown to it.” Sounds similar to the Statute of Secrecy in the Harry Potter books. The magic-users who are known to the world-at-large turn out not to be “the towering figures” but rather those of only “modest ability”. Da Vinci, Bacon, Newton, Nostradamus and John Dee are name-checked. However, “by the standards of magical society they’d fallen at the first hurdle: they hadn’t had the basic good sense to keep their sh*t to themselves.”

There are occasional references to the Harry Potter books as fictional stories that Quentin and some of his classmates have read, although Quentin’s true love is another fictional series named the Fillory novels. They sound Narnia-ish, but I suspect their importance will increase later in The Magicians since constant reference is made to them and I can see how some of the story threads are gathering. I’ve only finished the first half where Quentin makes it through Brakebills, has just had his graduation ceremony, and is about to move to Manhattan.

Sad to say that I haven’t encountered much interesting theory that has made me ponder the nature of magic. (Maybe more will be revealed later.) As at Hogwarts, electronic devices do not work well at Brakebills because of all the magic. There is mention of a professor who specializes in quantum theory and presumably its intersection with magic, but there is no elaboration. (Too bad, because I would have found the connection very interesting.) There is a part where Quentin learns that becoming a proficient and powerful magician is not just about learning the words, hand movements, and theory, but about somehow opening oneself up to imbibe it as second nature or a second skin. How exactly this happens is vague, although there is a clear experiential component and it requires pushing oneself to extremes.

One tantalizing tidbit does come from a professor musing about the connection between language and the material world. “Sometimes I wonder if man was really meant to discover magic… If there’s a single lesson that life teaches us, it’s that wishing doesn’t make it so. Words and thoughts don’t change anything. Language and reality are kept strictly apart – reality is tough, unyielding stuff, and it doesn’t care what you think or feel or say about it. Or it shouldn’t. … Little children don’t know that. Magical thinking: that’s what Freud called it. Once we learn otherwise we cease to be children. The separation of word and thing is the essential fact on which our adult lives are founded. But somewhere in the heat of magic that boundary between word and thing ruptures. It cracks, and the one flows back into the other, and the two melt together and fuse. Language gets tangled up with the world it describes.”

The Magicians rushes through Quentin’s five-year education in half the book, unlike the one book per year in the Harry Potter series. Magic is punctuated with college antics: drinking, sex, and teenage angst. I found the pace choppy and the story so far is not all that compelling or interesting. That being said, the book is divided into two parts and I’ve only just finished Part 1. For now, I’m planning on reading the rest of the book to give it a fair shake, and then we’ll see whether I read the next book in the trilogy. Apparently there’s also a TV series, but I’m not sure if that’s worth watching yet.

Wednesday, December 19, 2018

P-Chem Advice


Now that I’ve turned in final grades, I have the opportunity to read my end-of-semester teaching evaluations. Except they’re no longer officially called teaching evaluations. Last year, the faculty voted in favor of a re-designed form. It’s officially called the Student Evaluation of Educational Experience (SEEE). The name makes sense, because the questions now ask the students to reflect on their learning experience. I particularly like that we added questions asking them how much they perceived they have learned, how much effort they believe they put in, and their perceptions of the workload. I’m certainly biased in favor of the new form because I was on the ad-hoc committee that spent the better part of a year redesigning it. The only drawback is that the new name is a mouthful, and so folks are likely still referring to the forms as teaching evaluations.

One open-ended question that we added: “What advice would you give to another student who is considering taking this course?”

Here’s what my P-Chem (Quantum) students thought (italicized), and what I thought about what they thought.

The best overall summary from a student that encompasses most of the important points: Stay on task; read before coming to class; go over your notes after class; visit office hours; do each problem on the problem set shortly after class.

One thing I tried to stress every class period was to point out the specific problem set questions they should attempt after class. During the lecture, I would provide context for each problem. Here are some related comments on the same issue.
·      Highly recommend doing assignments the day they’re assigned. Study a little every day.
·      Go to office hours, read before class, and do homework in timely manner.
·      Work hard on problem sets, it helps for exams. Ask questions if you are confused. Take time to really annotate problem sets.
·      Do the homework in advance and go to office hours.
·      Come to class always. Do problem sets. Study a lot.

One potential disadvantage of my new approach to problem sets is that students still get full credit even if they are unable to finish the problem set as long as they do a decent job annotating after getting the solutions. It puts more responsibility on the students; this is a good thing in my opinion, and I think appropriate for juniors and seniors. But as you can see from the comments, many of the students indicated that doing the problem sets was very important. Close to half the class mentioned this in some form.

Math (calculus) is always a concern as illustrated below. This came from students who barely scraped by in the class, but also from students who aced the class.
·      If you haven’t taken a math class in a couple of years because you either earned AP credit for calc, or finished early on, do not take this class without first brushing up on calculus.
·      Review previous classes and lessons, like calc and chem.
·      Be devoted to really engaging the material. Take the time outside of class to really become proficient in the required math skills.
·      Read the book, review math concepts.
·      Remember your Calc 1 class. Calc 3 would help but not needed.
I thought the last comment was funny because Quantum has a lot of integral calculus mostly found in Calc 2, while differential calculus is mostly found in Calc 1. Students need both Calc 1 and 2 as pre-requisites for P-Chem. This student had taken Calc 3 (and had very strong math skills – yes, I can identify the student’s handwriting) and realized that comfort with math helps.

One student did something that I did as a student: Rewriting my notes to study for tests proved to be very useful as a study tool. It really helped me understand the conceptual part of the class.

Another student’s brief, sage advice: Put in the work, it is worth it.

And my favorite cryptic advice from a student: Be ready for a course which questions how you think of chemistry. I particularly like this one, because I think this student tasted the mind-blowing experience that is Quantum chemistry. It makes you look at everything in a new way. Since my specialty is chemical bonding, we delved into the nitty-gritty and surprising details of how chemical bonds are represented and the pros and cons of different models. Electrons are enigmatic creatures; we don’t quite know their boundaries!

Since students are encountering the new form for the first time, I’m hoping that the advice will build for the next class. I’m definitely going to list all the advice provided by the students this semester in P-Chem 1 for my upcoming P-Chem 2 class next semester. For the most part, the same advice holds. (Biochemistry majors are only required to take one semester of P-Chem; they get to choose which, but the majority opt for P-Chem 2. Chemistry majors have to take both.) Hopefully this helps the next group, and then the next... the gift that keeps on giving.

My evaluations from students are generally quite good, but they were even better than I expected this semester. It might be that the redesigned form asking the students to be more reflective leads them to (gasp!) actually be more reflective! But it may be that I had anomalously small G-Chem (Honors) and P-Chem classes this semester and that I had built a better rapport with the students because of this. The test will be next semester when I have a third more G-Chem students and double the number of P-Chem students. It will be a busy semester so I better enjoy my break now!

Monday, December 17, 2018

Boundaries: Robust Regularity


I’ve been pondering the difference between natural and artificial boundaries, thanks to reading another chapter of Carving Nature at Its Joints. Chapter 7 by Achille Varzi is titled “Boundaries, Conventions and Realism”.

The chapter begins by contrasting ‘artificial’ human-made political boundaries such as state and country lines that seem arbitrary, with ‘natural’ geographical boundaries such as a river, shoreline or mountain range. The E.U. is an interesting case where the landscape might not seem different, but the different languages on the signs alert you that you have crossed a line – the inhabitants on one side of the line might culturally behave quite differently from those on the other side. Here in the U.S., there might be less of a distinction. The author writes: “Most drivers feel nothing at all as they pass the border between Wyoming and Idaho, a line whose embarrassing geometric straightness says very little about its history (or says it all). Yet even here there are differences, and Idahoans are proud of their license plates just as Wyomingites are proud of theirs.”

How thin is the artificial state-line boundary? Is it infinitesimal? As thin as the planar node of a p-orbital? Or does it depend on the scale at which you’re drawing a map? Even then things get tricky. For example, how long is a coastline? The boundary seems more clear-cut when you look at a map. But as you start to zoom in, things get fuzzier with more detail. Strange, is it not? As you see more details, the actual boundary gets fuzzier. More fractal-like. With its twists and turns, the coastline starts to look longer at close range than from a bird’s eye view. Fractals beget more fractals.
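
A toy illustration of this coastline effect uses the Koch curve: each refinement swaps every segment for four segments one-third as long, so the measured length grows by a factor of 4/3 at every step. The 100 km baseline below is an arbitrary assumption.

```python
# Toy illustration of the coastline effect using the Koch curve.
# Each time the 'ruler' shrinks by a factor of 3, every segment is replaced by
# four segments, so the measured length grows by a factor of 4/3.

baseline_km = 100.0   # length measured with the coarsest ruler (illustrative)

length = baseline_km
ruler = baseline_km
for level in range(6):
    print(f"ruler = {ruler:8.3f} km  ->  measured length = {length:7.1f} km")
    ruler /= 3.0
    length *= 4.0 / 3.0
```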

In the world of chemistry, the same thing happens. As I write, I’m looking at the smooth surface of my laptop. It’s probably an aluminium alloy of some sort and feels smooth to the touch. But zoom in at the atomistic level and it is full of ridges, hills and valleys. Is Kansas as flat as a pancake? Possibly flatter. Depending on the relative scale of the protuberances. The atoms in the body of my laptop are chemically bonded to each other – they are joined together in some way. The piece of dead skin I just wiped off my keyboard? I consider it as separate from my laptop. Not chemically bonded. Was that skin flake previously chemically bonded to the (relatively smooth) skin on my hand? Likely so. Once again, things get more complicated when you zoom in closer.

What is the boundary of an atom? Is it natural or is it artificial? At first glance, you might think it’s natural. It is certainly depicted that way in chemistry textbooks. Your mind’s eye likely pictures colored spheres when you think of atoms. But as my G-Chem students learn in the first week of class when encountering the Rutherford model of the atom, it’s unclear where exactly the boundary is located. I draw a dashed line to indicate its fuzziness. When we get to the orbital descriptions of electrons, we see three-dimensional shapes once again. But since the orbital is defined as a probability distribution, we typically draw the orbital shapes with an arbitrary boundary – one that encloses the electron with 90% probability. The probability distribution can be modeled mathematically (for hydrogen-like orbitals), but the wavefunction has a long asymptotic tail. There is no ‘hard cut-off’. The same is true of chemical bonds. The bond dissociation curve also features a long asymptotic tail.
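
For the hydrogen 1s orbital, that 90% boundary can actually be computed. Here is a small sketch using the standard enclosed-probability expression for the 1s state and simple bisection; the 0.529 Å Bohr radius converts the answer to Angstroms.

```python
# Where is the '90% boundary' of a hydrogen 1s orbital?
# The probability of finding the electron within radius r (in units of the Bohr
# radius a0) is P(r) = 1 - exp(-2r) * (1 + 2r + 2r^2). Solve P(r) = 0.9 by bisection.

import math

def enclosed_probability(r):
    return 1.0 - math.exp(-2.0 * r) * (1.0 + 2.0 * r + 2.0 * r * r)

lo, hi = 0.0, 10.0
for _ in range(60):                      # simple bisection
    mid = 0.5 * (lo + hi)
    if enclosed_probability(mid) < 0.9:
        lo = mid
    else:
        hi = mid

r90 = 0.5 * (lo + hi)                    # in Bohr radii
print(f"90% boundary: {r90:.2f} a0 = {r90 * 0.529:.2f} Angstroms")  # ~2.66 a0, ~1.41 A
```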

Does this mean that all boundaries are arbitrary? Maybe there are no such things as clearly defined natural boundaries, and they’re all somewhat artificial. Perhaps we’ve descended into a post-modern, post-fact, post-truth cynicism with all this arbitrariness. But not being able to clearly define the boundary in detail doesn’t mean there isn’t a natural, albeit fuzzy, way to distinguish two separate states. We do it all the time, classifying things into one category and other things into other categories. We might scratch our heads when odd cases show up – hello platypus! But eventually an adequate space is found as we move other boundaries around.

Shorelines recede. Rivers erode a bank. Lakes dry up. Mountains rise slowly, volcanoes perhaps a little quicker. Even natural boundaries seem fluid in the long view of geological time. But what about atoms and chemical bonds? Unless the fundamental laws of physics change, an electron that’s less than 1.5 Angstroms from a hydrogen atom nucleus is likely to be “bound” to it. An electron further away, noticeably less so. One could quibble about whether the dividing line should be 1.4 or 1.6 Angstroms, but the scientists would very readily say that 0.5 Angstroms is bound and 3 Angstroms is not. (0.53 Angstroms is the Bohr radius, the most probable distance of the electron from the nucleus.) We might not agree on the exact point of division, but there are vast swaths of agreement about what constitutes the ‘normal’ state of affairs. (Similar arguments can be made about the lengths of chemical bonds.)

There’s a neat phrase to encompass this state of affairs: ‘robust regularity’. Boundaries may get fuzzier as you attempt to look closely at the details, but one doesn’t need to throw the baby out with the bath water. Heisenberg’s Uncertainty Principle may not allow us to locate exactly where an electron is, but chemists can do amazing things by moving those electrons around, and all we need is a rough, relatively fuzzy idea of where they whiz around. Nature is regular. No two expert butchers would carve the cow along precisely the same lines, but their cuts are likely to be close when superimposed. The boundaries are robust. Up to a point. We can’t exactly define that point in very, very close detail, but pragmatically it doesn’t matter.

After reading this chapter, I’m starting to see robust regularity everywhere. Not just in my field of chemistry. I think it’s a useful way to think about similarities and distinctions. The author writes: “Such is the magic of boundary lines: they are thin, yet powerful; they separate, and thereby unite; they are invisible yet a lot depends on them, including one’s sense of belonging to a country, a people, a place.” This reminds me of the first Harry Potter book where we are introduced to a boundary separating the wizarding and non-wizarding worlds – a pub called the Leaky Cauldron. I like the name. It hints that the boundary is leaky, and not a hard cut-off. Muggles do find their way into the magical world and vice-versa.

It’s amazing that we live in an ordered world. As a scientist, being able to discover how things work and design new technologies is dependent on that regularity. But nature is also robust, perhaps magically so. Things get fuzzy if we try to examine them too closely, escaping our senses as if a magic trick was being played. We may not know how the trick actually works but we’ve seen it enough times to make regular robust predictions. That’s how science works!

Wednesday, December 12, 2018

Ad-hocracy


Uberification.

Yes, it’s a word, and you can probably guess what it means. There’s even a book called Uberification of the University. I haven’t read it yet but maybe I should.

I did, however, read Temp by Louis Hyman. Hyman is a professor of economic history, and the book traces the rise of the gig economy. Uber is one of the most visible forms at present, especially in the U.S. where car ownership is widespread. Temp is appropriately subtitled “How American Work, American Business, and the American Dream Became Temporary.” But the gig economy is not new. Hyman traces the rise of Manpower and McKinsey, and shines a light on the many ‘hidden figures’ who have long faced job insecurity – women and people of color in the U.S.


The gig economy has been here for a long time. It’s simply more visible now. Those lamenting the good old days of secure jobs have a narrow vision of those days – they were only good for a small segment of the population, particularly those who had some cultural and social power. When I was growing up in a faraway country, America was the land of opportunity. While things have worked out well for me so far, that’s not the case for many others, perhaps most. Being a tenured professor means that I am among the sub-1% of people with the highest job security. I should count myself very, very, very lucky. A generation from now, I think such positions will be extremely rare.

I did start out as an adjunct professor (i.e., not on the tenure track) at my current institution. A physical chemist was going on sabbatical and someone was needed to cover P-Chem. I was a postdoc, and doing very well research-wise, but I was slightly bored and missed teaching very, very much. (I’ve known since I was young that teaching would be my career of choice.) So I left my postdoc early and took up the adjunct position, but I continued to move my research projects along and publish papers, and I mentored undergraduate research students as an adjunct professor. That’s an uncommon situation, but I was (and still am) in a particularly supportive department. A couple of years later, I joined the tenure track and the rest is history.

Until now.

Higher education is struggling. Private colleges not in the elite group are in trouble if they are mostly tuition-driven (i.e., a small endowment at best), not well-located, and lacking name recognition. My institution is doing fine overall. The endowment is decent, although not fantastic. There is some name recognition in some areas. But we’re also in a very desirable location – which means that students will keep coming, at least for a while, if we’re able to keep tuition and financial aid manageable. I’m pretty sure my job will be stable for at least ten years, and possibly twenty years – by which time I would have retired. Not planning to be a hanger-on.

But all this makes me wonder if I should be thinking much harder about the future of education. Is what we’re doing sustainable? I’m not so sure anymore. What might teaching look like in the gig economy? An example might be the blossoming web of tutoring or extra classes in Asia where intense national exams act as a sieve. The tutoring economy is huge, decentralized and highly competitive. It also has features of a superstar economy. The ‘top’ tutors who have built up a solid reputation and clientele are much sought after. Some have opened schools and centers with multiple branches across a city, and they charge premium prices. As you might surmise, this further widens the gap between the haves and have-nots.

We live in an ad-hocracy. This will be the experience of most of our current students when they graduate. A small number may land those coveted few positions that still provide good-old-days job security with benefits and protections. The majority will have to learn to live with the new normal of job insecurity. Corporations have restructured to emphasize shorter-term profits and quick growth over longer-term stability. McKinsey’s consultants, themselves part of the (albeit high-paying) gig economy, led America (and other parts of the world) into relying more on temp-supplying agencies such as Manpower. Flexibility is the new watchword. Gigging through TaskRabbit and Upwork has become a more desirable option than the gig-shift-work of Starbucks and its lookalikes. In the penultimate chapter of his book, Hyman paints divergent outlooks for winners and losers.

For those at the top… who were ambitious, smart and entrepreneurial, the digital economy was just shorthand for opportunity on a global scale. Energetic entrepreneurs could have an idea, assemble a team through Upwork, and bring a product to market with few barriers. For everybody else, the digital economy was a grind. Labor laws offered few protections, designed as they were for an industrial age. Employers, given a chance, evaded wages and benefits… Office work no longer felt like a path to the middle class… Uber drivers hated the low wages and lack of choice but lamented the coming of driverless cars, which promised to eliminate even that last refuge of commodity skill… Uberification portended a future of winners and losers, of insider and outsiders, of differentiated employees and commodity workers. Interchangeability was the core of the workforce problem… workers in the twenty-first century need to find monopoly skills. For the rocket scientists and visionaries, this requirement is not a problem, but for the rest of us, who are more or less human, commodity workforces can’t help but seem like a race to the bottom.

If that sounds bleak, Hyman paints a potential silver lining in the last chapter of his book. He has some interesting suggestions for how the crumbling corporation can evolve into a digital cooperative with the potential to provide economic benefits (and some semblance of security) while preserving flexibility. He briefly discusses the pros and cons of portable benefits, universal basic income, and reorienting the movement of capital into small business rather than consumer debt. The answer is resolutely NOT to “turn everyone into software engineers” but to allow a myriad of talents and skills to find their niche in the long-tailed digital economy. The old idea of job security is indeed going the way of the dodo, but there might still be ways to provide life security for a thriving populace.

One of the most interesting vignettes in Temp is how Tesla’s self-driving modules leapfrogged past competitors by simply including sensors in their mass-human-driven cars to train the artificial intelligence in the modules. Hyman suggests that virtual reality training of robot A.I. may lead to similar leaps. Think there’s a task only humans can do? You’d be surprised at what robots can do through machine learning. I think I have a movie idea for a prequel to the Matrix series based on Hyman’s descriptions. That being said, the full strength of Hyman’s argument comes to the fore when reading the entire book. His expert blend of economics and history makes a solid argument for how and why the ad-hocracy has evolved to its current state. I highly recommend it, even as it makes me increasingly skeptical of the value of consultants.

Saturday, December 8, 2018

Magic, Science, Energy, Beasts


Putting together an end-of-semester event for my class close to Finals is probably not the best idea. My excuse? I was constrained by Hollywood and Holidays.

The Fantastic Beasts movie sequel was released in the U.S. the weekend before Thanksgiving. I make it a point not to go to the movies on opening weekend to avoid the crowds. The next weekend was Thanksgiving. Then the next weekend was my General Chemistry take-home Exam #3. That left this weekend, more specifically, today. Amazingly, three-quarters of my small Honors class came out to lunch at a nearby mall, and most of them also watched the movie. But to warm up for the event, we had a class discussion on magic, energy and science yesterday afternoon.

As pre-reading, I had directed my students to an early blog post that hinted at the relationship between magic and energy, both mental and physical. In class, I gave a brief intro on why I thought the topic was interesting and how it relates to interdisciplinarity, inquiry and the liberal arts. I distinguished between spells, potions and imbuing objects with magic – and why one might choose different approaches. I had the students suggest magic spells of interest (mostly with reference to the Harry Potter series). Once we had about fifteen on the board, we divided into small groups to analyze the spells; in particular, the students attempted to rank the spells by energy cost and difficulty.

The ‘fun’ part (at least for me) is that figuring out the difficulty of a spell requires thinking very specifically about how exactly the spell causes matter to interact. For example, a levitation spell could involve diminishing local gravitational effects. How would you do that? Do you decrease the mass of the object, and if so, how? Or do you move air particles underneath the object to push the object upwards? Or could you create a vacuum above the object by moving air particles away? Depending on how you effected the spell, different energy considerations would be required. We’ve been learning about the properties of gases in general chemistry, and not surprisingly many of the student ideas had to do with moving particles around.
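
One way to make the vacuum-above idea concrete is to balance the pressure difference against gravity. Here is a minimal sketch, with an assumed 10 kg object and a 10 cm × 10 cm footprint chosen purely for illustration:

```python
# How big a pressure difference would a levitation spell need?
# Sketch: if the spell thins the air above an object, the higher pressure below
# pushes it up. Balance: delta_P * area = m * g. The 10 kg mass and
# 10 cm x 10 cm footprint are illustrative assumptions.

g = 9.8
mass = 10.0                      # kg
area = 0.10 * 0.10               # m^2 (10 cm x 10 cm footprint)
atmospheric_pressure = 101325.0  # Pa

delta_P = mass * g / area
print(f"Pressure difference needed: {delta_P:.0f} Pa "
      f"({100 * delta_P / atmospheric_pressure:.0f}% of atmospheric pressure)")
# ~9800 Pa, only about 10% of an atmosphere -- a full vacuum would be overkill.
```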

Creating water via aguamenti was deemed one of the easier spells, if the method used was to draw water vapor from the surroundings and condense it in a particular location. More difficult would be finding a source of hydrogen to react with the relatively abundant oxygen in air to create water; this would also require a sufficient source of energy to break chemical bonds before the atoms can recombine to form H2O (although once it gets going, the reaction releases more energy than it took to start). Resurrection would be the most difficult, and certainly more difficult than a killing spell.
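
A quick bond-energy bookkeeping sketch (using average bond enthalpies, which are textbook approximations) shows the trade-off: you must invest energy to break the H–H and O=O bonds, but the reaction pays it back with interest once the O–H bonds form.

```python
# Bond-energy bookkeeping for making water from hydrogen and oxygen:
# 2 H2 + O2 -> 2 H2O. Energy in = bonds broken; energy out = bonds formed.
# Average bond enthalpies (kJ/mol) are textbook approximations.

H_H = 436.0   # H-H bond
O_O = 498.0   # O=O bond
O_H = 463.0   # O-H bond

energy_in = 2 * H_H + 1 * O_O          # break two H-H and one O=O
energy_out = 4 * O_H                   # form four O-H bonds (two per water)
net = energy_in - energy_out           # negative => net release of energy

print(f"Bonds broken: {energy_in:.0f} kJ, bonds formed: {energy_out:.0f} kJ")
print(f"Net: {net:.0f} kJ per 2 mol H2O ({net / 2:.0f} kJ/mol)")  # ~ -482 kJ, ~ -241 kJ/mol
```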

The vanishing spell brought up discussion about whether you were making an object merely invisible or whether the object was teleported or even whether the spell was used to manipulate a hallucination in the mind of someone else. Memory spells were thought to be difficult – the neurobiology isn’t well understood – and perhaps hallucinatory perceptions would be similarly difficult. I briefly brought up invisibility cloaking in real life, although I forgot to mention the related ‘ring of protection’. We did discuss how a shield spell such as protego might be effected by moving particles in the air to create a barrier.

I was a little disappointed that the movie (in my opinion) did not provide much interesting fodder for future discussion on magical theory. There wasn’t as much about the magical beasts either, unlike the first Fantastic Beasts with the demiguise. The climax did have energy-beasts, but otherwise there wasn’t anything too fantastic. I did enjoy seeing Nicolas Flamel say he was immortal because he was an alchemist, and that he had no food in the house – apparently being immortal doesn’t require him to eat. That being said, the movie was well executed, had good special effects, a somewhat interesting storyline (although a little convoluted) and the requisite setup for its sequel. My overall rating: the movie was good, enjoyable, but not fantastic.

Wednesday, December 5, 2018

Do Molecules have Shadows?


Do molecules have shadows?

I don’t know. It’s a question I wrote on the white board in my office two weeks ago. Several students have since asked me about this, leading to some interesting discussions! The impetus for the question comes from reading Carving Nature at Its Joints. Chapter 6, written by Roy Sorensen, is titled “Para-Natural Kinds”.

What are para-natural kinds?

Here is the author’s definition. “A para-natural kind is an absence defined by a natural kind. For example, cold is defined as the absence of heat and shadow as an absence of light.” This constrains the definitions to connect with physical measurements involving heat and light, rather than more abstract concepts – evil is the absence of good, for example.

Sorensen argues that “absences are not substances and so are not natural kinds”. One neat visual example he provides is from The Tomb and Shade of Washington by James Merritt Ives. Shade takes two meanings in reference to the tomb and the trees in the landscape. There are the shades or shadows ‘cast’ by these structures, but between the two trees is an outline of Washington, a spirit shade perhaps. Very clever!


Let’s tackle the issue of heat. Technically, as a physical chemist, I would define heat as the transfer of thermal energy. Thermal energy arises from the motion of atoms and molecules. Temperature gauges the average thermal energy of the particles – and is itself a tricky concept with extensive treatment by the philosopher of science Hasok Chang. I don’t use the word ‘cold’ when I’m teaching chemistry; we do talk about higher and lower temperatures as a measure of thermal energy. I haven’t done so consciously because of the philosophical implications; I’ve simply imbibed the culture and language of my discipline – we use the words heat, temperature and thermal energy often, but rarely if ever use the word cold. I’ve never brought up this issue with my students, simply because I’ve never thought about it philosophically, but I might do so next semester since I’m teaching thermodynamics in both my G-Chem 2 and P-Chem 2 courses.

The issue of light is stranger. When light falls on an object, depending on the source direction of light and the shape of the object, it casts a well-defined shadow. The shadow moves, grows, and shrinks (as the object or the light source is moved) in ways that can be calculated and predicted exactly. Things are more complicated, however. If an object reflects some light, then it also casts a ‘para-reflection’. Sorensen provides visual illustrations to distinguish these two cases. He also provides a practical example of why one might care, comparing a white beach ball and a black beach ball sitting on the surface of a pond. “A hot duck that wants to cool off will paddle to the ball’s shadow, not to its para-reflection.”

Sorensen makes a further distinction between para-natural kinds and artifacts when he discusses shadows. I can’t summarize his text more clearly than what he has already written so here are snapshots of the relevant text.


If astronomers care about shadows, should chemists? After all, we’re interested in the interaction of light and matter. (And if magic is tied to the electromagnetic spectrum, a spell-caster should care greatly.) This is what led me to the original question: Do molecules have shadows?

In a sense, the simple answer is yes. Molecules have mass and shape. Some molecules are particularly ‘good’ at absorbing light of particular wavelengths. The entire dye and color industry is predicated on such properties. So, perhaps if you shine a swath of light at a molecule, then it should ‘cast’ a shadow. Do molecules reflect light? A macromolecular solid certainly would. A liquid-ish colloid would provide interesting dynamic patterns. But our ‘real world’ experience is built on the fact that the size of photons (if they have a ‘physical’ measurable size) is much smaller than the size of everyday objects. Analog-ously (pun intended) one might say that the wavelength of visible light is much smaller than the lengthscale of everyday objects.
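
A crude scale comparison suggests why the everyday picture of a shadow gets shaky at the molecular level: a small molecule is roughly a thousand times smaller than the wavelength of visible light, so the light diffracts around it rather than being blocked cleanly. The sizes below are order-of-magnitude illustrations.

```python
# Scale comparison: visible light vs. everyday objects vs. a small molecule.
# A crisp shadow needs the object to be much larger than the wavelength of the light.
# Sizes are order-of-magnitude illustrations.

visible_wavelength = 500e-9    # m (green light)
laptop_size = 0.3              # m
molecule_size = 0.5e-9         # m (a small molecule, ~0.5 nm across)

print(f"laptop / wavelength:   {laptop_size / visible_wavelength:.0e}")    # ~6e+05: crisp shadow
print(f"molecule / wavelength: {molecule_size / visible_wavelength:.0e}")  # ~1e-03: light diffracts around it
```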

What happens when a single photon interacts with a single molecule? There is a quantum mechanical interaction, and it can be calculated. (I won’t go into the details here.) But I don’t know if this would lead to a shadow in any meaningful sense that we can conceive. So I don’t know the answer to the original question.

What is a hole? Or a gap? If it is the absence of matter (be it atoms, electrons or photons), is a hole a para-natural kind?

The physical chemist Peter Atkins has suggested that all of chemistry can be boiled down to just four processes: proton transfer (Bronsted acid-base), electron transfer (redox), sharing unpaired electrons (free-radical), sharing electron pairs (Lewis acid-base). You could reduce chemistry even further, to essentially one reaction type: moving electrons into ‘holes’. I think I know what an electron is, but what’s an electron-hole? The absence of an electron? The shade of an electron? I like this latter definition – it sounds ghostly to me!