Saturday, September 30, 2023

Mystery of Mastery

Adam Gopnik’s new book sounded intriguing: The Real Work, “On the Mystery of Mastery”. The title comes from a colloquialism among magicians, referring to the “accumulated craft, savvy, and technical mastery that makes a great magic trick great.” Gopnik’s opening chapters discuss the mystery of performance, what he learned in interviewing magicians, and his own efforts in taking drawing classes. Gopnik’s stories about himself are deeply personal. He muses on his expertise as an art critic in sharp contrast to his own inability as an artist. Can one really talk about art without knowing what it’s like to perform the real work?

 


Maybe I’m naïve, but I don’t think of mastery as mysterious. You have to be interested, put in the work, and aim for excellence. It takes lots of practice. Gopnik’s expertise is as a writer, and he has ‘book knowledge’ in the arts. My expertise is as a teacher, and my theoretical knowledge is in chemistry. As a computational chemist, I’m not a lab practitioner per se, and I don’t have good hands in the lab. But I do have the intuition to sniff my way around a computational or theoretical problem – although I might not always have the math or programming expertise to surmount it.

 

I felt Gopnik’s book was uneven. He attempts to draw out general principles, but either he doesn’t do so successfully or I was too thick to understand his profundity. He has a knack for turning a phrase, but I got the feeling that many of the words were superfluous. The best parts of the book were the quotes from his ‘teachers’ in whatever skill he was trying to pick up (driving, dancing, drawing, and more). Did those teachers grasp the real work in their own spheres of expertise? Gopnik seems to assume so tacitly. I’m inclined to agree, simply because these folks have put in long years of practice as teachers of their craft, and you have to know your craft well to teach it. What I enjoyed most was that the book prompted me to think about my craft of teaching.

 

Am I a master teacher? I don’t think so. I have a pedagogical bag of tricks, honed over years of practice. Many of my colleagues have picked up one of my mainstays – short, frequent, low-stakes in-class quizzes that take less than five minutes, with students writing their answers on an index card. I’m not sure I’m good at teaching someone else how to teach, nor have I tried to do so systematically. But I do know and practice the elements that go into good teaching: being organized, being well-prepared, knowing your subject matter well, displaying empathy, being enthusiastic about the material, and choosing pedagogical approaches judiciously to match what I want the students to learn. There’s certainly a performance aspect to the business, too. You also need to keep learning yourself! Teaching is both art and science.

 

Teaching biochemistry for the first time has been interesting for me metacognitively. Before this endeavor, I kinda sorta knew swaths of the subject material – but not very well or deeply. This is the position students are in. When they first learn something, they only kinda sorta know it; they don’t have mastery. It takes practice and thinking more deeply for things to click. It also takes content knowledge, some of which needs to be memorized so you have it at your fingertips – that’s what allows you to go deeper. Getting to mastery is not a mysterious business. I know what I need to do, and I just need to be willing to do it. There’s the rub. The spirit might be willing but the body is weak.

 

But in a sense there is a mystery to learning. How exactly learning happens is unclear. There are steps that work most of the time for most people, but grasping a new concept seems to be a gestalt experience – something just clicks. We can’t explain exactly how or exactly why. The eureka moment can be dramatic, but it can equally be subtle; sometimes you don’t recognize the aha moment until you’ve passed it. I know there was a time when I didn’t understand stoichiometry, and then sometime later I did, but I can’t tell you how it happened. No flash of a light bulb. I can say the same for most of the concepts I’ve learned in chemistry. As a teacher, I can sometimes see the flick of the brain switch in a student, but more often I don’t. Perhaps some background consolidation of understanding happens when we sleep or when we’re not thinking about the matter at hand.

 

Do I do the real work of teaching? Possibly, to some extent. And I know there are multiple ways to do so. My pedagogical approaches may not fit hand-in-glove with another instructor’s. This may also be true for some of my students. But students are, overall, more similar than they are different. And this might also be true of good teachers. One thing I have learned is that the work never stops. I can get into a rut, I can be overconfident and underestimate my preparedness, and so the real work actually requires constant work. No mystery there. I need to stay alert and keep putting in the work. That’s what mastery requires.

Sunday, September 17, 2023

Fighting Ghosts: Lockwood version

I don’t subscribe to Netflix, and therefore have not watched Lockwood & Co. It is based on a book series by Jonathan Stroud (of Bartimaeus fame). I picked up the first book, The Screaming Staircase, from my local library for my weekend read.

 


The protagonists are three ghost-hunting teens. Why teens? Some of them can sense ghosts, an ability they lose in adulthood. The setting is an England where hauntings have increased. No one is sure why, but they call it the Problem. Anthony Lockwood, the leader of the trio, has set up his own Psychic Investigation Agency. His talent is Sight – he’s particularly adept at seeing apparitions and death glows (sites where murders took place). George is his talented researcher. And the story is told from the point of view of his newest employee, Lucy, who can Hear, and who can sense apparitions by touching objects in a haunted vicinity. The narrative action is brisk, and Stroud has a talent for being very descriptive without bogging down the text. It’s an engaging story – hard to put the book down once you’ve started! But on to the elements of subduing ghosts.

 

In Lockwood & Co, ghosts only come out at night, and they do not venture far from their Source – typically their dead bones. But they can lodge themselves in other objects. Spiders and cobwebs are a sign of a haunted area. (Not sure why.) As ghosts manifest, one might see ghost-fog, a thin greenish-white mist that seems to stay low to the ground. (Gaseous particles more dense than air?) It is speculated that this matter is ectoplasm, which is harmless. More dangerous is ghost-touch, bodily contact with an apparition, which has a visible shape usually matching the deceased. (However, there are Changers that can alter their shape.) The presence of ghosts decreases the temperature dramatically (like dementors), suggesting to me that they draw (thermal) energy away from the surroundings, and perhaps ghost-touch draws energy from human cells. A ghost-touch injury is fatal if not treated quickly. Ghosts come in several levels. We encounter Type One (low-grade) and Type Two (more dangerous) in The Screaming Staircase. I’m guessing the stakes will be upped in the sequels.

 

There are two large, famed agencies that employ hundreds of teen agents who help protect the populace by eradicating the ghosts. Lockwood’s new agency is an upstart minnow. There’s also a government Department of Psychic Research and Control that licenses these agencies, but it is also engaged in researching the Problem. Both the department and the local populace hire such agents to rid themselves of hauntings. The government has set up warning bells and ghost-lamps – high-intensity white lights that ward off ghosts. So does running water, which is apparently feasible in the city but not in the suburbs. (Not sure why.) The government has also imposed a curfew that the populace is careful to keep for their own personal safety. Except for the agents. Their work begins at night.

 

How do you fight ghosts? Agents have iron rapiers tipped with silver. They carry iron filings and iron chains to protect themselves. Yes, it’s a weight to carry around. Iron is mainly preventive: it is installed on window bars, door frames, even in furniture. Silver seems to be more effective, although more expensive. The cheapest substance to drive away ghosts is salt, presumably sodium chloride, but it is only effective on weaker Type Ones and is essentially a deterrent. As an offensive weapon, magnesium flares (canisters containing magnesium, iron, and salt) can be ignited. Why those metals? Would others work? Gold doesn’t. I infer that neither does copper, since I expect that electrical devices are wired with copper, and those don’t seem to stop ghosts.

 

If ghosts absorb energy, why do they only do so at night? Is there too much energy from photons in the day? Could large bursts of energy harm a ghost? Perhaps that’s why magnesium flares are effective: burning magnesium is a very exothermic reaction, not to mention that it also releases lots of photons of blinding white light. What would electricity do to a ghost? Interestingly, there might be some issues. To give you a sense of Stroud’s writing, here’s Lucy in the middle of a mission in the hallway of a haunted house:

 

“Doorways opened on either side: gaping and choked in darkness. All of which could have been nicely illuminated if we’d turned on the lights, of course. And there was a switch on the wall, right there. But we didn’t attempt to use it. You see, a second rule you learn is this: electricity interferes. It dulls the senses and makes you weak and stupid. It’s much better to watch and listen in the dark. It’s good to have that fear.”
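
Back to the flares for a moment: how exothermic is “very exothermic”? Here’s my own quick back-of-envelope with standard textbook values – nothing from Stroud’s book, just a rough sense of scale:

```python
# Rough thermochemistry for a magnesium flare (standard textbook values):
# 2 Mg(s) + O2(g) -> 2 MgO(s); the standard enthalpy of formation of MgO is
# about -601.6 kJ/mol, so each mole of Mg burned releases roughly 600 kJ.
heat_per_mol_Mg = 601.6      # kJ released per mole of Mg burned
molar_mass_Mg = 24.3         # g/mol

heat_per_gram = heat_per_mol_Mg / molar_mass_Mg
print(f"~{heat_per_gram:.0f} kJ released per gram of magnesium")
# ~25 kJ/g, so even a small canister dumps a serious burst of heat and light at a ghost.
```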

 

More questions than answers. I hope that subsequent books uncover more about the theory of apparitions and the chemistry of what works and doesn’t. If not, I expect to still enjoy Stroud’s writing. I will be getting the next book on my next visit to the library.

 

P.S. My most recent Ghost post was almost a year ago!

Saturday, September 16, 2023

Tomorrow's Science Today

Human brains have evolved to predict the future. We take into account what happened yesterday, consider what’s going on today, and make plans for tomorrow. But our predictive models are only based on what we know, and to some extent what we can imagine. Lurking outside our ken is a host of unknown unknowns. Not that this stops our imaginative fancies. I fully admit to enjoying speculating about the future, both an art and a science.

 


I’ve been reading The Skeptics’ Guide to the Future by the Novellas (Steven, Bob & Jay), published last year and subtitled “What Yesterday’s Science and Science Fiction Tell Us About the World of Tomorrow”. The book is divided into five parts: (1) a brief introduction to futurism, (2) near futures based on present technology, (3) more speculative further future technologies, (4) outer space travel, and (5) sci-fi tech and its potential reality or lack thereof. In today’s post I’ll briefly discuss (2) and (3).

 

In terms of peering ahead at the coming decades, I agreed with many of the authors’ speculations. We should expect to see some of their proposed advances in genetic manipulation, stem-cell technology, robotics, wearable tech, and energy. Before reading this book, I was personally skeptical about the future of brain-machine interfaces, but the authors make a good case for advances in one direction: brain-to-device. I remain skeptical about the opposite direction: device-to-brain. While they think general A.I. is within reach, I disagree – probably because I think general A.I. and device-to-brain advances go together, and this is a truly “hard” problem.

 

I particularly enjoyed the chapter on “Two-Dimensional Materials and the Stuff the Future Will Be Made Of”. It begins with a historical sweep from stone to concrete to glass to iron and steel. I’d forgotten how important the Hall-Héroult process was for producing aluminium – what a wonderfully versatile metal! It’s so ubiquitous that I forget how remarkable it is. The authors then discuss composites, nanostructured materials, metamaterials, and of course graphene. Reading this chapter made me want to completely overhaul my G-Chem course. I could make it so much more interesting than it is now – if I can just take the step to ditch the textbook and not shy away from the mounds of extra prep work it will take me to do a total redesign. G-Chem 1 could be about molecular architecture, and G-Chem 2 could be about powering life on our planet and beyond!

 

The two-hundred-year look-ahead at speculative future technologies includes taming fusion energy, mature nanotechnology (in materials science, manufacturing, and medicine), synthetic life, room-temperature superconductors, and space elevators. Given my research interests in the origin of life, I was familiar with the current science on synthetic life (still in its infancy). The authors ask the question “why would we make artificial life when we could evolve or genetically engineer conventional life?” Their suggested answer: “to have a greater level of control and more open-ended possibilities… competing technologies...” and they speculate that marrying this with future narrow A.I. might be an expected route for efficient and enhanced design. I think room-temperature superconductors would be huge, but getting there will be challenging if it’s even possible – especially since it would likely require a bunch of rare and expensive metals to scale up. We’ll see what the future holds!

Sunday, September 10, 2023

Ant-Man, Biochemist

If you asked me which superhero’s powers I would like to have this week, I would choose Ant-Man. I’d never considered choosing Ant-Man before. He seemed a bit dorky and clueless in the Marvel Cinematic Universe. (He might be more interesting in the comic books, but I haven’t read them.) Why Ant-Man? I continue to spend lots of time prepping for Biochemistry class. Most recently I’ve been working on a lecture that encompasses protein folding and denaturation.

 

It’s challenging to study protein folding experimentally. Levinthal’s paradox argues that folding cannot be a random sampling of conformational space and must be guided in some way. We do know that proteins begin the folding process even as the polypeptide is being synthesized on the ribosome, and that molecular chaperones play a role in ensuring the protein folds correctly so that it can perform its functions. But capturing the dynamics and the details is very difficult. If only we could, like Ant-Man, shrink ourselves to the size of molecules, we could directly examine each step of the process as it happens in living cells. Without Ant-Man powers, we have to resort to killing the cell as we tease it apart to figure out what’s going on inside.
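
To put some numbers on Levinthal’s argument – this is the standard illustrative back-of-envelope, not a calculation for any real protein:

```python
# Back-of-envelope version of Levinthal's paradox (illustrative numbers only):
# assume a 100-residue protein with ~3 accessible conformations per residue,
# sampled at an optimistic 10^13 conformations per second.
n_residues = 100
conformations = 3 ** n_residues            # ~5 x 10^47 possible chain conformations
sampling_rate = 1e13                       # conformations sampled per second
seconds_per_year = 3.15e7

years_to_search = conformations / sampling_rate / seconds_per_year
print(f"{conformations:.1e} conformations, ~{years_to_search:.1e} years to search them all")
# ~10^27 years versus a ~10^10-year-old universe: folding cannot be a blind search.
```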

 

A complementary approach is to use computational methods. In preparation for that lecture, I’ve been perusing the primary literature for examples to use in class. Using state-of-the-art molecular dynamics simulations, we can now watch small proteins fold on timescales approaching a millisecond. These simulations don’t include the chaperones or the ribosome; rather, they rely solely on the thermodynamics and kinetics of the peptide in water – by which I mean a model of the peptide in a model water box. As a computational chemist with some knowledge of this field (although it’s not my specific area of expertise), I can say that the models are well-parameterized and do a decent job. We scientists have learned a lot from such model simulations, and they keep getting better.
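
For a flavor of what such a setup looks like, here’s a minimal sketch using the open-source OpenMM library. The starting structure “peptide.pdb” is a placeholder, the run length is tiny, and the published long-timescale folding studies used far more specialized hardware and codes – this is just the peptide-in-a-water-box idea in miniature:

```python
# A minimal peptide-in-a-water-box setup with OpenMM (a sketch, not the actual
# protocol from any published folding study). "peptide.pdb" is a placeholder.
from openmm import LangevinMiddleIntegrator, unit
from openmm.app import (PDBFile, ForceField, Modeller, Simulation,
                        DCDReporter, PME, HBonds)

pdb = PDBFile("peptide.pdb")                                   # hypothetical starting structure
ff = ForceField("amber14-all.xml", "amber14/tip3pfb.xml")      # protein + water models

modeller = Modeller(pdb.topology, pdb.positions)
modeller.addSolvent(ff, padding=1.0 * unit.nanometer)          # surround the peptide with water

system = ff.createSystem(modeller.topology, nonbondedMethod=PME,
                         nonbondedCutoff=1.0 * unit.nanometer, constraints=HBonds)
integrator = LangevinMiddleIntegrator(300 * unit.kelvin,       # temperature
                                      1 / unit.picosecond,     # friction coefficient
                                      0.002 * unit.picoseconds)  # 2 fs time step

sim = Simulation(modeller.topology, system, integrator)
sim.context.setPositions(modeller.positions)
sim.minimizeEnergy()
sim.reporters.append(DCDReporter("trajectory.dcd", 1000))      # save frames to watch later
sim.step(500_000)   # ~1 ns here; real folding runs are many orders of magnitude longer
```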

 

When I watch a molecular dynamics simulation, it’s a bit like being a molecular-sized Ant-Man – even more so if I immerse myself in a virtual reality setup. I can track what individual atoms are doing, I can see larger-scale globular movements, and I can observe how the protein folds and unfolds dynamically. Maybe I already have Ant-Man-like powers, and with faster computer processors, faster algorithms, and more accurate models, those powers would only increase. But I would need to guzzle much more energy than Ant-Man would, and I likely wouldn’t have the same visceral feeling that Ant-Man would have if he went into a cell and observed molecules up close and personal. In computer simulations, atoms look like brightly colored balls of different sizes, and it might feel like I’m in a playground. In the actual cell, it might be much more sinister and scary.

 

I don’t even know what atoms would actually look like. What is an electron cloud? All our imaginings of atoms and molecules are models. We’ve never seen an atom with the naked eye, although we have seen images from a scanning tunneling microscope translated onto a screen. In my quantum chemistry class, I try to help my students “see” electron clouds (orbitals) through mathematical equations. It’s a tall order. If we could be Ant-Man and the Wasp making our way through the quantum world, maybe we’d be truly enlightened. So far I have been unimpressed by the MCU’s rendition of the quantum world. But I haven’t seen the third installment, Quantumania, yet. The DVD recently came to my local library, so I will be watching it soon, and if I find anything noteworthy it might be a future blog post.

 

P.S. I did use Ant-Man in earlier blog posts about resizing chemistry!

Saturday, September 9, 2023

The Devil Never Sleeps

I live and work in an area prone to earthquakes. When will the big one hit? It’s not a matter of if but when. I also think there will be another global pandemic within the next decade. And with global warming, there will be more problems with extreme heat and flooding. I’m not a pessimist, but because I study complex systems, I’m convinced that disasters (from the human perspective) are inevitable. The Devil Never Sleeps. That’s the title of Juliette Kayyem’s book, subtitled “Learning to Live in an Age of Disasters”. I’m in full agreement with her premise.

 


This is not a theoretical book. Kayyem draws from her experience in crisis management, and her main points are peppered with practical examples. She begins with a problem: the paradox of preparation, or “how successful preventative measures can intuitively seem like a waste of time.” If nothing happens, or if there was a “near miss” with minimal consequences, it seems that things are okay and that the doomsayers were just cranks. Y2K is a good example. If not for all the behind-the-scenes work, it could have been a disaster, but instead there were only minor failures.

 

As our way of life becomes increasingly interconnected, as we become more efficient and cut out redundancy, as our networks become more dense, we become more enmeshed in a complex system. Such a system can provide some resiliency against the buffeting of a changing environment, but only for a while. Eventually every system experiences failure – it adapts or is destroyed, and a new system arises. I am a complex system. But with age, I will experience bodily system failure. I will die. It’s not a matter of if but when. And what exactly is system failure? The devil is in the interconnected details, which cannot be easily isolated from each other.

 

Because the devil never sleeps, Kayyem advocates the goal of minimizing consequences. Her mantra is that we have to try our hardest to make things less bad. How? You must have contingency plans, and they must be practiced. Constant vigilance! Failure modes are numerous, and relying on the last line of defense is a trap. Avoid stupid deaths. Pay attention to the near misses. My mantra used to be that good enough is good enough. Sometimes it simply isn’t. Kayyem’s book made me think about where I need to do a pre-mortem and start making preparations, even small ones, that could be widely applicable.

 

I’ve thought about how to pivot (in my day job of teaching) if we’re forced to shelter in place again, be it another pandemic or some other natural disaster – likely a flood or an earthquake in my area. But I’ve assumed that internet connections will still be up and running. A massive power outage or a serious cybersecurity breach that cripples the network is a very real possibility. I’m less prepared for those, but now I’m thinking about them. Have I built in redundancies and backups? Minimally at best. Or I could have an accident that incapacitates me in some way, or fall very ill. Ugh. I don’t even want to think about it.

 

I’d rather think about how minimal living systems arise and die, evolutionarily leaving behind a set of instructions for how to be born again. What is life if not a system that attempts to persist? The living organism tries to stave off the forces that threaten to starve it or eat it. Food is just high-energy molecules with weaker bonds that can be transduced into energy to stay alive, if not thrive. But disastrous death is always just around the corner. Humans today who live in “comfortable” situations (and perhaps their pets) may be the few organisms that aren’t watching their backs 24-7. Every other creature, from the bacterium to the blue whale, has to get enough food and watch its back. The biochemistry of life has evolved to make things less bad should there be a problem. And that might be a fresh perspective for me in my research. The devil never sleeps. Biochemistry needs to keep it at bay.

Monday, September 4, 2023

Information Empires

I should have read Tim Wu’s The Master Switch thirteen years ago. If I had, I would be less puzzled about the strange dance that has ensued this century involving mass media, the telecom industry, movie studios, and internet tech behemoths. I felt the same way about reading Spillover after COVID-19 hit. We should not have been surprised, but we ignored the signs of increasing zoonotic activity.

 


Wu keeps the reader engaged by weaving stories of empire-building titans while describing technological innovations, first in distance communication and then in mass communication. He gives inventors their due credit, but the narratives focus on single-minded individuals with the financing and clout to monopolize an industry. The book begins with Alexander Bell of telephony fame, but the main players who contribute to the “rise and fall of information empires” (the subtitle of The Master Switch) are folks like Theodore Vail, David Sarnoff, and Ed Whitacre. I had heard of Sarnoff, and about the breakup of AT&T/Bell in 1984. But as told by Wu, the story has twists and turns with an eye towards the long game. Government gets involved in both regulation and deregulation, but it’s not always clear when that’s a good thing.

 

Today’s post is about two things. The first is Wu’s very short chapter 15, titled “Esperanto for Machines”. It’s only seven pages long, but several things jumped out at me. Vint Cerf and Robert Kahn, in creating TCP/IP, were simply trying to come up with an ad hoc fix to get three networks to talk to each other. They were constrained by the fact that “the wires were owned by AT&T and computing was a patchwork of fiefdoms centered on the gigantic mainframe computers, each with idiosyncratic protocols and systems.” By creating a “standard for the size and flow rate of data packets… [they] allow the Internet to run on any infrastructure, and carry any application, its packets traveling any type of wire or radio broadcast band… independent of the physical infrastructure.”

 

Wu emphasizes the tension between centralized and decentralized systems, making an analogy between the founding of the internet and the U.S. “The Founding Fathers had no choice but to deal with the fact of individual states already too powerful and mature to give up most of their authority to a central [federal] government.” Central planning realizes a number of efficiencies and can be less wasteful than competition. But because no central planner has perfect information, not to mention all the biases and groupthink that emerge in such systems, there isn’t one best way of doing things and hegemonic approaches stifle and stunt growth and adaptation in the long run. The result is death. Slow and maybe by a million cuts, but death nevertheless.

 

To navigate that tension, Wu quotes a dictum by Jon Postel regarding TCP/IP: “Be conservative in what you do. Be liberal in what you accept from others.” This is giving me much to chew on, and brings me to the second part of my post. It made me stop and think about the systems I’m embedded in (my department, my university, my chosen profession) and about the systems I research (proto-metabolism at the chemical origins of life). So let me digress briefly on these things.
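
Before that digression, here’s how I picture Postel’s dictum in code – a toy Python sketch of my own (the header example is hypothetical, nothing from Wu’s book): accept input liberally, emit output conservatively.

```python
# A toy illustration of Postel's dictum (hypothetical example, not from Wu's book):
# be liberal in what you accept, conservative in what you send.
def parse_header(raw: str) -> tuple[str, str]:
    """Liberal: tolerate messy capitalization and stray whitespace on input."""
    name, _, value = raw.partition(":")
    return name.strip().lower(), value.strip()

def emit_header(name: str, value: str) -> str:
    """Conservative: always send one tidy, canonical form."""
    canonical = "-".join(part.capitalize() for part in name.split("-"))
    return f"{canonical}: {value}"

name, value = parse_header("  content-TYPE :  text/html  ")   # accepted without complaint
print(emit_header(name, value))                                # -> "Content-Type: text/html"
```

The same posture, applied to packets rather than headers, is part of what let TCP/IP spread across such a heterogeneous patchwork of networks.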

 

Am I conservative in what I do as a professor? Largely so. They don’t call it the ivory tower for nothing. As a teacher and a researcher, I do try new things on a regular basis, but I do these incrementally – and I’ve found myself sometimes returning to early tried-and-true strategies, appropriately tweaked with the benefit of hindsight. As to my department, our curriculum is very conservative. We’ve made changes over the years, but in my opinion these have been sustaining innovations rather than disruptive innovations, and they were borne mostly out of necessity. Am I liberal in what I accept from others? I don’t know how to answer this. I do believe that there are many ways to teach students effectively and that there’s no “best” pedagogy or curriculum. I think educators should teach to their strengths and I think I’m an oddball among my colleagues. No one bothers me and I don’t bother them. As an individual professor I strongly defend being able to teach in my idiosyncratic fashion, but having served as an administrator in several capacities, I also understand the desire for some level of standardization and how that helps mitigate certain systemic issues that would otherwise crop up.   

 

Research-wise, I’m primarily trained in electronic structure theory, although I have some experience in other areas of computational chemistry, and I can converse on a wide range of subjects because I did my PhD in a large research group with diverse interests that employed a slew of different computational methods. My bread-and-butter is still electronic structure theory, and my research students today do things similar to what my students did two decades ago (albeit on larger and more interesting systems). All this seems rather conservative on my part. While my research students are primarily chemistry or biochemistry majors, I’ve happily (or is it liberally?) accepted students from other departments: math, computer science, biology. But I don’t think this captures Postel’s dictum.

 

Then I got to thinking about the origin of life. Prebiotic chemistry is messy. It perhaps starts out decentralized, with a slew of intermediates and products. But once the first (albeit primitive) autocatalytic cycles kick in, certain molecules will be selected over others to be amplified. Competition ensues and the losers die out, either being “eaten” as food or decomposed into yet other molecules. But these centralized, ascendant protometabolic systems, which incorporate food and dissipate waste molecules, also alter their environment. This provides the opportunity for other, more liberal protometabolic systems to take advantage. One’s waste becomes another’s food, and competition comes again to the fore. Rinse. Repeat. The cycle of decentralization and centralization that Wu describes for information empires seems to apply well to both life and proto-life. As a multicellular organism, I might be an example of a highly centralized system that’s currently ascendant. But remember what happened to the dinosaurs. Meanwhile bacteria, perhaps representing decentralization, are still here and regularly trying to cannibalize us. And what is life, if not an organism that stores and utilizes information and relies on communication to survive, if not thrive?

Saturday, September 2, 2023

Glut: Bacon Approach

I’m reading Glut by Alex Wright. It’s not about food. Unless information is food, which might be true for an A.I. sopping up gobs of data. The book, subtitled “Mastering Information Through the Ages”, takes a long-view sweep from hunter-gatherer notions of classification through cuneiform, the alphabet, scroll-to-book transition, scientific taxonomies, encyclopedias, and of course, the Internet. Today’s post targets the book’s midsection with anecdotes of Bacon.

 


That’s Francis Bacon – philosopher and empiricist who served as Lord Chancellor of England in the early seventeenth century. According to Wright, Bacon had trained in the scholastic art of memorization as had many monastics and academicians of the times. But Bacon rejected this “tendency to celebrate… elaborate memory feats [and] intellectual gymnasticism”, finding them ostentatious. Bacon carved a new path with his famous work Novum Organum which “proposed nothing less than restructuring the enterprise of scholarship… [laying] the philosophical foundations for a process that would later become known as the scientific method, a radical departure from the scholasticism that sought pathways to truth through esoteric practices and belief in disembodied ideals.”

 

While Bacon wouldn’t have labeled himself as a scientist, he championed the direct observation of natural phenomena. He would have been comfortable being called a magician. There is speculation he was an alchemist. Bacon even wrote that “the aim of magic is to recall natural philosophy from the vanity of speculations to the importance of experiments.” Wright writes that for Bacon, “the duty of a philosopher… was not to reject magical traditions out of hand but to delve into them with fresh eyes and a critical perspective.”

 

Bacon is credited with the method of inductive reasoning and is considered a father or perhaps grandfather of what we today call the scientific method. He championed “research as a collaborative process” but was also an elitist, “believing that the great uneducated masses posed a severe threat to the integrity of scholarship”. There is a post-modern flavor to Bacon’s “idols” – barriers to understanding because of human and societal limitations. As individuals, we have implicit biases. Our perceptual umwelt is limited and colors our conclusions. Meaning is constructed socially and defining words is a slippery business. Mythology and ideological beliefs get in the way.

 

It is hard work to be a master of information. Having just read an entire book about the history of the encyclopedia, I found it interesting to learn that the point of Bacon’s empiricism was to “formulate a new philosophical framework for classifying all of human knowledge. He postulated that all human intellectual pursuits revolve around three essential faculties: memory, reason, and imagination (or, in more familiar terms: history, philosophy, and poetry).” In subsequent chapters of Wright’s book there are figures showing Diderot’s Encyclopedie and Jefferson’s library catalog. Both men utilized Bacon’s categories (rather than using an alphabetical list). I also learned that Jefferson sold his personal library, one of the largest in the world at the time, to restart the Library of Congress after the original library was destroyed by British forces in 1814.

 

There’s an argument that information is not the same as knowledge is not the same as wisdom, and there’s an increase in quality along that continuum. However, defining each of these terms is a slippery business. There are multiple ways to quantitatively define information, but each of these is to some extent context-dependent and likely observer-dependent. What does it mean to master information? Can we define knowledge in this way, as the mastery of information? If I say that someone is knowledgeable in an area, am I saying that this person demonstrates some level of mastery or expertise? This likely implies not just having a quantity of information, but also having organized it in some way to be useful, although usefulness is also in the eye of the beholder. I’d say I’m knowledgeable in the field of chemistry. But do I demonstrate wisdom in chemistry? I don’t know. Does wisdom even apply? Or is it I that must apply wisdom?

 

Taxonomy and classification loom large in Wright’s book. Implied in his long-sweep history is the idea that organizing information into “useful” categories plays a key role in the technological advances of humankind. We’re all a little Homo habilis: tool-makers. As a teacher, organizing information is what I do for my students, so they can get up to speed quickly and efficiently. And chemistry curricula at most colleges are organized hierarchically – perhaps more so than in any of the other sciences (even physics). It’s challenging to incorporate the glut of new chemical information pouring in – we do it through special-topic electives populated by seniors. I try to sneak in bits of new material as early as first-year General Chemistry, but there’s not much room to maneuver within the strictures – which I’d argue are self-imposed, a confluence of history, practicality, feeling stuck in a system, and sheer laziness.

 

In the closing chapters, Wright argues that to a large extent the way we organize and acquire information on the Internet is molded on the shoulders of giants. We certainly have lots of information at our fingertips – and yes hypertext is cool (but not novel, even for novels) – but by-and-large it has settled into a command-and-control structure. The Wild Days seem to be a forgotten dream as tech behemoths consolidate their early gains. This is even true for Wikipedia. Throughout his book Wright drums at the tension between top-down hierarchies and bottom-up networks, and the constant push-and-pull between the two. As someone who researches the chemistry of the origin of life, I think Wright is on to something. I suspect that this dynamic interplay is an inherent part of life’s structural underpinnings. The glut comes under control; but it cannot be contained and new life eventually finds a way. Rinse. Repeat.

Friday, September 1, 2023

First Week: Fall 2023 Edition

September 1: First day for Hogwarts students; end of the first week for me. It’s been a very busy week with lots of meetings in addition to my three classes. This semester I’m teaching General Chemistry, Physical Chemistry and Biochemistry. All three are lecture classes (I’m not teaching the associated labs).

 

I have the Honors preceptorial this semester: twenty new first-year students who are my academic advisees and are also enrolled in my G-Chem 1 course. I think I finally learned all their names today! I worked hard at it by saying their names multiple times in discussion. I’m also having each of them visit me during my drop-in (office) hours so they know where my office is, which has also helped me memorize their names. Half of them came this week; I expect to see the other half next week. I think I’ve helped all of them figure out any changes to their course schedules.

 

In terms of the G-Chem course, I made some changes to my topic lineup. I’ve moved the two lectures on Nuclear Chemistry to the very end (typically I get to it in Week 2) and moved Intermolecular Forces up by several lectures. I’m also making study guides for each class session, since the students seemed to benefit from them last semester and I had a lot of positive feedback. I’ve also decided to give four midterm exams and drop one exam score, rather than the three exams I’d been giving for about a decade. We’ll see how it all works out, but hopefully it puts the students a little more at ease; it also moves the first exam earlier in the semester so they get a sense of what a college-level chemistry exam is like!

 

P-Chem 1 was a ton of work the last time I taught it because I ditched the textbooks and completely converted to worksheets. This time around is much less work, but I have made several changes to the topic lineup. I shortened some of the math surrounding the rigid rotor and the hydrogen atom. I moved centrifugal distortion from rovibrational spectroscopy to perturbation theory. I will be adding some computational components (IR spectra, MO calculations of various types), and I plan to expand Valence Bond theory, hoping to culminate in the strange case of the dioxygen molecule. I have a particularly small class of just 10 students this semester. (Our number of majors has been going down.) I’ve learned all their names! But I already knew half of them ahead of time.

 

Biochem is the most work since it’s a completely new prep. I’ve blogged about how much time it is taking, and I hope to get more efficient. I have the first four weeks of class prepped in detail, so I’m in relatively good shape at the moment. I haven’t learned all the students’ names yet. It takes longer because the class meets twice a week (TuTh) rather than thrice (MWF), which is the schedule I usually teach. I’ve had a number of the students in G-Chem, and that has helped me put names to faces. I hope to have all the names down by Week 3. One challenge for me is that Biochem has been taking up much of my mental space. I must remember not to under-prepare for my G-Chem and P-Chem classes so that I can still give those students my very best.

 

I had practically no research activity in the month of August, except for the national conference I attended. Will I get to it in September? I hope so. I will still be helping my research students move their own projects along, but I’m not sure if I’ll be able to get to my own stuff. I really want to get a paper submitted by late October but I don’t know if I’ll make my self-imposed deadline. I’m on a grant review panel (online) next week so that’s an additional thing on my plate.

 

But it’s Friday afternoon and the tiredness is kicking in, so I’m not going to spend time worrying about it and instead enjoy my long (Labor Day in the U.S.) weekend!