Friday, February 6, 2026

Second Week: Spring 2026 Edition

I forgot to write last Fall’s First Week edition (so here’s the one from 2024), probably because I was super-busy teaching two sections of G-Chem 1 and one section of Biochem 1. I’m not teaching any Honors section this academic year, so last semester I had around 100 students across my three courses. We’re also using a new textbook and online homework system for G-Chem, and it was just my second time teaching Biochem. All in all, more prep work than usual. This semester, I’m teaching one section of G-Chem 2 and one section of P-Chem 2 with around 60 students across both courses. I’m now used to the G-Chem online homework system, and I like the new textbook. Thus, the semester feels lighter from a workload standpoint. Hurrah!

 

But I was still very busy last week because of the possibility of a government shutdown here in the U.S. (which mercifully turned out to be short). I decided to write up the annual report for my current federal research grant and submit it about a month earlier than usual, in case the government shut down. There was some confusion about whether an administrative request I sent via webform went through because the website was apparently having problems, but eventually some back-and-forth emails confirmed that they received my request and my report. Hence today’s post is happening at the end of Week 2 rather than Week 1 of the semester.

 

After the long winter break, I’ve been enjoying interacting with students in the classroom, in my office, or in the hallways or atrium of my building. I’ve been making an extra effort not to be overly quick or efficient in my interactions, and hopefully students feel I’m not rushing them when they have questions. We’ll see how that shakes out the rest of the semester. I feel I have more energy even though my first class starts at 8am (instead of 9am last semester), maybe because of the lighter load and maybe because I just feel freer. The first week of class I was still struggling with timing in my G-Chem class because I had rearranged the material to match the new textbook. This week I feel I did a much better job without rushing in the last 5-10 minutes of class. I’m not making many changes to the first eight weeks of my P-Chem class so that has been going smoothly timing-wise. (I will be making some changes to the latter half.)

 

I also feel I have time for research this semester! Last semester, I felt that I hardly made any progress on my own projects. I still helped my research students make progress in their projects, but didn’t have much time for my own. This semester, however, I’ve been getting in 5-10 hours of research or writing (working on a paper) per week, which has been very nice! One of my summer 2025 research students who continued working with me last semester is also a very capable writer so I invited her to write the first draft of the research paper featuring her work. The carrot is she gets to be first author! I have revised much of the text, but kept the overall flow intact. She also made all the Figures and Tables. (For many other papers, I make the Figures since I feel they look nicer and more consistent in size/shape, but this student is exceptional and detail-oriented.)

 

I’m not taking on any new service activities because over winter break, I found out that my sabbatical application for AY26-27 was approved! This means fewer committee meetings and more time this semester, probably contributing to my feeling freer! I’ve also decided to try presenting my work digitally rather than in-person at the upcoming American Chemical Society national conference, so I don’t need to block off travel time. More time freed up! I am reading a little more to try to fine-tune a plan for my upcoming sabbatical. There are so many interesting things to explore. Okay, that’s the end of Week 2, Spring 2026 edition. It may be my last such post as I’ve been reducing my blogging activities.


Tuesday, January 27, 2026

Words and Pictures

I’d read several papers by Richard Mayer on the dual-coding model: learners have two channels for processing incoming information, verbal and visual. Over time, this was combined with insights from cognitive load theory and a growing understanding of how the brain and memory work. Mayer now calls it the cognitive theory of multimedia learning (CTML), and I recently read a review that goes through the history of how they got there and where to go next. The citation is Educational Psychology Review (2024) 36:8, DOI: 10.1007/s10648-023-09842-1. I very much enjoyed the personal insights the author shared about his research journey. Each heading is listed below, followed by my thoughts.

 

1. Theory Building Depends on Intellectual Curiosity. Mayer became very curious about how to improve teaching for “transfer” – being able to apply something you’ve learned usefully to a new situation. He did this by first narrowing the issue to the effects of multimedia. I am curious about a lot of things, but I haven’t had the discipline to really narrow my focus, and as a result I remain a dilettante on a broad range of topics. Consequently, I haven’t made significant theoretical contributions in my field even though I’ve learned a number of interesting things about the many interesting systems I’ve studied. It seems I scratch the surface, pick the low-hanging fruit, and move on. Maybe I need to change my approach.

 

2. Theory Building is Grounded in Old Ideas. Mayer discusses his reading of classic works in his field. I find reading the historical underpinnings of my research and teaching very enjoyable from a learning point of view. I hadn’t thought much about building new theory off the old ideas in a systematic way. Something for me to consider.

 

3. Theory Building is Not a Straight, Planned-Out Path. Mayer relates how he usefully breaks down interesting questions into “shorter 2- or 3-year plans targeted on specific research questions”. This led him to the multimedia principle: “people learn better from words and pictures than from words alone”. I’ve known about this; having lots of pictures is common in the natural sciences. I’ve also learned that the pictures I project on the screen should not be cluttered with text as I verbalize my way through an explanation (Mayer’s coherence principle). After doing so, I then write things on the board for students to have good notes, at least in G-Chem. (I’m worse at it in upper division classes.) Mayer also writes about pursuing fruitful paths; I also do this research-wise, but I likely move away too quickly from anything that looks like it would take more work. I’m lazy.

 

4 & 5. Theory Building is an Engineering Problem [and] an Iterative Process Involving the Persistent Interplay Between Research and Theory. By this Mayer means that it requires tinkering to make something work better, and going through a development cycle where theory leads to research experiments, the results of which feed back into theory. Mayer discusses fostering generative processing: “motivating the learner to actively engage with the material”. This is a weak area for me. I’ve relied on my enthusiasm for my subject area (which students recognize and comment positively on), but this is likely not enough. My activities mostly require the students to do analysis; few of them ask the students to be generative. This needs more work on my part.

 

6. Theory Building Depends on Persistence in Collecting New Research Evidence. Sounds obvious, but this requires hard work which is not my strong suit.

 

7. Theory Building is a Team Activity. The days of the lone theorist making substantial novel discoveries are long gone. A good and fruitful collaboration requires work to sustain it, and since I’ve already admitted I’m lazy, my collaborations tend to be short-term and specific, and not dedicated to theory building in particular. Maybe I need to change that.

 

In the middle section of his article, Mayer discusses his “inching towards a visual representation of the theory”. This is very appropriate given what he studies. He starts with simple flowcharts that slowly build up to what has become a compact and useful picture. Here’s Figure 8 from this article. You’ll have to read his article to get all the details, but once you know what each of the boxes and arrows represents, it summarizes the theory in a single uncluttered visual representation.

 


There’s also a useful Table with his fifteen principles of multimedia instructional design along with their effect sizes from experiments. I already follow some of these, given my prior immersion into cognitive load theory. Here are some that I hadn’t thought about much or haven’t incorporated yet.

·      Presenting material in user-paced segments rather than a continuous unit. I don’t do this well and I need to improve how I cue different segments in class.

·      Sometimes I assume students know definitions and terminology that they don’t and/or present them in an order that confuses them.

·      Apparently in multimedia, using a conversational style works better than a formal style. I don’t know where I am on this spectrum and should reflect more on this.

·      If you’re onscreen as an instructor, high embodiment helps. I take this to mean that being a disembodied voice talking through slides is inferior. In our pandemic all-on-Zoom year, my camera showed me writing on a large white board, and I would sometimes step out of frame so that more of the board would be visible. At some point we’ll have another pandemic and I’ll have to think about this.

·      Generative learning activities help. I mentioned this above; I should design more of these.

While I don’t use 3D immersive virtual reality, apparently studies show that students don’t necessarily learn better with it compared to a corresponding 2D screen presentation; the effect size was small.

 

I have a sabbatical coming up that will allow me to think more deeply about some of these issues. A third of my sabbatical proposal had to do with pedagogy, but mostly related to adapting machine learning and data science. And there were a whole bunch of other things in my proposal which are dilettantish, so maybe what I should be considering is how to narrow what I’d like to accomplish into specific questions and design specific activities ahead of time instead of my ad hoc muddle-through approach. But meanwhile I should look over my upcoming class materials, think about the words and pictures, and consider whether I can do a better job optimizing student learning.


Sunday, January 25, 2026

Are you You?

Getting a new phone this month meant moving from thumb authentication to face authentication. It seems to work pretty seamlessly when I pick up my phone – the devices are getting smarter. Also, this month I’ve noticed Gmail regularly asking for additional Captcha authentication to prove I’m not a bot. Is agentic A.I. causing more issues? I don’t know. But it made me think about how verifying my identity has changed over my lifetime.

 

Everything was done manually when I was young. No computers or internet. I don’t remember how I was verified when I first entered primary school. How did the school and teachers know I was not an impostor? In my most recent videochat with my mother (a former schoolteacher, long retired), I asked her, and she told me about the systems they would use. The first major verification I vaguely recall was the primary school national exams. Apparently, the government sent us letters with an entry slip and a unique number, which allowed me to take my exams; I think I had to carefully write the unique number on all my exam papers. I vaguely remember teachers drilling us to do this. And while I don’t recall exactly, I think our own teachers were also the ones who verified us because we’d been in their classes for a whole year.

 

After the age of twelve, when I had secondary school examinations, we all had to bring in our identity cards and place them on the corner of our desks for each exam. Each of us had a specific desk with our name on it, and the invigilators, who were not our teachers, walked around (clipboard in hand) to verify each of us via the identity cards. I still experience this process when I’m at the airport, visiting the bank in person, checking in to a hotel, picking up my badge at a conference, or any other situation where I need to verify who I am because the verifiers don’t know me and wouldn’t recognize my face. If you lived in a small village and never had to leave, everyone knew everyone and verification was easy. But in an era of urbanization, global travel, and not knowing your neighbors, verification becomes trickier.

 

The age of the internet has made authentication even more challenging. Are you who you say you are? How does the system know? There are logins, passwords, two-factor authentication, additional questions about information unique to you, and now voice and face verification. These are going to get more stringent as A.I. makes it easier to “fake” more characteristics. We’ll be increasingly up the wazoo in verification.

 

“Authenticating” is the title of the second chapter in Brian Christian’s The Most Human Human, which details his experience as a confederate in an annual Turing Test competition. He’s trying to prove he’s the human against an A.I. competitor. The chapter opens with a story about a man with phonagnosia. He cannot recognize anyone’s voice; growing up, he had assumed his own voice was distinctive because everyone else recognized it while he couldn’t recognize anyone else’s. This made phone calls an interesting game of guess-the-identity for him. He couldn’t voice-verify. In another vignette, someone easily breaks into the email account of a public figure simply by selecting “I forgot my password” and then verifying information based on internet searches.

 

The meat of this book chapter, however, is about what might be unique between a human conversationalist and a chatbot. Many of the successful chatbots in the early competitions were able to steer the conversation to avoid tricky out-of-book situations, which is a reasonable strategy when the conversations are timed in a speed-dating-like format. Successful bots typically had a single programmer devoted to developing the bot’s personality so that it would seem like a single coherent individual. The bot felt like a singular You. Today’s A.I. large language models, however, were developed with the opposite philosophy: use weighted statistics from millions of disembodied conversations. Apparently, this is why A.I. translators are weak on long literary novels, which require a singular coherent voice throughout, but do just fine on shorter snippets.

 

When you’re conversing with a chatbot today, you’re conversing with a multitude of voices averaged into a response. Your interlocutor is Legion, for they are many. They contain multitudes. You are no longer talking to an individual but a host of ghosts. I’ve never had an extended conversation with a chatbot (I have better things to do with my time), and my queries have usually been specific and chemistry-related; I dabble in exploring whether chatbots can help my students gain a better understanding of chemistry. So I don’t personally know if a chatbot would ever feel like a friendly human to me; I know some of my students do enjoy their chatbot chats. And it may be that sufficient familiarity and multiple chats provide their own authentication, for better or worse, now that chatbots can access a memory store of their personal conversations with you and draw from it. I suppose this is what Personalization is all about. At some point the chatbot might feel like a You. But that’s because of you.


Tuesday, January 20, 2026

Overly Efficient

I was forced to get a new phone; the old one started to precipitously decline and would randomly restart. I’m a creature of habit and I’m easily bewildered by new technology (shocking for a computational chemist, I know!); I’ve only owned two smartphones in thirteen years (an iPhone5 and a first-generation SE). To minimize having to switch again in the next five years, I decided on an iPhone17. I was dreading making the switch.

 

The Apple store at my local mall was quite busy when I arrived; eventually someone was assigned to help me. I said I was interested in getting a new phone. The employee wanted to know which model, assumed I knew all the pros and cons, and was curt in her responses; I felt rushed through the process. I was nervous about moving data from my old phone to my new one, but was told it would be easy and I should “just follow the prompts”. A new phone was given to me in its box, I tapped my credit card, and I was done. The employee moved on to the next customer. Purchase completed, but anxiety heightened.

 

I stopped by the T-Mobile store for help getting the SIM and info on my old phone transferred to the new one. There were no other customers at the store, and the employee was relaxed and friendly. I said I was feeling anxious about the challenge given that my old phone had a very old OS, which suggested a more arduous process. But the employee helped me through the steps, which turned out to be short and easy. The phones, placed next to each other, did their info-sync dance. He then patiently showed me some basic moves for my new phone (when to use buttons and how to swipe for different options). He empathized with the challenge of switching to a new layout, and answered my very basic questions without judging my ignorance. It was a very positive experience and reduced my anxiety substantially.

 

The epiphany I had after these two contrasting customer experiences is that I want my students to experience the second one when they visit me in office hours. However, more often than is warranted, they get the first one instead. The problem is me; I’m overly efficient, and sometimes the student feels rushed through a process. If the student was feeling anxious about their ignorance, I haven’t allayed the student’s (usually unspoken) concern. I expect a student to have done some background work and have their questions ready, and I answer them efficiently, especially if there is more than one student in my office. I ping-pong amongst the students so they get their answers efficiently while making sure no one has to wait long in between. Students have busy lives, many busier than mine, and I don’t want them to experience long wait times if they visit. If there’s only one student in my office, I don’t have to ping-pong, but I do multitask and work on something else in between their questions, which I suppose is me trying to be efficient with my own time. But from their point of view, the vibe I’m giving out might well be “I’m a busy person, so get on with it.” Overly efficient.

 

In my student evaluations, I get the occasional comment from a student who felt intimidated about asking questions in office hours and felt I was dismissive in my answers. It’s true I expect the student to come into office hours prepared with their questions after working through the material; that’s something they should learn to do as they prepare for the working world. I also have a philosophy of not spoon-feeding; I sometimes respond with a question or ask a student to look at their notes to see the definitions or examples we covered in class. I want the student to understand that they have to put in the time and work to chew over what we’ve covered and not just say “I didn’t understand anything in class” and try to get me to go over it all again a second time. The vibe I’m giving off? Like the Apple employee. Quick, curt, efficient, and expecting students to know what they’re looking for before they come in.

 

Not all students feel this way. Some like the way I do things and say so in the student evaluations. They felt their questions were answered, that I was helpful in office hours, and that I cared about their success in the class. They said I was accessible, even when other students were in my office. But these are likely the students who were not struggling excessively with the material, had a reasonably good background coming into class, and were stronger academically. They liked the depth, my varied explanations, and felt that the way I organized the class set them up for success. In contrast, the students who had negative experiences commented that I go over the basics too quickly, that I’m not good at teaching beginners, and that I assume prior knowledge they don’t have. How I felt at the Apple store likely mirrors how some of them were feeling. So while I have good reasons for the way I structure my class and office hours, my recent customer-service epiphany tells me that I need to make some changes so that the anxious students who feel they are clueless in class feel more welcome to ask questions and feel they were actually helped.

 

My mantra to myself this semester is “don’t be overly efficient”. When you’re old and set in your ways, it’s harder to change. But that’s not an excuse; rather it’s a challenge I need to overcome. I know that I will revert to my efficiency (hopefully not too often) during office hour visits or during my interactions in class, but I need to keep making the effort to help the struggling students feel welcome. They’re the ones that need the most help, and if they’re not coming to my office, that’s a problem and I’m not helping them. I don’t need to be overly efficient with my own time, and maybe my time-log (which was likely excellent to have in my first several years as a professor) has slowly compounded my overly efficient behavior. I’ve been successful partly because I was efficient, and I want students to learn to be efficient in managing their time and learning. But more importantly, they are students, and still learning. I need to make accommodations and at least not give out an “impatient” vibe in my interactions with them. I think I’m being efficient. They think I’m impatient and that they’re not worth my time. I need to work on this because the whole reason I became a professor at a liberal arts college (and not a research-intensive university) is because I want to spend most of my time teaching students.

 

I tell students that office hours (I’ve called them “Drop-in hours” for the last five years) are my favorite time in my workday. It’s when I feel I actually help them individually because it’s harder to do so in the classroom with many students. I tell them I look forward to seeing them, but it’s clear some don’t believe me and my over-efficiency vibe dissuades them. I feel that fewer students come to my office in the age of A.I., where a friendly chatbot is always there to answer their questions as many times as necessary and makes them feel good while doing so. I’m no chatbot; I should be better than a chatbot (I’m certainly more accurate and I actually care about their learning as a human being). My goal this semester is for more students to feel that I am accessible, especially the ones that should be coming to my office to ask questions. Maybe my new phone can be a visible reminder to be more like the helpful T-Mobile employee and not be overly efficient.


Sunday, January 18, 2026

ADOM Update

Nine months later, I’m still playing Ancient Domains of Mystery (ADOM). The character I was playing back when I blogged about ADOM did win, but with plenty of “scum-saving” in between. That means if my character died or did something particularly stupid, I simply restarted at the last save. While that’s not in the spirit of ADOM’s permadeath setup, I felt it was a good way to learn more of the game.

 

After the success of the hurthling archer (level 41), and subsequently losing several other characters in early deaths, my second character to win was a drakeling barbarian (level 35, 96 days in the Drakalor chain). There was also a little scum-saving so I could learn to deal with the strange effects of being cold-blooded. This made the Tower of Eternal Flames quest exciting and nail-biting and the final level was a little harder because my character needed to recover from the cold in the penultimate level. For both these wins, I avoided big boss Fistanarius, used a magic map, a wand of digging, and a wand of destruction on the levers. Both normal endings; I’ve never done an ultra-ending.

 

At this point, I decided I will no longer scum-save even if I have a promising character that makes it to the mid or late game. And many don’t. The next character who almost won, a human necromancer (level 39), died fighting Fistanarius. It was very close; I should have healed, but I thought I could withstand one more blow since Fistanarius was on his last legs. Then there was the gnome assassin (level 37), who got killed by the emperor moloch – I was doing well on the Filk quest up to that point. My third win, and my first with no scum-saving, was a gnome priest (level 39); this is also my highest score at 14,747,264 points. Then there were two deaths (level 27 and level 31 characters) to Katharamandus, the stone dragon. I hadn’t gauged the difficulty because my hurthling archer didn’t have too much trouble completing the Rolf quest. That’s the one and only time I finished what Rolf asked.

 

I had my fourth win this weekend, with a little scum-saving because I had a dwarf chaos knight, which meant a very different game. I had lost several chaos knights in the early game, and when this one made it to the mid-game, I wanted to keep going. I fed the Demented Ratling six artifacts, but had hardly any wild boar encounters and no boar head so I gave up on that quest and decided it would be okay to finish as a most stupid follower of chaos. My character was very powerful (level 40) and I’ve never gotten so many artifacts. I easily beat molochs and balors on the final level. Because the strategy was different, I learned a lot in this last game.

 

At this point I’ve played 60+ characters, always randomly generated by Fate. Interestingly, of the top eight high scores (the finishers and the ones who made it to the mid-game), seven were female including all four finishers. Not sure why; a possibility is that I pay more attention to the Appearance stat and maybe my characters saw less corruption overall. I created my next character on New Year’s Eve, my first on a special day. Fate chose a male gnome wizard; he’s a bit boozy but maybe he’ll survive. And maybe he’ll thrive.


Monday, January 12, 2026

Biochem Round 2

Teaching biochemistry for the second time this past semester was not as time-consuming as the first time two years ago. I spent 3-5 hours per week on class prep and updating the materials, about a third of what I spent on my first run. This was significantly more manageable given I had two larger sections of G-Chem 1 using a new textbook. For Biochem, I did not make large-scale changes to the course. The topical flow was similar and I mainly updated the slides and study guides. I made some changes to the in-class computational activities, exclusively using the Molstar/PDB viewer (and skipping Pymol). I added a protein-folding-prediction exercise given the ubiquity of AlphaFold-like tools. It also has a nice wow-factor!

 

This second pass, I was able to clear up some errors I made and confusion on my part about some of the more complex enzyme regulation involving kinases, in particular FBPases. I streamlined the enzyme kinetics so it would be less heavy math-wise, and I think I did a better job with carbohydrate nomenclature without getting stuck in the weeds. Those are the positives. The negatives are that I likely went faster and had a little more information on my slides when I should likely have done the opposite. I also went into more chemical detail because a third of my class were chemistry or biochemistry majors; in contrast I had less than fifteen percent of them the first time I taught it.

 

My class was fifty percent larger this time around, simply because more students are enrolling in the course as numbers have rebounded post-pandemic. This probably made the largest difference because it meant I could give each student less individual help. This was certainly true during in-class activities where the students work in pairs or small groups and I circulate. The majority of students never came to office hours, which didn’t help matters. My end-of-semester grade distribution was much wider and included some D’s and more C’s, and there was a surprising number of nonsense answers on exams. That being said, many of the students still did well and two-thirds were in the A and B range, unlike the first time around when ninety percent earned A’s and B’s. That first class was unusually small, and I was likely paying lots of attention to the students and their learning. By spending less time on the metacognitive aspects of my own teaching and focusing much of my time on G-Chem, I think I did a poorer job overall.

 

The end-of-semester course evaluations were not surprising. On the Likert scale questions, my ratings went down – as expected for a larger class with more students not doing so well in the course. There were the usual comments about the speed at which we went through the material and the density of the material. A couple of students thought the twice-a-week format (with two longer rather than three shorter classes per week) was exhausting, and I see their point even with my three-minute break mid-class. Students found the study guides the most helpful; again not surprising. The chemistry and biochemistry majors liked my chemical emphasis and details. The non-majors did not like it. One made comparisons to the other sections which had “more MCAT applications” and another felt that while the other sections “skimmed through a lot of topics”, our class “felt like we learned the whole damn book”.

 

I don’t know when I will get to teach the class again. Recent staffing changes in my department might preclude my teaching it again anytime soon. If there is a next time, I would consider not using a standard textbook now that I am more comfortable with the material. One problem with following the textbook somewhat closely (which is a reasonable thing to do when you’re teaching something for the first time or two) is that you can get lost in the details and forget the big picture. A couple of students commented that this is how they felt about my class. Instead of opening with a review of G-Chem concepts and launching into amino acids and proteins, maybe I can start with some big-picture metabolism (not the weeds) before getting into the building-block molecules. There’s a logic to biochemistry and I’d like the students to see it. I thought I was emphasizing this, but many students found the details bewildering, possibly because I had not spent enough time on the big picture or I was too abstract.

 

My self-rating for Round 2 is that I was overall mediocre; I’m not sure I did a better job teaching the second time around even though I was clearly more comfortable with the material. Perhaps that was the problem; I let the curse of knowledge slip in, and spending less time on thinking about the class showed.


Thursday, January 8, 2026

Student A.I. use: Fall 2025

The last two times I taught G-Chem 1, I briefly told students how a generative A.I. such as ChatGPT can be useful and what some of its limitations are. Last semester (Fall 2025), I made no mention of A.I. use in any of my classes until the last week of the term. I surveyed the students asking if they used A.I. in my class, how so, how often, and if they found it helpful. The questions were open-ended and students could answer (or decline to do so) in any way they wished. I prefaced the survey by saying that I had no problem with A.I. use, and that their responses would help me provide guidance to my future classes. I taught two sections of G-Chem 1 and one section of Biochem. My musings on the results are mainly focused on G-Chem because of the larger class sizes.

 

In G-Chem, 12% of students said they did not use any A.I.; the other 88% did. ChatGPT was by far the main source, with Gemini a distant second. (Other apps got only one or two mentions.) Only a small proportion of students said they used it a lot; most used it sparingly or occasionally. A.I. was most often used shortly before exams (in conjunction with getting answers to my study guides) and in the stoichiometry unit, where students wanted help with step-by-step calculations. From my limited tests, GPT-4o does noticeably better than GPT-3 (which wasn’t very good) at providing a correct stoichiometry solution, although typically a verbose one.

 

Interestingly, a few students used the chatbot to recommend YouTube videos to help them understand a topic. (Many students just use Google or go straight to YouTube to look for such videos.) Most students said they found it helpful in “explaining” concepts or how to solve problems. Several said they used it to generate practice problems or to quiz themselves. One student said it helped them “decode” their notes and explain them in a simple way. Students said it was particularly helpful when they missed class, one even saying, “I didn’t need to go to office hours… it gave me the answer from anywhere I liked.” While the majority of students found ChatGPT useful, a handful did not.

 

A number of students offered specific caveats about their usage. One student writes: “I would strongly recommend not heavily depending on it for homework, as it ends up being more harmful than beneficial. You must know how you obtained your answer, not just copy and paste.” Another: “These models are constructive for learning as long as you use them productively and have them guide you instead of answering for you.” A third notes: “It was helpful, but some ideas it presented contradicted my notes, so I am not sure how accurate it is.” And another: “While not always correct, I felt that it would usually get me started in the right direction to finish understanding the topic or solving the question on my own.” Interestingly, the students who made these types of comments were almost all students who earned A’s or B’s as their final grade. Also noteworthy: the 12% of students who did not use A.I. at all earned A’s or B’s. (The average grade was in the C+ range, so slightly less than 50% of the students earned A’s or B’s.) The students who used it sparingly or rarely were, again, mostly A or B students. This is perhaps not surprising: the students who knew the material felt less of a need to use A.I.

 

Since one of the best uses of a generative A.I. is to generate test questions and study guides, I’m glad to see many students mention using it this way. Even more use it for explanations or answers, which is more hit-or-miss, but I’m glad that students noticed this. Here’s one thoughtful student comment: “When it comes to studying equations, ChatGPT was very helpful because it showed me step-by-step how to solve it. I also used this model to create practice problems for me. In terms of elaborating the material from class, it was moderately helpful. It mostly gave me vague explanations.” This student thought this was a limitation of the free version and mused that a paid version might have given better results. One student would load in the study guide and then ask ChatGPT to provide timed quiz questions so that they would feel like they were in an exam.

 

In Biochem, I saw similar trends: 15% of the students did not use A.I. (all three earned A’s and were among the top five in the class). There aren’t many math-related or calculation questions in Biochem, so most students used it to clear up things they weren’t sure about, again usually pertaining to the study guides or my lecture slides (which I provide to the students). Since this is a smaller class, I’m not sure whether any of these trends are significant.

 

My takeaways: Students are going to use A.I. in a chemistry class whether or not you have a policy. The majority already do so and find it helpful, so they will keep doing so. The academically stronger students use it less, likely because they feel they understand the material in class and can solve problems without outside help most of the time. Many students leverage the generative capabilities of a large language model to create practice test questions, although whether those questions are sufficiently complex is less clear. Some students notice the weaknesses of A.I. answers yet still find them helpful as a guide. Students think A.I. helps “simplify” a concept they are struggling with; whether it is over-simplified is less clear. Students still gravitate to video explanations to supplement the text explanations of A.I., and YouTube remains a key source for them.