Thursday, July 30, 2020

Eating Brains

Continuing my zombie interests, I decided to check out the TV series iZombie. I enjoyed watching Season One, and am looking forward to seeing what comes next, although I may have to wait a while. I have Season Two on hold at my local library, and circulation wait times have increased thanks to Covid, so I’m not sure when my turn will come around.

 

iZombie is basically a police procedural, sorta like a cross between Bones (I watched several seasons some years ago) and Medium (which I haven’t watched). The main protagonist, Liv Moore, has her life turned upside down or inside out: personally, professionally, and, most interestingly to me, metabolically. From star medical intern, Liv gets turned into a zombie and develops a craving to eat human brains. What does she do? Get a job at the medical examiner’s office examining the dead, so she can get her steady supply of brain-munchies. If she doesn’t keep eating them, she’ll become less human and more classic-zombie-like.

 

But there are side effects.

 

First, she sees visions from the point of view of the deceased. This helps her aid the police in solving a number of murder cases, and yes, she has a cop liaison buddy (who thinks she’s a weird psychic) and a supportive boss (who is trying to cure her by doing experiments on zombie rats). Second, she takes on personality traits of the deceased, which allows the actress (Rose McIver) to play all sorts of interesting character types. Third, she develops some of their skills, which is cool – understanding and speaking a new language, becoming a pro hacker, and being a skilled marksman were some of the interesting showcases.

 

What’s going on here?

 

Scientifically.

 

I’ve avoided reading any websites analyzing or speculating on this topic, so the following is purely from my stream of consciousness. It seems that in eating the brain, something of that brain goes to your brain – at least in the zombie metabolism. In humans, eating brains (monkey brains being a delicacy in some cultures) doesn’t give you the visions, personality traits, or skills of monkeys or any other species. You might get a tummy-ache. But back to zombies. The gut has sometimes been dubbed the ‘second brain’. Perhaps in zombies, digestion of the brain matter provides the appropriate physical material or electrical signals from gut to brain. Somehow there’s a merging between the you who is you and the one you just ate. That could explain the visions and the personality traits.

 

But there’s more. Liv exhibits the muscle memory of the skills of her eatees. So it’s not just gut brain to skull brain. A bunch of other body subsystems get involved in the temporary transformation. How does all this happen? I haven’t a clue. The only thing that comes to mind is parabiosis – y’know, how vampires stay young by consuming the blood of youthful others. We humans are very interested in parabiosis. Okay, some of us humans. Okay, okay, I find it sorta interesting. I can even recommend a very readable recent review paper, “Young Blood Rejuvenates Old Bodies”, citation and abstract shown below.

 

In the Matrix movies, you can upload (or download?) skills into your mind/body that can then be utilized by your virtual self – kung fu, for example. Seems plausible. Not so easy for the real self, where it takes a lot longer to train the muscle memory. Could you do it magically? In Harry Potter and the Goblet of Fire, Krum’s (partial) transformation into a shark-like creature gives him the swimming abilities he needs, although Harry’s gillyweed-munching does the same more effectively. Poisons can immobilize us. Drugs can provide enhanced performance. Perhaps it’s not so far-fetched after all, with chemistry at the heart of things!

 

We have a book on the neuroscience of zombies. Maybe there should be one on the biochemistry or metabolism of zombies! There’s a whole (nutrition) industry around enhancing your smarts with brain food. iZombie gives it a new twist: brain as food.

Monday, July 27, 2020

Proust and the Squid

Reading is a complicated business. Certainly, choosing what goes into your reading diet has profound implications for what you’ll be thinking and the type of person you are becoming – as evidenced by the scary media polarization these days. But the act of reading itself is more complicated than meets the eye. In more ways than one. Eye connects to brain, and somehow squiggly patterns become rich in meaning. This is the story of Proust and the Squid by Maryanne Wolf.

 

The book is subtitled “the story and science of the reading brain”. Part philosophy, part neuroscience, part child development, part history, it lays out a fascinating narrative with hieroglyphics and early alphabets, gobs of neuroscience research, reading education programs, and a very close look at dyslexia. Reading is indeed a complicated business. Many of us take it for granted. At some magic moment, we were able to read, forgetting how difficult it was to begin with, and unaware of the many factors that contributed to this newfound ability that opened up new worlds to us.

 

Our brains are not wired for reading. And for much of the history of Homo sapiens, most folks didn’t read. Even today, in many parts of the world, reading is a luxury – books are scarce, and in oral cultures one’s native language may not have a written version. Yet somehow, some time ago, some folks, be they the Egyptians or the Akkadians, started representing large parts of their vocabulary by creating symbols. Why? Was it because there was too much to hold in the mind? For me, blogging is a way to offload my thoughts, giving me the opportunity to sift them again at leisure (with internal links, tags, or search); it’s a sort of Pensieve or diary, I suppose. The ancient philosopher Socrates railed against the written word. But thanks to his star pupil Plato, we have his dialogues on record.

 

Wolf’s forte is child development, and she provides some marvelous narrative in support of reading some of my favorite fiction: “For young readers who are moving from simply mastering content to discovering what lies beneath the surface of a text, the literature of fantasy and magic is ideal. Think of the many images that Tolkien uses in Lord of the Rings to portray good and evil. The worlds of Middle Earth, Narnia, and Hogwarts provide fertile ground for developing skills of metaphor, inference, and analogy, because nothing is ever as it seems in these places. To figure out how to elude ring-wraiths and dragons, and how to do what is right, calls on all of one’s wits.”

 

At age twelve I received The Hobbit and Lord of the Rings as a gift from a family friend, and I read them over and over again to glean all their finer details. The educator in me perks up as Wolf makes strong ties to the learning process: “Comprehension processes grow impressively… where children learn to connect prior knowledge, predict dire or good consequences, draw inferences from every danger-filled corner, monitor gaps in their understanding, and interpret how each new clue, revelation, or added piece of knowledge changes what they know. To practice these skills, they learn to unpeel the layers of meaning in a word, a phrase, or a thought. That is, in this long phase of reading development, they leave the surface layers of text to explore the wondrous terrain that lies beneath it.”

 

If only I could say the same about learning chemistry. My first two years of high school chemistry were a complete blur, as in… I was a complete blur. I don’t think I had bad teachers, but I’m not sure I understood any of the principles, although I was somehow able to make it through national exams by employing pattern recognition and practicing lots of past-year papers. It was not until college that I started to appreciate the “wondrous terrain” of chemistry, looking beneath the surface and relishing the marvels of orbitals, symmetry, and the schizophrenic behavior of chemical bonds.

 

Perhaps some of my students have a similar blurry experience in my chemistry courses. Wolf’s detailed account of the complexity of merging the visual system with other parts of the brain related to memory, comprehension, and thinking made me ponder further the inherent challenges in learning chemistry. I’ve known about Johnstone’s Triangle for a while, and have wondered whether the symbolic nature of chemistry brings additional challenges to the learning-recognition-brain nexus. Wolf’s comprehensive explanation of the many types of dyslexia and how they involve different neural pathways gives me pause as I recall some office-hour conversations where something so clear to me still seemed so unclear to the student. It’s like the neural pathways didn’t click. Perhaps there exists an equivalent chemistry-dyslexia that might be equally if not more prevalent.

 

You can’t learn chemistry with text alone. Pictures are crucial! As a chemist, I can read a lot about the properties and potential chemical reactivity of, say, the caffeine molecule from its line structure. And like cognitive psychology’s proverbial expert chess player, I can probably recall its structure exactly after just a glance. (Since I study the origin of life, I know the structures of the closely related guanine and xanthine.) The text shorthand is a bear. In SMILES notation, caffeine would be rendered as: “CN1C=NC2=C1C(=O)N(C(=O)N2C)C”. Good luck visualizing that, although encoding the information digitally in a SMILES string does make it quick and compact to reproduce in different ways – a computer can decode it back into a picture in a blink, as in the little sketch below.
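
 

For the curious, here’s a minimal sketch of that decoding step in Python. It assumes the open-source RDKit toolkit is installed; the file name and printouts are just illustrative, a sketch rather than anything official.

# A minimal sketch, assuming RDKit is available (e.g., pip install rdkit).
from rdkit import Chem
from rdkit.Chem import Draw

caffeine_smiles = "CN1C=NC2=C1C(=O)N(C(=O)N2C)C"

mol = Chem.MolFromSmiles(caffeine_smiles)  # decode the text shorthand into a molecule object
print(Chem.MolToSmiles(mol))               # canonical SMILES: a standardized, reproducible form
print("Heavy atoms:", mol.GetNumAtoms())   # 14 for caffeine; hydrogens are implicit

Draw.MolToFile(mol, "caffeine.png")        # render the line structure a chemist actually reads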

 

It makes me wonder: Could you do the opposite? Teach chemistry with just pictures and figures, plus some oral language support, and ditch the text of the textbooks? Is all that reading and writing necessary? Is it there so that we teachers can easily read, write, and grade exams? Because we don’t have the time and energy to give each individual student an oral exam? Hmmm… I don’t know. What would Socrates think if we plopped him into the twenty-first-century world of video?

 

I don’t know, and my brain feels full of swirling thoughts that I cannot grasp. Perhaps that’s what the written word allows us to do: take the time to think. By reading, pausing, writing, pausing, thinking, pausing, I can piece together complex arguments and tie together seemingly loose threads of information. I have to go slowly at first. New learning builds on prior learning. New discoveries build on prior discoveries – the story of science, and perhaps any other field of study. But as I progress I will build fluency. I’m not as eloquent as Wolf, so I’ll leave you with her words on being at the threshold of becoming an expert reader.

 

“The fluent, comprehending reader’s brain is on the threshold of attaining the single most essential gift of the evolved reading brain: time. With its decoding processes almost automatic, the young fluent brain learns to integrate more metaphorical, inferential, analogical, affective background and experiential knowledge with every newly won millisecond. For the first time in reading development, the brain becomes fast enough to think and feel differently. This gift of time is the physiological basis for our capacity to think ‘endless thoughts most wonderful.’ Nothing is more important than the act of reading.”

Wednesday, July 22, 2020

Collaborative Cognitive Load

In the midst of putting together the syllabus and readings for my origin-of-life class, I started to wonder if I was asking too much of the students – given we’re still in the season of Covid, and we might still be all-remote when the fall semester begins. Our class will essentially be reading and discussing primary literature. Some of the older ‘classic’ papers are not too difficult to read because they’re usually shorter and not as densely-packed with content and jargon. Unfortunately that’s not true of more recent articles. Even I have trouble reading them if I’m not steeped in the subtopic at hand; and origin-of-life papers cover many areas not in my wheelhouse.

 

The students will struggle with some of the papers, but that’s not intrinsically a bad thing.

I’m building in guides that include some pre-reading questions that encourage students to look up relevant information before they dive into the papers. I’m including a mixture of easier-to-read review articles coupled with the key research papers. I’m also encouraging students to write reflections on their work that may include things that puzzle them, and to build a glossary for new terminology they encounter. And I’m encouraging them to work together on answering questions in the reading guides. Final projects will also be collaborative.

 

Will the collaborative aspects help the students not feel as overwhelmed when they encounter the seemingly abstruse literature? I’m not sure, so I turned to see what the literature has to say about this. Over the years I’ve read a number of pedagogical papers and articles about “group work” and “peer learning”; I can tell you that the results are mixed, at least in what I think are the best studies. It looks like a muddle. But there might be some theoretical underpinnings for the messy observations, and they come from an area I’ve followed for a while: Cognitive Load Theory.

 

I’m not sure how I missed the 2018 paper (which I discovered last week) that I’ll be discussing today, perhaps because it was published in a journal I had never heard of (citation shown in the picture below). I’ve read quite a bit from two of the co-authors, Paul Kirschner and John Sweller, and have generally found their arguments persuasive in the past. Of particular interest, the present paper covers not just in-person collaboration but also computer-supported collaborative learning (CSCL), which seems especially important in a time of Covid.

 

The paper begins with a short overview of cognitive load theory before diving into issues related to collaborative learning. The authors summarize the (possibly elusive) goal as follows: “Although in the short run, collaborative learning results in group members trying to successfully perform a certain learning task or solve a specific problem together, in the long run, as an instructional method, it is very important that all members of the group develop effective experience working together (i.e. domain-generalized group knowledge) that facilitates every member in acquiring domain-specific knowledge from this combined effort.” [emphasis mine]

 

Note that solving the specific problem is not the main goal, at least pedagogically. As to the long-term goal, pundits often tout the first part as being of pedagogical importance (teaching ‘soft skills’ that employers want) but neglect the second part, which I think is much more important. I emphasized the word ‘every’ because, to me, that’s the trickiest part in designing effective assignments for group work. I mostly don’t have to teach the first part (domain-generalized group knowledge) because learning these skills is (mostly) biologically primary. But I might have to help grease the wheels – and that’s where the useful tips one often reads about structuring group work can prove effective.

 

The second part (domain-specific knowledge) is my main concern. That’s primarily why I’m there as an instructor with domain-specific knowledge that, being biologically secondary, is difficult to acquire without being explicitly taught. Different settings (e.g. fully remote, hybrid, face-to-face) and/or different types of problems (e.g. mathematical, philosophical, scientific, historical) require designing the activities differently. Hence, as the authors caution: “It is possible that under certain circumstances, collaboration facilitates the learning of biologically secondary information while under certain circumstances it interferes with that learning.” I think this is why we’ve seen mixed results.

 

From cognitive load theory’s point of view, whether or not collaboration will be effective depends on the extraneous cognitive load imposed by “task-related transactive activities” on the one hand, and the pooling of resources to mutually overcome the limits of individual working memory on the other. Thus, one needs to pay careful attention to the prior content knowledge of the students (both its quantity and its asymmetry across the group), the complexity of the task at hand, the students’ prior collaborative experience, and what domain-specific guided teaching is needed to both reduce extraneous cognitive load and help manage intrinsic cognitive load. Table 2 in the paper summarizes the key factors.

 

In assigning previous group work, I’ve taken some of these principles into account to varying degrees: task complexity, task guidance and support, team size, team roles, team composition. I have not done this systematically, nor have I settled on a set of best practices. I’ve had my students fill out a short questionnaire early in the semester to learn what background domain knowledge they have and something nebulous about their ‘personality’. To be frightfully honest, I don’t think I know what I’m doing for the most part.

 

I’ve paid less attention to the “transactive” parts – the mechanics of collaboration as students interact with each other – mostly out of ignorance. I only step in to head off what might be egregious problems, but I mostly just let them work things out. However, reading this paper made me think more carefully about the transactive parts given the ever-more-likely looming possibility that we will be all remote this semester. With virtual interactions, where it’s much harder to read body language or tone, and with increased asynchronous interactions, more scaffolding needs to be built in. The authors put it this way: “The more the channel of collaboration mimics a face-to-face interaction, the less of a load collaboration will place on working memory because it relies on biologically primary knowledge we have…”

 

Needless to say, I’m finding all this overwhelming. Perhaps that’s not a bad thing, and it helps me empathize with my students who are likely to feel overwhelmed at different points in the semester. My origin-of-life class is composed of juniors and seniors, many of whom already know each other, and may have worked together on prior tasks. On the other hand, in my first-year general chemistry course, students do not know each other, and may meet for the first time online. While I have some group project work in mind, I have to do much more work thinking about how to structure or scaffold those interactions so that students can work effectively without wasting precious cognitive resources. Never a dull moment as a teacher, I suppose.

Friday, July 17, 2020

The Horta: Silicon Life

Could silicon life exist? Not an A.I. in a computer, but a physical life-form capable of performing the activities of living?

 

For fans of the old Star Trek episodes, there’s the Horta. A rundown of the story (Season 1, Episode 26, “The Devil in the Dark”) can be found on this website.

 

The Horta is one strange-looking creature. One commentator described it as Hamburger Helper. I’ve never eaten Hamburger Helper so I have no further comment on that matter. However, as someone interested in chemistry and the origins of life, it’s interesting to consider if silicon life is possible – chemically speaking.

 

When Kirk encounters the creature (picture above), you can also see a clutch of metallic-looking spheres in the background. In the TV show, they are supposedly made of almost-pure silicon and might be Horta eggs, with the contents predominantly being Horta food (is my guess). Interestingly, here on Earth we make pure silicon spheres to help redefine the kilogram – spheres that also let us calculate Avogadro’s constant (a rough version of that arithmetic is sketched below). And if you see non-faked pictures of natural metallic spherical objects in Earth’s oceans, those are manganese nodules, not silicon spheres.
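
 

Here’s a back-of-the-envelope version of that atom-counting trick, in Python. Silicon crystallizes in a diamond-cubic lattice with eight atoms per unit cell, so knowing the density and the unit-cell edge lets you count atoms per mole. The numbers below are approximate handbook values I’m supplying for illustration, nothing like the metrologists’ precision measurements on isotopically enriched spheres.

# A rough sketch of the atom-counting arithmetic behind the silicon spheres
# (approximate values, for illustration only).
molar_mass = 28.0855         # g/mol, natural-abundance silicon
density = 2.329              # g/cm^3
lattice_constant = 5.431e-8  # cm, edge of the cubic unit cell
atoms_per_cell = 8           # silicon's diamond-cubic structure

avogadro = atoms_per_cell * molar_mass / (density * lattice_constant**3)
print(f"N_A is roughly {avogadro:.3e} per mole")  # comes out near 6.02e23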

 

But let’s get back to the chemistry of silicon and life.

 

Silicon sits just below carbon on the periodic table. An introductory chemistry course often declares that elements in the same column have similar chemistry. So if carbon can form the backbone of life’s molecules on Earth, couldn’t silicon be the backbone of life’s molecules on some alien planet?

 

Why is carbon the backbone element of Earth life? Carbon is tetravalent, meaning it can form up to four directional bonds with other atoms. Non-metals in other columns of the periodic table form three (boron, nitrogen), two (oxygen), or one (fluorine) directional bonds. (Metals form non-directional bonds for the most part.) This allows for a greater diversity of structures that can be formed. Silicon is also tetravalent. So far so good.

 

Adding to the diversity repertoire, carbon also forms double bonds and triple bonds. Silicon does so poorly, i.e., the second or third bonds (usually referred to as pi-bonds) are so weak that they break easily to form other single bonds. Also, the single bonds between silicon atoms are much weaker than the single bonds between carbon atoms – the larger silicon atoms can’t get as close to each other when “overlapping” (that’s my simplified explanation for today’s blog post; chemical bonding is complicated!). A rough comparison of bond strengths is sketched below.
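
 

To put some ballpark numbers on that, here’s a tiny sketch comparing approximate average bond enthalpies. I’m quoting these from memory of general-chemistry tables, so treat them as illustrative rather than authoritative: silicon–silicon bonds are substantially weaker than carbon–carbon bonds, while the silicon–oxygen bond is strong enough that silicon tends to end up locked away in silicates and silica.

# Approximate average bond enthalpies in kJ/mol (ballpark textbook values,
# quoted from memory for illustration, not a primary data source).
bond_enthalpies = {
    "C-C": 347,
    "Si-Si": 226,  # notably weaker than C-C
    "C-H": 413,
    "Si-H": 318,
    "C-O": 358,
    "Si-O": 452,   # strong: silicon prefers to end up as silicates and silica
}

for bond, energy in bond_enthalpies.items():
    print(f"{bond}: ~{energy} kJ/mol")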

 

The elements of life are often abbreviated as CHNOPS, which sounds almost like the name of some pharaoh. I’ve discussed why (the elements, not the pharaoh) in a previous post that included going into the details of the bond strengths alluded to above. But delving into the origins of metabolism, we find that the key metabolites have mostly just CHO (although S is implicated as being important in co-factor molecules).

 

Let’s break this down. We’ve talked about C. Why also H? You wouldn’t have very much diversity with pure carbon: there’s diamond, graphite, buckyballs, buckytubes, and the presently popular graphene (single graphite monolayers). That’s about it. Adding hydrogen allows you to “cap” a carbon atom’s valences, giving rise to a huge diversity of hydrocarbon molecules – hundreds, thousands, millions of combinations (the tally of alkane isomers sketched below gives a flavor of how quickly that diversity grows). But besides adding diversity, carbon-carbon and carbon-hydrogen bonds are both strong and relatively inert, i.e., you get a diversity of stable compounds that aren’t overly reactive, especially in water (interesting chemistry takes place in liquids).
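
 

Here’s the tally, as a tiny sketch. The isomer counts are standard reference values I’ve copied in by hand for the simplest alkanes, CnH2n+2 – the snippet doesn’t compute them – and the count keeps exploding as you add carbons (well into the hundreds of thousands by twenty carbons).

# Known counts of constitutional (structural) isomers for the first ten alkanes
# (hand-copied reference values, not computed here).
alkane_isomers = {
    1: 1,    # methane
    2: 1,    # ethane
    3: 1,    # propane
    4: 2,    # butane
    5: 3,    # pentane
    6: 5,    # hexane
    7: 9,    # heptane
    8: 18,   # octane
    9: 35,   # nonane
    10: 75,  # decane
}

for n, count in alkane_isomers.items():
    print(f"C{n}H{2 * n + 2}: {count} isomer(s)")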

 

I used the word “backbone” deliberately. In life you’re trying to find that sweet spot between stability and reactivity. If you’re rock-solid-stable, nothing happens. If you’re crazy-unstable-reactive, you won’t hang around long enough to do anything interesting or be able to transduce free energy effectively. And mind you, energy transduction is the name of the game of life. That’s where oxygen comes in. The O atoms in CHO-containing molecules provide, not just additional diversity, but sites of reactivity while maintaining stable backbones. The entry of O2 as an oxidant changed the game remarkably (I recommend Nick Lane’s book Oxygen). Need energy? Got fuel (hydrocarbons)? Just burn, baby, burn!

 

It’s difficult for a Horta to survive in a terrestrial-like environment with water as the main liquid solvent driving chemistry. Silicon is simply not stable enough, nor can it form a sufficient diversity of molecules to support the structure, flexibility, and adaptability of living organisms. At least in life as we know it. There might be a very slow rock-like life that we’d say looks dead. And while the Hamburger Helper Horta has rocky-looking characteristics, it can scuttle off or react remarkably quickly. But on a different planet, with a very different atmosphere, hydrosphere, lithosphere – you could have a very different biosphere, and the Horta might exist after all.

 

P.S. Extant life does incorporate silicon, mainly in plants, and almost exclusively in just two forms: molecular silicic acid, Si(OH)4, or amorphous silica, SiO2.

Wednesday, July 15, 2020

Wicked Problems: 1973 Version

Reading the news is depressing. I’m tempted to cocoon in my own bubble world. Surprisingly, this is easier to do today. Working from home, I don’t have to deal with commute traffic. I can order groceries to be delivered. I can choose only to access feel-good stories in the news, or ignore it completely. And since I’m not teaching a summer class or taking on summer research students, my time is mostly my own, at least until the new semester begins.

 

Two major issues are dominating the U.S. airwaves: the continuing coronavirus woes, and the George Floyd aftermath. These, among many other issues both domestic and abroad, are “wicked problems”. Anything that involves government and public policy today might fall under this category. Even engaging in university-wide planning in a year without Covid might qualify, although it is perhaps less anxiety-inducing. But wicked problems are not new. They’ve likely existed as long as society has existed, and they have become increasingly untameable as society has grown more complex.

 

The phrase “wicked problem” has also increased in popularity as all manner of things are ascribed to this category. I don’t know who first formulated the phrase, but it is given some definition by Rittel and Webber in their 1973 article: “Dilemmas in a General Theory of Planning” (citation and abstract in Figure).

 

It’s an excellent article; I recommend reading it in full. The principles mentioned will resonate with anyone who is flabbergasted with the depressing news of our times. The prose is clear, accessible, and hard-hitting, as illustrated by the quote below.

 

Planning problems are inherently wicked. As distinguished from problems in the natural sciences, which are definable and separable and may have solutions that are findable, the problems of governmental planning – and especially those of social or policy planning – are ill-defined; and they rely upon elusive political judgment for resolution. (Not “solution.” Social problems are never solved. At best they are only re-solved – over and over again.)

 

Even the parenthetical statements pack a punch. When wearing my administrator hat, I am hereby resolved to discuss Resolutions to such planning problems, and not Solutions. It’s so easy for me as a chemistry professor to fall back on talking about “solutions”, be they chemical substance solutions or (more importantly to students) solutions to the latest problem set.

 

Rittel and Webber articulate ten properties that set wicked problems apart. I would call them ten dilemmas. We should be alert to them, even as we cannot solve them and only resolve them in a limited way. Before getting to their list, the authors carefully define what it means to be wicked.

 

We are calling them “wicked” not because these properties are themselves ethically deplorable. We use the term “wicked” in a meaning akin to that of “malignant” (in contrast to “benign”) or “vicious” (like a circle) or “tricky” (like a leprechaun) or “aggressive” (like a lion, in contrast to the docility of a lamb). We do not mean to personify these properties of social systems by implying malicious intent. But then, you may agree that it becomes morally objectionable for the planner to treat a wicked problem as though it were a tame one, or to tame a wicked problem prematurely, or to refuse to recognize the inherent wickedness of social problems.

 

Here is my very quick summary of these properties, but I recommend that the interested reader consult the full article.

 

(a) Defining the problem is the problem, and inherently includes biased solutions based on the ideology or vantage point of the problem (re)solver. There is no clearly defined end point to the problem that tells you you’re done.

 

(b) There are no “correct” or unambiguous solutions. Nor can you test them without consequences. You can’t exhaust the “solution space”. Wicked problems are interconnected to each other, perhaps even in one vast system.

 

(c) A wicked problem is unique, and although there might be overlap with related problems, as the authors state: despite long lists of similarities between a current problem and a previous one, there always might be an additional distinguishing property that is of overriding importance.

 

I close with a final quote from the article, that I think is wise advice, but has become increasingly difficult in our internet age of near-instant rage.

 

Part of the art of dealing with wicked problems is the art of not knowing too early which type of solution to apply.

Friday, July 10, 2020

Motivation in Lingots

Being back in the U.S. after a year away on sabbatical, I feel freshly motivated towards the teaching and learning enterprise. I’m teaching myself a lot about how to optimize remote learning, but that’s because I’m motivated not to be a complete failure if we are unable to hold physical in-person classes when the semester begins. Since I really do want to provide my students with the best learning opportunities possible, I’m motivated to try and do my part as their professor, guide, shaman, or whatever new role gets plunked on us. I want to design the best possible experience, with the sober understanding that I won’t be great at it the first time around. I’m aiming for good enough.

 

How will I keep students engaged and motivated, given a potential all-remote environment, with likely increased blocks of asynchronous learning? There are plenty of tips from seasoned professionals and pundits alike, thanks to the Covid Toggle of last semester, which I thankfully missed. Chemistry is not an easy subject – it’s all about invisible entities that you represent abstractly with pictures, symbols, and some math thrown in, not to mention the extra jargon. I’ve decided to experience this remote environment for myself by enrolling in Duolingo. Also killing two birds with one stone by upping my Spanish. No owls will be harmed in this process.

 

I’ve blogged about my previous experience with Rosetta Stone, where I… um, degenerated to “laziness and suboptimal learning”. Eerily enough, the circumstances then parallel my present circumstances. I chose to brush up on Spanish, now that I’m back, given that my skills have clearly degenerated. Can Duo the Owl help keep me motivated? Will I keep working every day for Lingots, the cash currency allowing me to buy swag in Duolingo? It’s still early days: I just completed my seven-day streak. When I signed up a week ago, I took a pre-test which determined that my knowledge constituted 16% of the course, allowing me to skip a string of early lessons.

 

So far so good.

 

As I attempted to peer into the crystal ball of whether I would keep up and improve, I consulted my spouse who has a 340-day streak; she’s kept up since the day she began. Since we were moving to another country for a year, she chose to learn one of the oft-spoken languages there; she had no background and was starting as a complete beginner. (I had low-level competence.) Did Duo the Owl keep her motivated with his encouraging one-liners? Did earning Lingots give her a boost to keep going?

 

Sadly, no.

 

Her motivation has waned since we returned, because we’re no longer in a context where that language skill is a plus. Early on, she was more motivated: the lessons were easier and she felt like she was learning new things and making good progress. But as the material got more difficult and daily progress felt less substantial, motivation flagged. Getting on the leaderboard and advancing leagues was motivating in the beginning, but this gets harder as you move up in the company of highly motivated Duo-fanatics. At this point, she doesn’t want to break her streak, so it’s mostly for maintenance – not necessarily a bad reason. Lingots were motivating only if they helped with some sort of advancement. The cute owl Duo had no effect. Sadly.

 

It’s still early days for me. I enjoy being cheered on by Duo. I’m looking forward to earning more Lingots so I can get some costumes for Duo (so I’ve been told). I’m on the leaderboard and likely to advance out of the Bronze League to whatever’s next by this weekend. I’ve spread out the lessons I do for distributed and interleaved practice, including trying out some of the lessons I “skipped” due to the placement test. So right now, at least in Week One, I’m behaving like a motivated, excited student. Things are novel and exciting.

 

But will this hold up over time? I don’t know.

 

Reflecting on this made me think about the things I could do to help keep my students engaged and motivated throughout the semester. Week One is easy. But then the going gets tougher, and it’s harder to feel motivated. I don’t have Lingots to hand out, nor am I sure they would do any good. I’m not a cute green owl, but I suppose I can spout lines of encouragement and optimism. Fun little apps (e.g., the “race” in Socrative) for classroom engagement lose their luster quickly. I’m hoping some of my assignments spark interest even while they push students to sink their teeth into the nuances of chemistry. I certainly don’t want to make things too easy.

 

But while I’m still motivated, I should keep my daily appointment with Duo!

Wednesday, July 8, 2020

Rubrics Redux

I use rubrics. Sometimes. When I think the occasion warrants their use.

 

I’m also skeptical of rubrics. I think they are over-pushed in certain educational circles, where they are the darling of assessment strategies, but sometimes they can be CRAAP. Like any tool, they can be used well or they can be used poorly. When a rubric becomes an assessment checklist that narrows the students’ view while promoting an instrumentalist approach (see, for example, Torrance, H. Assessment in Education, 2007, 14, 281-294), we would do well as educators to abandon it.

 

Rubrics are useful for qualitative formative assessment, and so you would expect to see them more often utilized in the humanities, arts, and some of the social sciences. They are sometimes utilized in the sciences, most often to evaluate presentations, projects, and other creative activity. I highly recommend Royce’s 30-year-old article on the use of formative assessment in such situations. We should be careful, when using rubrics, not to inadvertently promote the Swiss-cheese learning of our students.

 

So it was refreshing to read “Beyond Fairness and Consistency in Grading: The Role of Rubrics in Higher Education” by Ragupathi and Lee (Chapter 3 in Diversity and Inclusion in Global Higher Education, 2020). They weave a careful thread showing how rubrics can be effectively used as scaffolds to improve student learning, while staying aware of the aforementioned pitfalls. It’s not an easy road, because Goodhart’s Law (“When a measure becomes a target, it ceases to be a good measure.”) is always lurking close by.

 

Ragupathi and Lee provide some constructs and examples, but these remain mostly at the general level, and appropriately so. I appreciated how they walk the reader through different types of rubrics and their purposes. I found their broader approach thoughtful, and they did not sound like the oft-encountered narrow-minded educational-ese singing the praises of rubrics as the panacea for all educational (assessment) evil. Lee is also in chemistry, and perhaps that influenced my positive reading of the article. Their chapter is well-researched, and I found myself reading some of their cited references – the mark of a worthwhile article! Still, the danger looms of over-playing rubrics, especially as they continue to acquire an underlying business-management tone. What was old becomes new again, both in the business world and in education.

Monday, July 6, 2020

FAQs


Several events conspired to make me read a bunch of FAQs this past week. My laptop started draining its battery twice as fast as usual. Fixing the problem required an OS upgrade, which then caused several applications to stop working; I had to fix those, mostly by reading FAQs. On top of that, I’m enrolled in a self-paced, week-long, how-to-teach-remotely course, which encourages engaging with the material two hours each day. Yes, I’m reading FAQs on how to use different tools – technology whose role is supposedly that of a handmaiden – although one wonders if such boilerplate language is simply a veneer. But wait, there’s more! We are also switching G-Chem textbooks, moving back to a Pearson book that comes with the online Mastering Chemistry, though of course it now comes in a new enhanced version. So it’s back to the FAQs to figure out how to do what I want through the new interface.

Why do we have FAQs? If I ask someone (or the web) what the acronym stands for, the response is, frequently, Frequently Asked Questions. But really, we’re looking for Answers to our Questions, supposedly frequently asked. Back before web pages, in the old days, when I got stuck, I would pick up the phone and call someone to get help. Nowadays, the A.I.-powered phone menu sounds like an FAQ. I’d rather read or skim through FAQ web pages than plod through the annoying audio. In the old days, the phone support answerer would (usually) patiently answer my questions and help me get unstuck. Patience is important, because I’m probably caller #1001 asking the same question. Why waste money on staffing when you can stick the answers up on a web page or get an A.I. bot to do the same?

As a teacher, I sometimes have the same experience as the old-school phone support answerer. Students ask the same questions frequently. “Will that be on the test?” “What’s going to be on the test?” “Do you provide a review session before the test?” There are many others. A detailed syllabus acts somewhat like an AFAQ, as in Answers to Frequently Asked Questions. The questions themselves are relegated to relative unimportance, particularly when their answers are informationally simple. Just tell me the answer quickly so I can fix the problem I’m having and go on with my life.

But I think FAQs could play a different and more useful role in education. Students want to know what’s on the upcoming test so they can prepare appropriately and do well, hopefully in an efficient manner with as little work as possible, so that they can go on with their lives. Indeed, a good indicator of which topics and skills are important in a course is how often questions get asked in that area. Frequently Asked Questions tell you what is important to know. Those are the questions I’ll be asking as an instructor. My G-Chem syllabi include key points for each class day, and the things students need to learn and know: “By the end of today’s class you should be able to [insert learning objective with appropriate Bloom’s Taxonomy verb here].” I also provide previous years’ exams, which satisfy the same objective.

As I’m going through the self-paced blah-blah [insert hyphenated dyads here] course, I’ve been thinking about course design for a remote audience. Perhaps I should put together an FAQ page, not like the tech-support AFAQs, but a true Frequently Asked Questions featuring… Questions. The focus is on the Questions. Yes, the answers are important too, and students think the answers are more important than the questions at the novice level. But I’m trying to move them along the continuum from novice to expert, and that means learning how to Ask Good Questions. Asking good, careful, and thoughtful questions helps to define and advance a field of knowledge. It leads to new knowledge, and correcting or updating previous knowledge.

All that sounds fine and dandy, and perhaps ivory-tower-ethereal. Maybe as a start I could focus on recasting my learning goals into an FAQ page on my course web site, one that might structure the conversation around the larger goals and themes of the course, in addition to the fact-ish minutiae students are more concerned about.