To err is human. Sometimes it’s necessary. Sometimes inevitable. But is it ever desirable?
Perhaps there’s a Right Kind of Wrong, the title of Amy Edmondson’s latest book. Edmondson is a professor of leadership and management with a diversity of work experience before she entered academia. The subtitle of her book: “The Science of Failing Well.” The gist of her argument is that it is useful to classify failures into three types (intelligent, basic, complex) and to be cognizant of your situation – you need to practice self-awareness, situation awareness, and system awareness. A key ingredient of failing well is having high standards while also being in an environment of high psychological safety – you’re not afraid to own up to a failure because you will be supported by those around you. Let’s take these in turn.
An intelligent failure is when you learn from failure in a novel situation. When you encounter something you’ve never done before, the only way to make progress is trial and error. Getting things wrong is inevitable. Edmondson brings up the example of a research lab: you’re pushing into the unknown, and you will try a lot of things that fail before you succeed. It’s something I try to impress upon my research students; I typically gesture to my file cabinet of abandoned projects. If you don’t try, you won’t succeed. But you’ve also got to know when to throw in the towel, and that’s a skill that takes time and experience (and hopefully good advice from mentors). So it’s important to ask yourself: Am I in a novel situation?
A basic failure takes place in well-trod territory. It’s truly an oops! You should have known better, especially since you’ve done this before. But sometimes overconfidence and inattention lead to an error. The consequences could be small; the consequences could be devastating. Regardless, you must try to learn from it so you can avoid repeating it in the future. A complex failure, on the other hand, is not so easy to diagnose. I’ve previously blogged about this after reading Charles Perrow’s classic Normal Accidents. Such failures occur when there are complex interactions among multiple parts in a tightly coupled system; system failure always has multiple causes. Again, paying attention is crucial, especially to early warning signs.
The first ingredient in learning from failure is self-awareness. Even so, failure feels like a letdown, and your first instinct is to beat yourself up over it, or worse, to ignore it and shift the blame. Edmondson’s advice: “Choose learning over knowing” and reframe a failure as an opportunity to learn. That requires taking a pause and not acting on that pernicious first instinct. To be situationally aware, ask yourself what context you are in. Is it novel? Is it routine? Is something different than before? Is this part of a complex system? You also have to rate the consequences: Is this a low-stakes or a high-stakes situation? If low-stakes, taking a risk so you can learn might be desirable; if high-stakes, you might want to think twice before betting the farm. The stakes may be physical, financial, or reputational.
It’s hard to be system aware. Ever since I dipped into systems chemistry, I’ve often found myself lost in a tangle. Thinking systemically can also be discouraging; sometimes you feel stuck in a system with no easy way out. Edmondson describes an exercise she often uses called the Beer Game, a seemingly simple scenario in which students play four roles in a supply chain: “factory, distributor, wholesaler, retailer”. The rules are simple. The retailer picks a card providing the demand for that turn, and then each player places orders and keeps track of inventory. But there’s a lag time as inventory makes its way through the system, and things go awry in a hurry. There’s a tiny catch in the game that surprises students, but I won’t give it away; read Edmondson’s book or look it up. Edmondson admonishes the reader to “anticipate downstream consequences”, “resist the quick fix”, and “redraw the boundaries”. As is typical of business books, she provides lively anecdotes, engaging examples, and a positive self-help vibe.
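If you want to see how the lag alone can tangle things up, here is a rough Python sketch of a four-stage chain with delayed shipments. To be clear, this is my own toy version, not the actual game: the ordering rule, the lag length, the starting inventory, and the demand sequence are all placeholders I made up (and the catch is deliberately left out). Upstream supply is also treated as unlimited here for brevity.

LAG = 2  # assumed number of turns before an order arrives as inventory

class Player:
    def __init__(self, name, inventory=12):
        self.name = name
        self.inventory = inventory
        self.backlog = 0                 # unfilled orders carried forward
        self.pipeline = [4] * LAG        # shipments already on their way

    def step(self, demand):
        # The shipment ordered LAG turns ago arrives.
        self.inventory += self.pipeline.pop(0)
        # Try to fill this turn's demand plus any backlog.
        owed = demand + self.backlog
        shipped = min(owed, self.inventory)
        self.inventory -= shipped
        self.backlog = owed - shipped
        # Naive ordering rule: reorder current demand plus whatever is still owed.
        order = demand + self.backlog
        self.pipeline.append(order)      # upstream supply treated as unlimited
        return order

def simulate(demands):
    chain = [Player(n) for n in ("retailer", "wholesaler", "distributor", "factory")]
    for turn, demand in enumerate(demands, start=1):
        for player in chain:             # each player's order becomes the next tier's demand
            demand = player.step(demand)
        print(turn, [(p.name, p.inventory, p.backlog) for p in chain])

simulate([4, 4, 5, 6, 5, 4, 6, 7, 5, 4])  # made-up demand cards, not the real deck

Even with this simplistic setup, you can watch inventories and backlogs swing around well after the customer demand has settled down – the delay between ordering and receiving does a lot of the work.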
Reading this book made me ask myself whether I am risk-averse. Do I try to avoid failing? Partially, I suppose. Research is probably an area where I’m not risk-averse, though I don’t put all my eggs in one basket; I usually juggle multiple investigations to increase the chances of success. But I’m protective of my time, and that makes it difficult for me to make a large pivot. I sometimes imagine being able to do so, but then step back and make small, incremental changes instead. This is also true of my teaching. I’m always trying new things, but in small increments. Edmondson made me pause and think about where I could shake things up for a much better payoff, especially since almost everything I do as a professor is low-stakes for me.
How about my students? These days students seem much more risk-averse than when I started teaching. Getting a B or a C on an exam can seem like a devastating outcome. I’ve designed my class with lots of low-stakes ways for students to engage in the “struggle” to learn the material. Chemistry is novel, and it’s certainly not easy. I’m upfront with my students about this, but I hope I also convey that they can all learn it if they’re willing to put in the time and effort. But I recognize that if students don’t feel psychologically safe, they won’t be willing to take risks and make mistakes as part of learning. As a result, they don’t learn as well as they could. Right Kind of Wrong has challenged me to think about how I can help students gain better situational awareness and see learning as a transition from novel to variable to routine, such that when you’ve practiced a lot, you really do have the material down pat.
I tell students to make their mistakes in class and on the low-stakes homework and quizzes so that they won’t make them on exams. Making mistakes when learning new subject matter is inevitable, even necessary. To err is to (hopefully) learn.