Books I wish I had read earlier in life: The Logic of Failure by Dietrich Dörner. Originally published in German in 1989 and translated into English in 1996, the book carries the catchy subtitle “Recognizing and Avoiding Error in Complex Situations”. It has an excellent cover to match, with a large bright red F.
Would it have
helped me plan and think through complex issues before I became an
administrator? Possibly, but I’m not sure. That’s a good thing, because Dörner
doesn’t try to sell the reader a “new” method that grooms leaders into
strategic and creative thinkers. He proposes a methodology, but then adds plenty
of buts and caveats. The devil is truly in the details because every complex
situation is different. We can’t always see when, where, and why things will go wrong. There is no
substitute for experience, but even experience can cause its own compounding
problems. Depressed yet?
How does Dörner
study the problem? By putting people in computer simulations designed to model
complex situations. Mind you, this is back in the ‘80s. Today’s simulations
would likely be exponentially more
sophisticated, although I suspect the behavioral results of the participants
would be similar today. The setup? There are two. Managing a small fictitious
town in an isolated hilly region in Europe. Or managing an African region with different
tribes subsisting on farming or herding. Both situations feature complex and
inter-related variables, and the point is not just to see who succeeds or
fails, but why. As you’d expect, a few do well, many fare poorly, some learn
from their mistakes, and others don’t. There are common threads among the
successful. The reasons for failure, however, are myriad. But there is a logic
to them. A self-help guru would package the common threads into a
five-steps-to-success program, but Dörner is much more circumspect. His book
closes with a cautionary tale I will discuss at the end.
The introductory
chapter includes an analysis of the 1986 Chernobyl “disaster” (although Richard Muller would argue it was less disastrous than commonly thought). From this analysis and from observing his simulation participants, Dörner describes four
features of complex situations. First, they are complex. His definition: “Complexity
is the label we will give to the existence of many interdependent variables in
a given system. The more variables and the greater their interdependence, the
greater that system’s complexity. Great complexity places high demands on a
planner’s capacities to gather information, integrate findings, and design
effective actions. The links between the variables oblige us to attend to a
great many features simultaneously, and that, concomitantly, makes it impossible
for us to undertake only one action in a complex system.”
This situation is
the bane of scientists designing an experiment to answer a question. We try to
keep everything else constant except for one variable, so we can isolate its
causes and effects. And when you’re in a complex situation, without the luxury
of time to gather all the information you think you need… well, I don’t envy
policy makers and administrators (having had some experience of such situations, now that I’m no longer one). It gets worse. “Complexity is not an
objective factor but a subjective one,” Dörner writes. “We might even think we
could assign a numerical value to it…” Having immersed myself this semester in learning how to measure complexity in chemical systems, I’m very much inclined to agree. There are several approaches to measuring complexity in molecules and molecular systems, but their devisers would agree that all of them are pegged to subjective reference states.
The other three
features of complex situations are dynamics, intransparence, and
ignorance/mistaken hypotheses. There are many variables to which we have no direct access, hence the “frosted glass” of intransparence. This contributes to
our ignorance, and we make assumptions (both implicit and explicit) that might
simply be wrong, only to realize later that our actions have made things worse.
Dynamics, how things change over time, is one feature particularly difficult for us to grasp. Perhaps because of some combination of brain evolution and our education system, we extrapolate linearly when many situations (even simpler ones) might call for power-law or exponential behavior. I’m certainly guilty of that as a
teacher. Just this month I have discussed generating simpler linear models (a
common feature in science) multiple times in general chemistry: the
Clausius-Clapeyron equation, first and second-order rate laws, and the
Arrhenius equation. We drew lots of graphs. Why? Graphs help us translate time
into space, turning something dynamic that’s hard to grasp into something static and easier for our feeble minds to comprehend. And I’m pretty sure most
of my students don’t grasp the log scale of a graph axis, even if they can build
up the data and sketch the graph.
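Since I brought up those equations, here is a minimal sketch of what I mean by generating a simpler linear model (my own illustration, not from Dörner’s book, with made-up values for the activation energy and pre-exponential factor): the exponential dependence of the Arrhenius rate constant on temperature becomes a straight line once you plot ln k against 1/T, and the slope and intercept hand back the physical parameters.

```python
# A minimal sketch (my illustration, not from the book): linearizing the
# Arrhenius equation, k = A * exp(-Ea / (R*T)), by plotting ln k against 1/T.
# The values of Ea and A below are made up purely for illustration.
import numpy as np

R = 8.314        # gas constant, J/(mol K)
Ea = 50_000.0    # assumed activation energy, J/mol
A = 1.0e10       # assumed pre-exponential factor, 1/s

T = np.array([300.0, 320.0, 340.0, 360.0, 380.0])  # temperatures, K
k = A * np.exp(-Ea / (R * T))                       # exponential in T...

# ...but linear in transformed coordinates: ln k = ln A - (Ea/R) * (1/T)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
print(f"slope = {slope:.1f} K, so Ea approx {-slope * R / 1000:.1f} kJ/mol")
print(f"intercept = {intercept:.2f}, so A approx {np.exp(intercept):.2e} 1/s")
```

The same trick turns the Clausius-Clapeyron equation into a plot of ln P against 1/T, and the integrated first- and second-order rate laws into plots of ln[A] against t and 1/[A] against t.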
Dörner devotes a
whole chapter to Time Sequences, and another to Information & Models. Other
parts of his book discuss the important twin features of Goal Setting and
Planning. There are many, many, many places to Fail. I recommend reading the
eye-opening examples in his book. There is a logic to them, and I’m not sure
this logic could have been extracted without running the simulations. If
anything, I felt mildly vindicated for the many youthful hours spent on long
complicated games, my favorite being the ‘80s Avalon Hill boardgame Civilization. Its descendant, the
‘90s Sid Meier computer game of the same name (the first version), is probably
the last computer game I’ve played. Spending most of my working day in front of
a computer, I’m not interested in using it for leisure. I still have my old ‘simulation’
boardgames, but no longer have the time to play them. The shorter ones strip
out some of the complexity, and are still fun, but do not teach the lessons of
complexity. The variables are fewer. I’ve recently pulled out a few older
Reiner Knizia favorites. His designs are exquisite and force tough choices even
with a simple ruleset.
In the book’s
final chapter, Dörner asks: “How can we teach people to deal effectively with
uncertainty and complexity?” The problem: “There is probably no cut-and-dried
method for teaching people how to manage complex, uncertain, and dynamic
realities, because such realities, by their nature, do not present themselves in
cut-and-dried form.” While, on average, experienced leader-managers performed better than students in these simulations, there was more than a fair share of significant failures among the experienced.
Dörner’s last
example, however, is sobering. Before one of the simulations, participants were
divided into three groups. Here are his descriptions. “The strategy and tactics
groups received instruction in some fairly complicated procedures for dealing
with complex systems. The strategy group was introduced to concepts like system, positive feedback, negative
feedback, and critical variable, and
to the benefits of formulating goals, determining and, if necessary, changing
priorities and so forth. The tactics group was taught a particular procedure
for decision making...”
First, the
self-evaluation results of each group after the simulation (which was conducted
over several weeks). “The members of the strategy and tactics groups all agreed
that the training had been ‘moderately’ helpful to them. The members of the
control group, who had received training in some nebulous, ill-defined ‘creative
thinking’, felt that their training had been of very little use to them.” Just
think of all those snake-oil salespeople selling the latest workshop in innovation or creativity for leadership. Heck, I even sold the idea to a group of
sophomores where we would experiment with combining creativity and chemistry.
We’ll reflect on the results at semester’s end.
But how did the
actual participants do in Dörner’s simulations? (In this case it was being mayor
of the small fictitious town.) No difference in performance. You heard that
right. There was no difference in actual performance! Despite the training.
Yet participants who received the training thought it had helped them. Why, though? Dörner’s
answer gives me goosebumps.
“The training gave
them what I would call ‘verbal intelligence’ in the field of solving complex
problems. Equipped with lots of shiny new concepts, they were able to talk about their thinking, their
actions, and the problems they were facing. This gain in eloquence left no mark
at all on their performance, however. Other investigators report a similar gap
between verbal intelligence and performance intelligence… The ability to talk
about something does not necessarily reflect an ability to deal with it in
reality.”
Does this bring
back memories of listening to a well-spoken leader or administrator who turned
out to be ineffective or incompetent, or worse? I’ve even done the speaking
myself, just a few times, when I’ve had to be on a panel or say something in
public. (I try to avoid these situations when I can.) I’ve also been sent to leadership
training sessions. I think I learned something from them, but maybe it’s just a better vocabulary. I read voraciously to learn, but maybe I’m only
acquiring better ways to sound knowledgeable. At present, I’m slowly working my
way through Thinking in Complexity by
another German author, Klaus Mainzer. It is subtitled “The Computational
Dynamics of Matter, Mind and Mankind.” You can bet that I am able to discuss such
concepts in Astrophysics, Biology, and Consciousness as smoothly as ABC. I know all
about non-linearity, attractors, neural nets, and feedback loops. And I am
trying to solve the riddle of the origin of life, a complex problem about how
complexity arises.
I’m likely not to
succeed in solving the riddle. The problem is full of intransparence with too
many interdependent variables. It’s a toy problem to work on, and it keeps my
mind active. Real-world practical problems, however, are subject to
time-sensitive decision making. Delaying a course of action is itself an
action, for good or ill, whether one be collecting more data or concentrating
on some other urgent priority. If anything, reading The Logic of Failure has made me more circumspect. I wish I had read it earlier. Perhaps it could have helped me make better decisions in the past, but perhaps not. There is a logic to failure, but the situations are so diverse that a lifetime of encounters might not be sufficient to learn how not to fail in the next complex situation.