Right Kind of Wrong By Amy C. Edmondson
Juan Carlos Zambrano
Finance Manager @ Tecnofarma Bolivia | Ontological Coaching
Introduction
Success is stumbling from failure to failure with no loss of enthusiasm. —Winston Churchill
The idea that people and organizations should learn from failure is popular and even seems obvious. But most of us fail to learn the valuable lessons failures can offer. We put off the hard work of reflecting on what we did wrong. Sometimes, we’re reluctant to admit that we failed in the first place. We’re embarrassed by our failures and quick to spot those of others. We deny, gloss over, and quickly move on from—or blame circumstances and other people for—things that go wrong. Every child learns, sooner or later, to dodge blame by pointing the finger elsewhere. Over time, this becomes habitual. Worse, these habits make us avoid stretch goals or challenges where we might fail. As a result, we lose out on countless opportunities to learn and develop new skills. This pernicious combination of human psychology, socialization, and institutional rewards makes mastering the science of failing well far more challenging than it needs to be
This book is about what makes learning from failure so difficult to put into practice in our day-to-day lives and in the institutions we build
I believe that part of successfully navigating failure to reap its rewards—and, importantly, to avoid the wrong kinds of failure as often as possible—starts with understanding that not all failures are created equal. As you will see, some failures can rightly be called bad. Fortunately, most of these are also preventable. Other failures are genuinely good. They bring important discoveries that improve our lives and our world. Lest you get the wrong idea, I’ve had my share of failures that were bad, along with some that were good
This book offers a typology of failure that helps you sort the right kind of wrong from the failures that you should work hard to prevent. You will also learn how to think differently about yourself and failure, recognize contexts in which failures are likely, and understand the role of systems—all crucial competencies for mastering the science of failing well
Learning from Mistakes Is Easier Said Than Done
Surprises, often in the form of bad news for a researcher’s hypothesis, are common in research. No one lasts long as a scientist who can’t stand to fail, as I would soon learn. Discovery stories don’t end with failure; failures are stepping stones on the way to success. There is no shortage of popular quotes on that point—many of them are sprinkled throughout this book—and for good reason. These kinds of informative, but still undesired, failures are the right kind of wrong.
Being Wrong in New Territory
In science, as in life, intelligent failures can’t be predicted. A blind date set up by a mutual friend may end in a tedious evening (a failure) even if the friend had good reasons to believe you’d like each other. Whether an intelligent failure is small (a boring date) or large (a failed clinical trial), we must welcome this type of failure as part of the messy journey into new terrain, whether it leads to a lifesaving vaccine or a life partner.
Intelligent failures provide valuable new knowledge. They bring discovery. They occur when experimentation is necessary simply because answers are not knowable in advance. Perhaps a particular situation hasn’t been encountered before, or perhaps one is truly standing on the front lines of discovery in a field of research. Discovering new drugs, launching a radical new business model, designing an innovative product, or testing customer reactions in a brand-new market are all tasks that require intelligent failures to make progress and succeed. Trial and error is a common term for the kind of experimentation needed in these settings, but it’s a misnomer. Error implies that there was a right way to do it in the first place. Intelligent failures are not errors. This book will elaborate on this and other vital distinctions that we must make if we wish to learn to put failure to good use
Discovering psychological safety
Much later I used the term psychological safety to capture this difference in work environment, and I developed a set of survey items to measure it, thereby spawning a subfield of research in organizational behavior
Psychological safety plays a powerful role in the science of failing well. It allows people to ask for help when they’re in over their heads, which helps eliminate preventable failures. It helps them report—and hence catch and correct—errors to avoid worse outcomes, and it makes it possible to experiment in thoughtful ways to generate new discoveries. Think about the teams that you’ve been a part of at work, or at school, in sports, or in your community. These groups probably varied in psychological safety. Maybe in some you felt completely comfortable speaking up with a new idea, or disagreeing with a team leader, or asking for help when you were out of your depth. In other teams you might have felt it was better to hold back—to wait and see what happened or what other people did and said before sticking your neck out. That difference is now called psychological safety—and I have found in my research that it’s an emergent property of a group, not a personality difference. This means your perception of whether it’s safe to speak up at work is unrelated to whether you’re an extrovert or an introvert. Instead, it’s shaped by how people around you react to things that you and others say and do.
When a group is higher in psychological safety, it’s likely to be more innovative, do higher-quality work, and enjoy better performance, compared to a group that is low in psychological safety. One of the most important reasons for these different outcomes is that people in psychologically safe teams can admit their mistakes. These are teams where candor is expected. It’s not always fun, and certainly it’s not always comfortable, to work in such a team because of the difficult conversations you will sometimes experience. Psychological safety in a team is virtually synonymous with a learning environment in a team. Everyone makes mistakes (we are all fallible), but not everyone is in a group where people feel comfortable speaking up about them. And it’s hard for teams to learn and perform well without psychological safety
What Is the Right Kind of Wrong?
You might think that the right kind of wrong is simply the smallest possible failure. Big failures are bad, and small failures are good. But size is actually not how you will learn to distinguish failures, or how you will assess their value. Good failures are those that bring us valuable new information that simply could not have been gained any other way.
Every kind of failure brings opportunities for learning and improvement. To avoid squandering these opportunities, we need a mix of emotional, cognitive, and interpersonal skills. These will be spelled out in this book in a way that I hope makes it easy to start applying them immediately
But before we go any further, a few definitions are in order. I define failure as an outcome that deviates from desired results, whether that be failing to win a hoped-for gold medal, an oil tanker spilling thousands of tons of crude oil into the ocean instead of arriving safely in a harbor, a start-up that dives downward, or overcooking the fish meant for dinner. In short, failure is a lack of success.
Next, I define errors (synonymous with mistakes) as unintended deviations from prespecified standards, such as procedures, rules, or policies. Putting the cereal in the refrigerator and the milk in the cupboard is an error. A surgeon who operates on a patient’s left knee when the right knee was injured has made an error. The important thing about errors and mistakes is that they are unintended. Errors may have relatively minor consequences—cereal stored in the refrigerator is inconvenient and milk left in the cupboard may spoil—while other mistakes, such as the patient who received the wrong-site surgery, have serious repercussions
Finally, violations occur when an individual intentionally deviates from the rules. If you deliberately pour flammable oil on a rag, light a match to it, and throw it into an open doorway, you are an arsonist and have violated the law. If you forget to properly store an oil-soaked rag and it spontaneously combusts, you have made a mistake.
All of these terms can be so emotionally loaded that we may be tempted to simply turn and flee. But in so doing, we miss out on the intellectually (and emotionally) satisfying journey of learning to dance with failure
Bad Failure, Good Failure
Maybe you are one of the many people who deep down believe that failure is bad. You’ve heard the new rhetoric about embracing failure but find it hard to take it seriously in your day-to-day life. Maybe you also believe that learning from failure is pretty straightforward: reflect on what you did wrong (not trying hard enough in math class, steering the boat too close to the rocks) and just do better next time, whether by studying more or ensuring that you have the latest maps for accurate navigation. This approach sees failure as shameful and largely the fault of the one who fails
This belief is as widely held as it is misguided
First, failure is not always bad. Today, I don’t doubt that my failure to find support for the simple research hypothesis that guided my first study was the best thing that ever happened to my research career. Of course, it didn’t feel that way in the moment. I felt embarrassed and afraid that my colleagues wouldn’t keep me on the research team. My thoughts spiraled out to what I would do next, after dropping out of graduate school. This unhelpful reaction points to why each of us must learn how to take a deep breath, think again, and hypothesize anew. That simple self-management task is part of the science of failing well
Second, learning from failure is not nearly as easy as it sounds. Nonetheless, we can learn how to do it well. If we want to go beyond superficial lessons, we need to jettison a few outdated cultural beliefs and stereotypical notions of success. We need to accept ourselves as fallible human beings and take it from there
PART ONE
THE FAILURE LANDSCAPE
CHAPTER 1 Chasing the Right Kind of Wrong
Only those who dare to fail greatly can ever achieve greatly. —Robert F. Kennedy
Why Is It So Hard to Fail Well?
Failing well is hard for three reasons: aversion, confusion, and fear. Aversion refers to an instinctive emotional response to failure. Confusion arises when we lack access to a simple, practical framework for distinguishing failure types. Fear comes from the social stigma of failure.
Aversion: a spontaneous emotional response to failure
Rationally, we know that failure is an unavoidable part of life, certainly a source of learning, and even a requirement for progress. But, as research in psychology and neuroscience has shown, our emotions don’t always keep up with our clear-eyed, rational understanding. Numerous studies show that we process negative and positive information differently. You might say we’re saddled with a negativity bias. We take in bad information, including small mistakes and failures, more readily than good information. We have more trouble letting go of bad compared to good thoughts. We remember the negative things that happen to us more vividly and for longer than we do the positive ones. We pay more attention to negative than positive feedback. People interpret negative facial expressions more quickly than positive ones. Bad, simply put, is stronger than good. This is not to say we agree with or value it more but rather that we notice it more
Aversion to failure is real. Rationally, we know that everyone makes mistakes; we know we live in a complex world where things will go wrong even when we do our best; we know we should forgive ourselves (and others) when we fall short. But failure and fault are inextricably linked in most households, organizations, and cultures.
Ironically, our aversion to failures makes experiencing them more likely. When we don’t admit or point out small failures, we allow them to turn into larger ones. When you put off telling your boss about a problem that could derail a critical project—and perhaps miss an important deadline for the customer—you convert a potentially solvable small issue into a larger, more consequential failure
It’s human to feel anger and blame, but it’s not a strategy for helping us avoid and learn from failure
One of the most important strategies for avoiding complex failures is emphasizing a preference for speaking up openly and quickly in your family, team, or organization. In other words, make it psychologically safe to be honest about a small thing before it snowballs into a larger failure. Too many of the large organizational failures I’ve studied could have been prevented if people had felt able to speak up earlier with their tentative concerns
It starts with the willingness to look at yourself—not to engage in extensive self-criticism or to enumerate your personal flaws, but to become more aware of universal tendencies that stem from how we’re wired and are compounded by how we’re socialized. This is not about rumination—a repetitive negative thought process that isn’t productive—or self-flagellation. But it may mean taking a look at some of your idiosyncratic habits. Without this, it’s hard to experiment with practices that help us think and act differently
Clinical psychology research shows that failures in our lives can trigger emotional distress, anxiety, and even depression. Yet, some people are more resilient than others. What makes them different? First, they are less prone to perfectionism, less likely to hold themselves to unrealistic standards. If you expect to do everything perfectly or to win every contest, you will be disappointed or even distressed when it doesn’t happen. In contrast, if you expect to try your best, accepting that you might not achieve everything you want, you’re likely to have a more balanced and healthy relationship with failure
Second, resilient people make more positive attributions about events than those who become anxious or depressed. How they explain failures to themselves is balanced and realistic, rather than exaggerated and colored by shame. If you attribute not getting a job offer you wanted to a highly competitive applicant pool or to the company’s idiosyncratic preferences, you’re more likely to recover from the disappointment than if you think, I’m just not good enough.
Note that healthy attributions about failure not only stay balanced and rational, they also take account of the ways—small or large—that you may have contributed to what happened. Maybe you didn’t prepare sufficiently for the interview. This is not to beat yourself up or wallow in shame. Quite the contrary; it’s about developing the self-awareness and confidence to keep learning, making whatever changes you need so as to do better next time
Each of us is a fallible human being, living and working with other fallible human beings. Even if we work to overcome our emotional aversion to failure, failing effectively isn’t automatic. We also need help to reduce the confusion created by the glib talk about failure that is especially rampant in conversations on entrepreneurship
Confusion: not all failure is alike
Although fail fast, fail often has become a Silicon Valley mantra meant to celebrate failure, and corporate failure parties and failure résumés have become popular, much of the discussion in books, articles, and podcasts is simple and superficial—more rhetoric than reality
Fortunately, this confusion can be reduced by understanding the three types of failure, and how differences in context matter
Our confusion about failure gives rise to illogical policies and practices. For example, meeting with senior executives in a large financial services firm in April 2020, I listened as they explained that the current business environment made failure temporarily off-limits. Understandably concerned about an economic climate increasingly challenged by a global pandemic, these business leaders wanted everything to go as well as possible. Generally speaking, they were sincere in their desire to learn from failure. But enthusiasm about failing was acceptable when times were good, they told me; now that the future looked uncertain, pursuing unerring success was more imperative than ever.
These smart, well-intentioned people needed to rethink failure. First, they needed to appreciate the context. The need for fast learning from failure is most critical in times of uncertainty and upheaval, in part because failures are more likely! Second, while encouraging people to minimize basic and complex failures may help them focus, welcoming intelligent failures remains essential to progress in any industry. Third, they needed to recognize that the most likely outcome of their prohibition on failure wasn’t perfection but rather not hearing about the failures that do occur. When people don’t speak up about small failures—say, an accounting error—these can spiral into larger failures, such as massive banking losses
Interpersonal fear: stigma and social rejection
Adding to our emotional aversion and cognitive confusion is a deep-rooted fear of looking bad in the eyes of others. This is more than just a preference. The fear induced by the risk of social rejection can be traced back to our evolutionary heritage when rejection could literally mean the difference between staying alive and dying of starvation or exposure
This survival mechanism in our brains helped us elude saber-toothed tigers in prehistoric times, but today often leads us to overreact to harmless stimuli and to shy away from constructive risk-taking. The fear response, designed to be protective, can be counterproductive in the modern world when it keeps us from taking the small interpersonal risks that are essential to speaking up or trying new things.
First, fear inhibits learning. Research shows that fear consumes physiologic resources, diverting them from parts of the brain that manage working memory and process new information
Second, fear impedes talking about our failures. Today’s never-ending chore of self-presentation has exacerbated this ancient human tendency
The real failure, I’ve found, is believing that others will like us more if we are failure-free. In reality, we appreciate and like people who are genuine and interested in us, not those who present a flawless exterior.
In my research, I’ve amassed a fair amount of evidence that psychological safety is especially helpful in settings where teamwork, problem-solving, or innovation are needed to get the job done
Have you ever worked in a team where you were genuinely not worried that others would think less of you if you asked for help or admitted that you were wrong about something? Maybe you felt confident that people supported and respected one another—and all were trying to do their best
Yet few organizations have enough psychological safety for the benefits of learning from failure to be fully realized
In sum, our aversion to failure, confusion about failure types, and fear of rejection combine to make practicing the science of failing well more difficult than it needs to be. Fear makes it hard to speak up when we need help to avoid a mistake or to engage in honest conversation so we can learn from a failed experiment. Lacking the vocabulary and rationale to distinguish basic, complex, and intelligent failures, we’re more likely to maintain our aversion to all failures
Failure’s Range of Causes
At first glance, commitment to excellence and tolerance of failure seem to be in tension
When someone deliberately sabotages a process or violates a safety practice, blame is appropriate. But after that, you face a judgment call that cannot be made without more information about the context
Succeeding through Failing
It should be clear by now that not everyone fails at failure
Innovation Never Ends
The successful innovators in our study recognized that they needed to lead differently. They had to make sure that everyone in the operating room could talk openly and immediately about what was needed from one another to make the procedure work. When my colleagues and I analyzed the teams that persisted in mastering the new approach, we found that all of them engaged in a few special activities that reflect core practices in the science of failing well
Practicing the Science of Failing Well
Failures may never be fun, but with practice using new tools and insights they can become less painful and easier to learn from. Our instinctive aversion to failure, our confusion about its different forms, and our fear of rejection keep us stuck. The way out starts with reframing failure—as did so many Olympic bronze medalists—and setting realistic expectations about it. From the small setbacks we experience in our day-to-day lives to the tragic deaths that occurred in the early days of open heart surgery, failures are an unavoidable part of progress. This is as true for our personal lives as for the vital institutions that shape society. This is why it’s so important—and ultimately so rewarding—to master the science of failure. Each of the chapters ahead brings foundational ideas and practices to help you do just that
CHAPTER 2 Eureka
I have not failed. I’ve just found ten thousand ways that won’t work. —Attributed to Thomas A. Edison
At the frontier of any scientific field, a thoughtful hypothesis not supported by data is the right kind of wrong.
When Failure Is Intelligent
What makes a failure qualify as intelligent? Here are four key attributes: it takes place in new territory; the context presents a credible opportunity to advance toward a desired goal (whether that be scientific discovery or a new friendship); it is informed by available knowledge (one might say hypothesis driven); and finally the failure is as small as it can be to still provide valuable insights. Size is a judgment call, and context matters. What a large company can afford to risk on a pilot project may be greater than what you can afford to risk on a new endeavor in your personal life. The point is to use time and resources wisely. A bonus attribute is that the failure’s lessons are learned and used to guide next steps
With these criteria in mind, anyone can try something out and feel good about the results even when they fall short of a hoped-for success. The failure is intelligent because it’s the result of a thoughtful experiment—not a haphazard or sloppy one
New territory
Life and work place us in new territory all the time. New may mean new to an entire professional field, or simply new to you as in a new sport, a career move, or a first date. If you are picking up golf, it’s all but guaranteed that the first encounter between your club and the ball will qualify as a failure. More meaningfully, most major life events, such as leaving home or moving to a new location, lead to new ground. This is true for happy life events, such as getting married. And sad ones, such as losing a parent
A crucial feature of new territory, whether you’re a first-time parent or starting your first job, is uncertainty. That is part of the risk you assume when you try something new. It is not possible to predict exactly what will happen
Meaningful opportunity
An intelligent failure occurs as part of what you believe is a meaningful opportunity to advance toward a valued goal
Learning to experience intelligent failure starts at an early age. A child taking her first step is well on her way to doing just that. But in elementary school, many children start to believe that getting the right answers is the only valued activity
Do your homework
Intelligent failures begin with preparation. No scientist wants to waste time or materials on experiments that have been run before and failed. Do your homework
Equally important is the desire to understand, as Jocelyn Bell exemplified, why the unexpected happened, or to anticipate what will happen in a new experiment.
Keep it small
Because failures consume time and resources, you’re smart to use both judiciously. Failures can also threaten reputations. One way to mitigate the reputational cost of failure is to experiment behind closed doors. If you’ve ever tried on a bold new style of clothing to see if it suits you, you probably did it behind the curtain of a store’s changing area. Similarly, most innovation departments and scientific labs are private, with scientists and product designers trying all sorts of crazy things without an audience
Another best practice for keeping failure as small as possible is the design of smart pilots to test new ideas before the full-scale launch of an innovation. Pilots make sense: carry out small tests of something new to avoid big, expensive, visible failures. But too often this good idea goes wrong in practice: a seemingly successful pilot is followed by a major failure in the launch of the innovation to all customers
The solution is to create incentives that motivate pilots not to succeed but rather to fail well. An effective pilot is littered with the right kind of wrong—numerous intelligent failures, each generating valuable information. To design a smart pilot in your organization, you should be able to answer yes to a short set of design questions, starting with whether the pilot is set up to learn as much as possible rather than to look successful.
Let’s review. To be intelligent, a failure must take place in new territory, in pursuit of a valued goal, with adequate preparation and risk mitigation (investing as little as needed to learn)
How to Tell If a Failure Is Intelligent
New territory. Diagnostic Questions: Do people already know how to achieve the result I’m pursuing? Is it possible to find a solution some other way, to avoid failure?
Meaningful opportunity. Diagnostic Questions: Is there a meaningful opportunity worth pursuing? What goal am I hoping to accomplish? Is the risk of failure worth taking?
Informed by available knowledge. Diagnostic Questions: Have I done my homework? Before I experiment, do I have the available relevant knowledge? Have I formulated a thoughtful hypothesis about what might happen?
As small as possible. Diagnostic Questions: Have I mitigated the risks of taking action in new territory by designing an experiment that is as small as possible, while still being informative? Is the planned action the right size?
Lessons learned and used. Diagnostic Questions: Have I mined the lessons from the failure and figured out how to put them to use going forward? Have I shared this knowledge widely to prevent the same failure from happening again?
Learn as much as you can
Taking the time to learn from what went wrong is often the most cringe-inducing aspect of intelligent failure. Not all of us can remain as cheerful as Thomas Edison. You’re not alone if you feel disappointed or embarrassed, and it’s easy to want to push those feelings away. That’s why it’s important to reframe and resist blame and push yourself to be curious. It’s natural to fall prey to self-serving analysis—I was right, but someone in the lab must have altered something—which takes us away from discovery. But a true desire to learn from failure forces us to confront facts more fully and rationally. You’ll also want to avoid superficial analysis—It didn’t work. Let’s try something else—which generates random rather than considered action. Finally, avoid the glib answer I’ll do better next time, which circumvents real learning. What’s necessary is to stop and think carefully about what went wrong so as to inform the next act. (Or decide to abandon the opportunity, which is itself valuable.) Table 2.2 shows some of the ways learning from failure is shortchanged in our lives and how to do better
Practices for Learning from Failure
Don’t Say: I’ll just try harder next time.
Try: Thinking carefully about what went wrong and what factors might have caused it.
Don’t Say: It didn’t work. I’ll just try something else.
Try: Analyzing what the different causes of the failure suggest about what to try next
Don’t Say: I was right, but someone or something else messed it up.
Try: Digging in to understand—and accept—your own contribution (small or large) to the failure.
A Bias for Action and Iteration
An intelligent failure is an episode with a beginning and an end. An intelligent failure strategy, practiced by inventors, scientists, and innovation departments around the world, strings multiple intelligent failures together to progress toward valued goals. When we experiment, we hope our hypotheses are right. But we must act to know for sure
Learning from intelligent failure can be slow, whether in our lives or in the technology that shapes our world. Sometimes it takes decades, with multiple people building on others’ failures
Masters of Intelligent Failure
Who does intelligent failure particularly well? Scientists, as we have seen. Inventors, of course. Also, celebrity chefs and leaders of company innovation teams, to name a few. Despite superficial differences, elite failure practitioners have much in common, and these attributes can be emulated by any of us
It starts with curiosity. Elite failure practitioners seem to be driven by a desire to understand the world around them—not through philosophic contemplation, but by interacting with it. Testing things out. Experimenting. They’re willing to act! This makes them vulnerable to failure along the way—about which they seem unusually tolerant
Taking the Intelligence of Intelligent Failure to Heart
In some cases, intelligent failures are especially worthy of celebration because they point us forward toward eventual success. They shut down one path and force us to seek another. The discovery of what doesn’t work is sometimes as valuable as finding what does work
CHAPTER 3 To Err Is Human
The only man who never makes a mistake is the man who never does anything. —Theodore Roosevelt
Basic failures. Unlike intelligent failures, which occur in unknown territory, basic failures involve errors in well-trodden terrain. Basic failures are not the right kind of wrong. In the continuum of failure types, they are farthest from intelligent failures. Basic failures are unproductive—wasting time, energy, and resources. And they are largely preventable. As shown in Figure 3.1, the greater the uncertainty, the lower the preventability. We can never be entirely rid of human error, but we can do much to minimize basic failure. To do that, we need to prevent the errors we can prevent and to catch and correct the rest. For those, we need to disrupt the link between the error and the failure it might trigger if it isn’t caught in time
Remember that errors—synonymous with mistakes—are by definition unintended. Errors often have relatively minor consequences, such as a tiny dent in a car bumper from backing out of the driveway too fast. Call them the daily Oopses and Oh No’s that we learn to brush off and to remedy. We’ll apologize to the friend we accidentally offended. Clean the gutters this weekend
Basic failures are everyday occurrences and many are not terribly consequential. But now and then a basic failure is catastrophic. Errors, even small ones, occasionally have serious repercussions. But none of them is the right kind of wrong, so why is learning about basic failures worth your time?
First, they offer us a chance to practice feeling okay about the fact that mistakes will happen. Beating ourselves up about them is unhelpful and unhealthy. Mistakes and the failures they trigger are a part of life. Occasionally they even bring eureka moments of discovery
Second, if we want to keep getting better at the activities and deepen the relationships we value most, we must be willing to confront and learn from our mistakes. We must overcome our aversion to them
But the best reason to learn how basic failures work is to prevent as many of them as possible. A few insights and practices drawn from an extensive research literature on errors and error management can help you do just that
Much of what we know about error management comes from decades of research and training in the aviation industry. Aviation has an impressive record of devising procedures and systems to reduce the errors that can trigger devastating basic failures. Indeed, because basic failures bring us trouble rather than discovery, the right way to approach them is to prevent and reduce them.
Checklists are not a guarantee against basic failures. They offer an enabling structure—but one that must be used with intention
The Basics of Basic Failure
Nearly all basic failures can be averted with care and without need of ingenuity or invention. The important thing to remember about errors is that they are unintended—and punishing them as a strategy for preventing failure will backfire. It encourages people not to admit errors, which ironically increases the likelihood of preventable basic failure. This is as true in families as in companies
While basic failures don’t have the thrill of intelligent failures, they still present opportunities for learning. And despite lacking the intricacy of complex failures, they can be equally catastrophic
Not all mistakes cause basic failure. This may seem obvious, but many errors do not lead to failure
We all make mistakes. Forgetting to charge your cell phone does not cause a basic failure if you find somewhere to plug it in while continuing your call
Yet all basic failures are caused by mistakes. Missing a phone call scheduled to provide necessary information on a time-sensitive issue is a basic failure that could be caused by forgetting to charge your phone.
What about deliberate errors? A deliberate error is an oxymoron and is better labeled mischief or sabotage. The prankster who deliberately mislabels your kitchen’s sugar and salt canisters is causing mischief
How to Recognize a Basic Failure
Basic failures present two characteristic features: they occur in known territory, and they tend to have a single cause.
Simply put, a failure is basic when errors happen because we do not use knowledge that was available—whether due to inattention, neglect, or overconfidence
Sometimes, failures that initially appear to have a single cause turn out to be embedded in a complex web of causes
As you will learn in the next chapter, when multiple errors, sometimes enhanced with a pinch of bad luck, line up, they cause complex failures
Human Drivers of Basic Failure
Inattention
Careless mistakes resulting from inattention are one of the most common causes of basic failure
We are all vulnerable to sleep difficulties stemming from a variety of factors, and, yes, more sleep may help reduce the daily mistakes in your life. But it’s also important to step back and consider what causes sleep deprivation
Inattention is all too human. It’s hard to stay vigilant, to pay close attention in those moments that need it most. Relatedly, sometimes we’re aware that something requires our attention, but we put it off. And nothing bad happens, at least not for a while
Neglect
Neglect stems from the human tendency to ignore situations with the potential to reach a breaking point. Neglect tends not to produce instant harm but rather allows the buildup that ultimately results in failure. Because we are forgetful and busy, it’s easy to put things off. In retrospect, it’s easy to see what went wrong. You would have done better on the test if you had studied harder. You should have brought an umbrella when rain was predicted. Fortunately, most of these would have, should have errors in our daily lives do not cause undue harm. Other times, however, neglect can have serious consequences.
Overconfidence
Although some basic failures stem from misplaced steel rods or ignored regulations, simply not reflecting on the implications of a decision is a common underlying cause. People fail to draw on available information or even common sense. What was I thinking? is the vernacular: What was I thinking when I booked two important meetings at the same time? What was I thinking when I forgot to pack a sweater, or socks, when traveling to a cold climate? Often, the answer, as you may have experienced, is I wasn’t thinking. As in, when scheduling a meeting, I didn’t check my calendar. When packing, I didn’t consult the weather reports, I was preoccupied with other thoughts, or both
Faulty assumptions
Assumptions, by definition, take shape in our minds without explicit thought. When we assume something, we’re not directly focusing on it. We fail to challenge assumptions because they seem to us self-evidently true. Assumptions thus leave us with erroneous confidence that our model or our way of thinking is correct, often because it has worked before and has become part of our belief system. We’ve seen this before. We’ve always done it this way
How to Reduce the Basic Failures in Your Life
Research on error management has expanded considerably in recent years. Although generally focused on high-risk organizations, this work offers practices that you can put to work to reduce basic failures in your own life as well. These include making safety a priority, expecting and catching errors, and learning as much as you can from them. But it starts with making friends with error—and with our fallibility
What complicates the quest for friendship is our aversion to errors. We hate to be wrong. We feel embarrassed or ashamed. But we can do better. Not too long after my sailing accident, aware of the research on healthy attribution skills, I recognized that making a mistake was not cause for deep shame
Our aversion to errors prompts people to make sense of them in delightfully creative ways
When presented with the choice between admitting our mistakes and protecting our self-image, the decision is easy. We want to believe we are not at fault, so we find every reason to justify what we did as correct. That makes it hard to learn! A psychological bias known as the fundamental attribution error exacerbates the problem.
You always contribute something to the failure, even if other factors also play a role. But, because you are more able to alter your behavior than to, say, fix the sidewalk, focusing on what you might have done differently is more practical and powerful than bemoaning the shortcomings of your environment
Owning our errors becomes easier when we accept human fallibility as a fact and put that acceptance to use in learning and improving
I find it helpful to think about it this way: Vulnerability is a fact. None of us can predict or control all future events; therefore, we are vulnerable. The only real question? Whether you acknowledge it! Many worry that doing so will make them appear weak, but research shows that being open about what you know and don’t know builds trust and commitment. Admitting doubt in the face of uncertainty demonstrates strength rather than weakness
Another best practice is acknowledging your own contributions—no matter how large or small—to the failures that do occur. This is not only wise, it’s practical, for two reasons. First, it makes it easier for others to do the same, making the analytic work of diagnosing failures easier, and second, other people will then see you as approachable and trustworthy and will be more enthusiastic about working with or befriending you
Although it’s easy to think of basic failure as mundane and thus not likely to yield a return on investments of time or money, in fact the potential upside of error reduction is large
In his inspiring book The Power of Habit, Charles Duhigg recounts that O’Neill opened his remarks to investors and analysts by saying, I want to talk to you about worker safety.
It’s worth noting that Alcoa did not have a safety problem in 1987. The company’s safety record was better than that of most American companies
What O’Neill knew was that worker safety could only be achieved when people at the company (at all levels) committed themselves to what he called a habit of excellence—a habit that would positively affect production quality, uptime, profitability, and, yes, ultimately stock price
To help managers build psychologically safe environments, he encouraged all of them to ask themselves, daily, whether every member of their teams could respond yes to three questions: Are they treated with dignity and respect? Do they have the training, tools, and support they need to make a contribution? Are they recognized for their work?
Finally, by showing that he cared more about worker safety than profits, O’Neill removed a major barrier to speaking up. When a safety incident happened, small or large, he made it an instant priority
This enormous accomplishment required a first step of befriending human error, then putting systems into place so people could routinely catch and correct it before anyone was harmed on the job
The genius of the Andon Cord lies both in how it functions as a quality-control device to prevent defects and in its embodiment of two essential facets of error management: (1) catching small mistakes before they compound into substantial failures, and (2) blameless reporting, which plays a vital role in ensuring safety in high-risk environments
Mastery in any field requires a willingness to actually learn something from the many mistakes you will necessarily make
Fostering a healthy attitude about human fallibility is the first and possibly most important step for helping us catch and correct mistakes. But to complement and support these behavioral practices, implementing failure prevention systems can dramatically increase your chances of success
Prevention Systems
None of these failure prevention systems is revolutionary. All reflect common sense. Yet few companies or families take the time to get them in place. My favorite of these is blameless reporting—an explicit system to enable early detection of potential harm
Recognizing that bad news doesn’t age well, many thoughtful organizations and families have explicitly (or sometimes implicitly) implemented blameless reporting. Does such a policy imply a tolerance for bad behavior or low standards? Anything goes? Not even close. The policy requests that people speak up quickly about errors and problems, so as to prevent them from turning into larger problems, or serious failures
Blameless reporting also applies in families. Parents of teenagers, for example, can make sure their children know to call at any time of day or night if they need a ride home. No questions asked. The risks created by the combination of alcohol, driving, and adolescence are best managed, such parents believe, by ensuring that the lines of communication are open. They want their children to understand that no questions asked is truly a viable, nonpunitive option. Learning and safety are thereby prioritized above evaluation in dangerous situations.
Psychological safety both enables and is enabled by blameless reporting. The policy sends the message We understand that things will go wrong, and we want to hear from you quickly so we can solve problems and prevent harm.
In short, blameless reporting is part of a coordinated learning system
What do dentistry and cars have in common? Here’s a hint: just as brushing your teeth after meals prevents painful and expensive decay, so does changing the oil in your car at regular intervals prevent engine damage. In both domains, preventive maintenance is essential. This practice is as dull as it is valuable. So what is it about human beings that makes it so easy for us to neglect preventive maintenance?
Part of the answer is found in what psychologists call temporal discounting, the tendency to discount or devalue the significance of delayed responses to actions. Studies show that people give less weight to outcomes that will occur in the future compared to events in the present
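To make the idea concrete, here is a rough back-of-the-envelope sketch (mine, not the book’s) using the hyperbolic discounting formula common in behavioral research: the felt value V of a reward A delayed by D days is V = A / (1 + kD), where k measures how steeply a person discounts the future. With k = 0.1 per day, a benefit worth 100 that arrives in 30 days feels worth only 100 / (1 + 0.1 × 30) = 25. Something that feels like a quarter of its face value is easy to put off, which is one way to see why preventive maintenance keeps losing out to whatever is due today.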
No discussion of codification is complete without reference to the book The Checklist Manifesto, by my Harvard colleague Atul Gawande. Since its publication in 2009 it’s helped to popularize and establish the habit of drawing up a series of process steps to ensure consistency and attention to detail and to reduce careless error
Checklists, however, are not foolproof. Medical errors continue to be an enormous challenge for hospitals and the health-care professions
Additionally, checklists need to be updated when knowledge evolves or rules change
Prior to 1967, parents who left pill bottles accessible to young children risked a trip to the emergency room. So many children were becoming accidentally poisoned that Dr. Henri Breault, chief of pediatrics and director of the Poison Control Center at a hospital in Windsor, Ontario, came home at 3:00 a.m. one day and, according to his wife, said, You know, I’ve had it! I am tired of pumping children’s stomachs when they’re taking pills that they shouldn’t be having! I’ve got to do something about it. That was the impetus for the invention of a cap too complicated for children to open. First called a Palm N Turn when introduced in the Windsor area, the invention reduced poisoning accidents by 91 percent
That’s one example of failure-proofing—taking measures to reduce a known risk factor
CHAPTER 4 The Perfect Storm
Unfortunately, most warning systems do not warn us that they can no longer warn us. —Charles Perrow
Many little things.
This chapter digs into the nature of complex failures and why they’re on the rise in nearly every facet of contemporary life
Not all failures qualify as the right kind of wrong! Some are downright catastrophic. Some are tragic. Others are merely a source of chagrin. The science of failing well starts with a clear-eyed diagnosis of failure type—so as to better understand, learn from, and most important, prevent as many destructive failures as possible. Like basic failures, complex failures are not the right kind of wrong
Still, to say that complex failures are not the right kind of wrong is not to call them blameworthy. Some are, but most, as you will see, are not. Like intelligent and basic failures, complex failures can be powerful teachers if we are willing to do the hard work of learning from them
The complexity of complex failure
Although basic failures are occasionally devastating, complex failures are the real monsters that loom large in our lives, organizations, and societies. While basic failures present reasonably solvable problems with single causes, complex failures are archetypically different. They’re prevalent in settings such as hospital emergency rooms and global supply chains because multiple factors and people interact in somewhat unpredictable ways. Increasingly volatile weather systems are another breeding ground for complex failure. My years of studying complex failures in health care, aerospace, and business have produced a set of remarkably disparate examples that nonetheless share common attributes. Above all, they have more than one cause. Complex failures happen in familiar settings, which is what distinguishes them from intelligent failures. Despite being familiar, these settings present a degree of complexity where multiple factors can interact in unexpected ways. Usually, complex failures are preceded by subtle warning signs. Finally, they often include at least one external, seemingly uncontrollable, factor
It’s this familiarity that makes complex failures so pernicious. In familiar situations you feel more in control than you actually are—say, driving home (familiar) despite consuming alcohol at a party—making it easy to be lulled into a false sense of confidence
Complex failures are not always catastrophic. We’ve all had experiences in which a mini perfect storm derails our plans. You set your alarm for p.m. instead of a.m., making you late getting out the door; your gas tank hovers on empty, so you stop to fill it.
Complex failures have more than one cause, none of which created the failure on its own. Usually a mix of internal factors, such as procedures and skills, collides with external factors, such as weather or a supplier’s delivery delay. Sometimes, the multiple factors interact to exacerbate one another; sometimes they simply compound, as with the straw that broke the camel’s back
In complex failures, an external or uncontrollable factor often enters the mix. You can think of this as bad luck
At times the line between a basic failure and a complex failure blurs. What seems at first a basic failure, such as mistakenly loading a real bullet instead of a blank, turns out, upon further examination, to be complex when that initial obvious single cause reveals causes of its own
Finally, complex failures are generally preceded by small warning signs that get missed, ignored, or downplayed
A thorough diagnosis of a complex failure usually identifies missed signals, along with giving us a deeper understanding of who and what were accountable
Making Matters Worse
Large-scale crises that make headlines, such as that of the Torrey Canyon, often combine multiple smaller failures and a mix of failure types. No one knew how to control the breaking ship and the leaking oil; nor did anyone know how to cope with the tons of crude oil as it spilled across hundreds of miles of pristine coastline. One thing went wrong, followed by another and another. Many little things added up to one big disaster. Whereas basic failures present reasonably solvable problems, complex failures, as this tragedy exemplifies, are archetypically different. Unfortunately, these perfect storms are not a thing of the past. And sometimes they unfold over decades.
Unfolding over Decades
Sometimes, one must go back decades to understand the origins of a complex failure
As is often the case in large organizations, a suppressive cultural attitude toward criticism compounded the problem.
What each of us must take to heart is that uncertainty and interdependence in almost every aspect of our lives today means that complex failure is on the rise. Academic research can help us understand why. It can, as you will see, also help us do better. Once you understand the factors and what they mean for your organization and your life, it may at first feel daunting—but it is in fact empowering. Seeing the world around you for its complex-failure propensity sets you up well to navigate the uncertain future ahead.
Complex Failure on the Rise
The most obvious cause of modern complex failures is the increasingly complex information technology (IT) that underlies every aspect of life and work today. Factories, supply chains, and operations in many other industries rely on sophisticated computer controls where a small glitch in one part of a system can spiral out of control
Social media has altered business, politics, and friendships, making going viral a household term. The global financial industry links every bank, and countless households in every country, making us vulnerable to human error taking place on the other side of the world
How Systems Spawn Complex Failures
The thinking that eventually developed into my framework for categorizing failure began to take shape thirty years ago
Figure 4.1 shows Perrow’s classic model, with new labels I created for each quadrant. The upper right captures Perrow’s core idea that interactive complexity and tight coupling, such as are found in nuclear power plants, create a danger zone. Perrow used railroads to illustrate the combination of tight coupling and linear interactions in what I call the control zone. A typical manufacturing plant presents loose coupling and linear interactions. Because classic management works extremely well in such contexts, I refer to this as the managed zone. Lastly, complex interactions combine with loose coupling in a university, with its ongoing negotiations to keep things organized and functioning, giving rise to what I call the negotiated zone.
FIGURE 4.1: Perrow’s Model Revisited
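Laid out as a grid (reconstructed here from the description above), the four zones look like this:

                   Linear interactions           Complex interactions
Tight coupling     Control zone (railroads)      Danger zone (nuclear power plants)
Loose coupling     Managed zone (manufacturing)  Negotiated zone (universities)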
In my study of medical accidents, I wondered if systems of patient care in hospitals were interactively complex and tightly coupled. If the answer to both was yes, then according to Perrow’s framework, patient care failures in hospitals were simply inevitable and irreducible
Hospital-based patient care offers considerable interactive complexity. For example, a physician writes a prescription that gets filled by a pharmacist, delivered to the floor by someone else, and administered by several nurses during the hospital stay. I determined, however, that the links in this chain are loosely coupled. A failure in one part of the system can be caught and corrected at any time. This is the silver lining for complex failure here: the handoffs are human. I concluded that hospitals did not fall into Perrow’s worst-case quadrant. That meant that it should be possible to achieve zero harm.
Nonetheless, system failures continued to happen, so I decided to look more closely into this. I learned that loose coupling doesn’t preclude systems from breaking down. It just means it’s possible to catch and correct errors before a complex failure occurs
Health-care error experts today use what’s called the Swiss cheese model to explain this kind of system failure.
The Swiss cheese model, developed by psychologist James Reason, calls attention to the defenses that normally prevent consequential failures in complex systems such as hospitals. Holes in the Swiss cheese are likened to small process defects or errors. A hole in your block of Swiss cheese can be seen as a flaw—an empty space that does not contribute to your nutrition. Fortunately, the holes in the cheese are discrete and contained, says Reason, leaving the cheese intact. But occasionally, the holes line up, creating a tunnel—a line of defects that compound and end in a consequential accident.
Complex failures range from the small to the catastrophic. Their complexity, along with their increased prevalence, may make us pessimistic about preventing them
How to Reduce Complex Failures
A puzzle in Perrow’s account of organizations that function with interactive complexity and tight coupling was that so many of them did in fact function without mishap for years, even decades. Nuclear power plants operated without incident nearly all the time. So did air traffic control systems, nuclear aircraft carriers, and a host of other inherently risky operations. A small group of researchers led by Karlene Roberts at the University of California, Berkeley, set out to study how they did it. What they discovered was more behavioral than technical.
The term high reliability organization, or HRO, captures the essence of the theory. HROs are reliably safe because of how they make everyone in them feel accountable to one another for practices that consistently catch and correct deviations to prevent major harm. Vigilance is one word for it. But it’s more than that
To me the most interesting part of HRO research is the observation that rather than downplaying failure, people in HROs are obsessed with failure. My colleagues Karl Weick, Kathie Sutcliffe, and David Obstfeld wrote a seminal paper highlighting the culture of HROs as preoccupied with failure, reluctant to simplify, acutely sensitive to ongoing operations (quick to detect subtle unexpected changes), committed to resilience (catching and correcting error, rather than expecting error-free operations), and valuing expertise over rank. In other words, HROs are weird places. Rather than holding back to see what the boss is thinking, people there don’t hesitate to speak up immediately. A frontline associate, to avert a crisis, can tell the CEO what to do. Failure is clearly seen as an ever-present risk that can nonetheless be consistently averted
From the research on complex systems, human error, and HROs, I take away that complex failure is a worthy foe. We should not underestimate the challenge that lies ahead—but nor should we shy away from it. Whether you’re more intrigued by the Swiss cheese model or the cultural attributes of HROs, the consistent message that runs through these expert perspectives is that we can reduce the occurrence of complex failures in our lives by following a set of simple (not easy!) practices, starting with learning as much as you can from the complex failures that have already occurred
Catastrophic complex failures often become the wake-up calls that spark investigation and change in training, technology, or regulations
Although post-catastrophe investigations are important, the rise in frequency and severity of complex failures means we can’t afford to act only in their aftermath. Reducing complex failures starts with paying attention to what I call ambiguous threats. Whereas clear threats (a Category 5 hurricane will hit your neighborhood tomorrow) readily trigger corrective action (evacuate your house), we tend to downplay ambiguous threats—missing chances to prevent harm. Downplaying ambiguous threats is the opposite of what occurs in high reliability organizations
Sometimes, as you have no doubt experienced, failures come out of the blue—meaning no one saw them coming or even worried a little about the possibility
An ambiguous threat is just that: ambiguous. It could be a real threat of failure, or it could be nothing. Your car may run without mishap, your teen may act responsibly, and the stock market dip may be nothing
Human cognition and organizational systems both conspire to suppress subtle signals of danger, making complex failures more likely
Given the inherent challenge of responding to ambiguous threats, what can we do to prevent the complex failures in our lives? To answer that question, let’s take a look at some unusual organizations that do it well
To counteract our tendency to downplay ambiguous threats, think about the window of opportunity in which recovery is still feasible before a complex failure occurs. The window opens when someone detects a signal—however weak—that a failure may lie ahead. It closes when the failure occurs. Recovery windows can last anywhere from minutes to months
Recovery windows can be seen as valuable opportunities for fast learning. This is true even when a threat turns out to be benign. For example, parents’ frank discussion with their teens about the dangers of drinking and driving and letting them know that it’s okay to call anytime for a ride home with no questions asked is a smart response to an ambiguous threat and may prevent a tragic accident. But these windows depend on a willingness to speak up without certainty that a failure may lie ahead. This is one way that a psychologically safe environment helps prevent the wrong kind of wrong
How can we sense complex failures before they happen? The very nature of complex failures—their multiple factors interacting in unique, unprecedented ways—makes that idea seem a fool’s errand. Yet there are simple, elegant ways to try
It starts with changing your attitude about false alarms
Recall that any worker in a Toyota factory can pull an Andon Cord to alert a team leader of a possible error before it turns into a production failure. The team leader and team member examine the potential problem, however small, and together either fix or dismiss the threat. If only one of twelve pulls of the Andon Cord stops the assembly line for a genuine problem, you might think the company would be upset by wasting supervisors’ time chasing the eleven false alarms.
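This is where changing your attitude about false alarms pays off, and a little arithmetic shows why. The sketch below, in Python, compares the expected cost of the two policies; the one-in-twelve rate comes from the passage above, but the dollar figures are invented for illustration and are not Toyota's numbers.

```python
# Back-of-the-envelope comparison: is it worth investigating every Andon
# Cord pull when only one in twelve flags a genuine problem?
# The dollar figures below are illustrative assumptions, not Toyota data.

COST_PER_CHECK = 200             # assumed cost of a team leader examining one pull
COST_OF_ESCAPED_DEFECT = 30_000  # assumed cost of a genuine problem reaching customers
GENUINE_RATE = 1 / 12            # one in twelve pulls signals a real problem

def expected_cost_per_pull(investigate_every_pull: bool) -> float:
    """Expected cost of a single cord pull under each policy."""
    if investigate_every_pull:
        # Every pull is checked, so genuine problems are always caught early.
        return COST_PER_CHECK
    # Pulls are ignored; a genuine problem escapes with probability 1/12.
    return GENUINE_RATE * COST_OF_ESCAPED_DEFECT

print(f"Investigate every pull: ${expected_cost_per_pull(True):,.0f} per pull")
print(f"Ignore the cord:        ${expected_cost_per_pull(False):,.0f} per pull")
# Under these assumptions, chasing eleven false alarms is far cheaper than
# letting the one genuine problem slip through.
```

The exact numbers matter less than the structure: whenever one escaped failure costs more than roughly a dozen investigations, welcoming false alarms is the rational policy.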
Catching and correcting mistakes to make truly safe workplaces, whether in a factory, hospital, or aircraft, requires a culture of vigilance. Adopting Andon Cords to prevent small mistakes from becoming big ones helps build such a culture, as do consequences for sleeping on the job, because we know for sure that things do go wrong.
Embracing the Possibility of Failure to Reduce the Occurrence of Failure
My decades-long fascination with error, harm, and failure has left me humble about the complexity of these topics. The mix of factors—technology, psychology, management, systems—means none of us can master every aspect of the relevant knowledge to feel we’ve got this. But a few simple practices have emerged from my work that can help prevent complex failures. With these, we all have the power to make that kind of difference—in our own lives and in the organizations we care about.
It starts with framing. Explicitly emphasizing the complexity or novelty of a situation helps put you in the right state of mind. Otherwise, we tend to expect things to go right
Next, make sure to amplify, rather than suppress, weak signals
Finally, make a habit of practicing. Musicians, athletes, public speakers, and actors all rehearse before a performance to be as prepared as possible. In organizations with spectacular safety records—such as Alcoa under Paul O'Neill—don't be surprised if you see people routinely doing dry runs, drills, or practice sessions. They don't have great records because they somehow figured out how to eliminate human error. No. They have great records because they catch and correct error. That takes practice. It also helps to build a culture that celebrates catching and correcting errors.
PART TWO
PRACTICING THE SCIENCE OF FAILING WELL
CHAPTER 5 We Have Met the Enemy
Between stimulus and response there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom. —Attributed to Viktor E. Frankl
By now, readers of this book will appreciate that being wrong is part of being alive in a complex and uncertain world. There is no shame in being wrong about the future. No matter how well we do our homework and no matter how much thought goes into our predictions, some of them will turn out to be wrong. Just ask Thomas Edison. Or Jennifer Heemstra. Nothing ventured, nothing gained, especially when it comes to intelligent failure
For some, learning from failure begins only with a failure large enough to be undeniable. We need to be hit over the head with our wrongness to make us stop in our tracks and start to wonder where we went wrong.
But we don't need a public fiasco to change how we think. We can learn to navigate the inconvenience and embarrassment of ordinary, not-so-large failures in our day-to-day lives by adopting a new way of thinking—one that favors learning over knowing.
Who Me? Couldn’t Be!
Overcoming the instinct to find someone—something—to blame for even the smallest blip is a good start
This chapter digs into how our spontaneous thinking makes it hard to confront even the most intelligent failures constructively and describes practices that can help
They’ve been developed or practiced by psychologists, artists, athletes, scientists, and physicians. And they have one thing in common: no one can do them for you
How We’re Wired
To begin with, our brains are wired in ways that make it easy to miss our failures, often leaving us blissfully unaware we've come up short. I am not talking about willful denial—but rather about how we literally miss crucial signals that point to the need for corrective action. Even if you are familiar with the concept of confirmation bias, chances are that you rarely stop to consider the role it plays in your day-to-day life. Have you ever found yourself driving along, convinced you're heading toward your intended destination, then suddenly realizing you're lost? Did you perhaps dismiss puzzling signals along the route (How weird, they moved that sign) that could have enlightened you? I know I have. For me, it's somewhere between embarrassing and laugh-aloud funny when the truth of my error suddenly becomes undeniable (maybe the location of the sunset tips you off).
Even experts in data interpretation can be fooled by their beliefs. All of us readily notice signals that reinforce our existing beliefs and unconsciously screen out signals that challenge them. This is true both for specific situations (the direction I’m driving right now) and for general opinions about the world (climate change is a hoax). To see how this works, you don’t have to look much further than how you gravitate to news feeds that supply updates reaffirming your existing interpretation of certain events
The sunk-cost fallacy—the tendency to persist in a losing course of action after investing time or money in it when stopping would be more beneficial—is a type of confirmation bias. We can’t quite believe we were wrong in our initial assessment, don’t want to reconsider, and thus dig in, becoming even more wrong—throwing good money after bad, as the saying goes. An unwillingness to believe our initial assessment is wrong is one of the ways otherwise intelligent failures in new territory—such as in a company innovation project—become less intelligent: teams continue to push forward despite a growing unspoken awareness that the project is doomed.
Confirmation biases are fueled by our natural motivation to maintain self-esteem, which helps us tune out signals that we might be wrong. Those who score high in narcissism experience a greater confirmation bias
Neuroscience research identifies two basic pathways in the brain—the low road and the high road. Daniel Kahneman, the psychologist who showed that our aversion to loss outweighs our attraction to gain, popularized this distinction in his 2011 book Thinking, Fast and Slow. Slow (high road) processing is thoughtful, rational, and accurate, while fast (low road) processing is instinctive and automatic. Why are these distinctions important? It's easy and natural for us to process a failure through fast, instinctive, automatic low road pathways in our brain. The problem is that low road cognition triggers an immediate response to failure in the brain's amygdala (that fear module for self-protection that in today's world sometimes holds us back from risk-taking).
As we have already seen, how we interpret events affects our emotional responses to them. Fortunately, we can learn how to reinterpret events in our lives to avoid persevering in unproductive negative feelings. To do that, you must override the amygdala, with its superfast pathway from perceived threat to fear, to challenge its automaticity with information and reasoning.
The amygdala, which protected us from many real threats in prehistoric times, operates according to a better safe than sorry logic
But today, that same fear module makes us unwilling to take career- and life-enhancing interpersonal risks that no longer threaten our survival
Meanwhile, distracted by irrational prepared fears, we miss signals of longer-term peril that require slower thinking but constitute true threats to survival, such as the impact of climate change on food supplies and sea levels
Most important, learning happens when we stop to ask ourselves, How might I have contributed to the failure?
What fascinates me about the distinction between automatic and considered thinking is that the solutions experts have devised to override habitual human cognition are, at their core, similar. Coming from fields as varied as psychiatry, neuroscience, and organizational behavior, these strategies consistently identify the possibility of pausing to choose how we respond
Failing to Learn from Failure
Eskreis-Winkler and Fishbach also showed that, unsurprisingly, people are less likely to share information about their failures than about their successes. The first reason is obvious: people don't want to look bad in front of others. But the second reason is more subtle. When they asked fifty-seven public school teachers whether they would prefer to share stories of a past failure or a past success, 68 percent of the participants opted to share success. Even though the stories would be shared anonymously, removing the risk of looking bad in front of others, the teachers still opted for success stories. Why? They believed that failures told them what not to do but not necessarily what to do to succeed the next time.
Eskreis-Winkler and Fishbach concluded that unawareness of failures’ useful information made learning from failure difficult. So they designed an experiment in which participants were helped to identify the useful information in their failures, and this made them more likely to share them.
What should we take away from this academic research on learning from failure? Learning from failure is difficult for a host of reasons. Sometimes we miss the failures altogether; at other times they threaten our self-esteem, seem to contain no valuable information, or simply go unspoken. These largely cognitive barriers are exacerbated by the unpleasant emotions failure evokes, especially in relation to how we measure up to others.
The Quiet Power of Shame
In a world obsessed with success, it’s easy to understand how failure can be threatening. Many live not so much lives of quiet desperation but of quiet shame. No one has done more to explain and lessen the emotional pain this causes than Brené Brown
Driving out shame
A professor at the University of Houston, Brown has popularized her research about shame, vulnerability, and empathy in a series of books, podcasts, and TED Talks. We’ve all experienced what Brown calls the warm wash of shame when we’ve failed in the eyes of ourselves or others. She defines shame as an intensely painful feeling or experience of believing we are flawed and therefore unworthy of acceptance and belonging. Some researchers see shame as the preeminent cause of emotional distress in our time. No one wants to remain long in that intensely painful warm wash
When we see failures as shameful, we try to hide them. We don’t study them closely to learn from them. Brown distinguishes between shame and guilt. Shame is a belief that I am bad. Guilt, in contrast, is a realization that what I did is bad. I am bad because I didn’t do my homework engenders feelings of shame. But if I see my actions as bad (guilt), it fosters accountability. It is thus better to feel guilty than ashamed; as Brown tells us, Shame is highly, highly correlated with addiction, depression, violence, aggression, bullying, suicide, eating disorders… [while] guilt [is] inversely correlated with those things.
What happens if we rethink failure in this way? We can help ourselves learn from failure if we simply reframe a situation from I was not promoted because I am a failure to I failed to get the promotion. Our relationship to failure improves when we unlearn the belief that I am a terrible nurse because I made that mistake to understand instead that I made a mistake and to ask, What can I take away from it that will help me avoid making the same one in the future?
Social media, as a relatively new communication phenomenon, capitalizes on our age-old reluctance to share our failures. Social media’s relentless visuals make it easy to focus on how we appear to others and to feel ashamed if we don’t somehow match up to the group’s ideas of perfection
Multiple studies have concluded that social media usage is harmful for teenagers’—especially teen girls’—sense of self, aggravating body image issues and contributing to feelings of low self-esteem
Social comparison is natural. One of the most ubiquitous and enduring features of human society, social comparison has helped people behave in ways that contributed to cooperation and health for countless generations. But this natural human tendency is transmogrified by the ease with which social media expands the comparison set, while systematically biasing the content toward unrealistic standards
It stands to reason that social media is shaping our behavior in ways that make sharing problems, mistakes, and failures harder than ever. Both research and firsthand accounts focus on the harmful effects
Given what we know about the pressure to only share the ups so as to look perfect in the public eye, the willingness of a handful of superstar athletes to come forward and admit their vulnerability is all the more admirable
As Brené Brown says about parents, When you hold those perfect little babies in your hand, our job is not to say, ‘Look at her, she’s perfect. My job is just to keep her perfect—make sure she makes the tennis team by fifth grade and Yale by seventh.’ That’s not our job. Our job is to look and say, ‘You know what? You’re imperfect, and you’re wired for struggle, but you are worthy of love and belonging.’
Choosing Learning over Knowing
Whether from the research or your own life experiences, it is probably clear by now that the deck is stacked against us having a lighthearted, learning-oriented relationship with failure—a relationship this book seeks to nurture. The fears and defensive habits that buffer us from some of failure’s unpleasantness and bolster our self-esteem also place limits on our ability to grow and thrive. The good news is that we can learn to think differently—so as to find more rewarding and joyful ways of navigating life in an uncertain and constantly changing world. Wharton professor Adam Grant devoted his compelling book Think Again to the idea that, with conscious effort, we can indeed learn to challenge our automatic thinking. A few research-backed suggestions for stretching your boundaries and for feeling better about your inevitable failures follow
The overarching skill that ties the self-disciplines of failing well together is framing—or more precisely, reframing. Framing is a natural and essential cognitive function; it's how we make sense of the continuous, overwhelming, confusing information coming our way. Think of a frame as a cluster of assumptions that subtly direct attention to particular features of a situation—just as a physical frame around a painting draws attention to certain colors and shapes in the artist's work. We experience reality filtered through our cognitive frames, a fact that is neither bad nor good. But it gets us in trouble when we fail to challenge frames that don't serve us well. When confronting failure, most of us automatically frame it as bad, triggering self-protective reflexes and shutting down curiosity.
Fortunately, reframing is possible. This means learning to pause long enough to challenge automatic associations. Realizing you will be late for an important meeting, you can challenge the spontaneous panic response—taking a deep breath and reminding yourself that it will be possible to make amends, and your survival is not at stake
In a far more dramatic example, Nazi concentration camp survivor Viktor Frankl elucidated the power of reframing for readers of his timeless book, Man’s Search for Meaning. Enduring concentration camps, including Auschwitz, in part by imagining himself in the future sharing stories with those on the outside of the courage he saw in others, Frankl deliberately reframed the meaning of the horrors he was experiencing. Trained as a psychiatrist and psychotherapist, he recalls this as a moment of transformation—a shift from minute-to-minute suffering and fear to hope grounded in a plausible vision of the future. Frankl’s remarkable story of resilience shows how seeing the same situation in a new way can be life enhancing.
Reframing
Modern psychologists have identified a handful of opposing cognitive frames in which one frame is healthier and more constructive but the other is more common. Essentially, the more constructive frames embrace learning and accept setbacks as necessary and meaningful life experiences. The more common and natural frames, in contrast, interpret mistakes and failures as painful evidence that we're not good enough.
One of the most popular and powerful of these frameworks, identified by Carol Dweck at Stanford University, contrasts a fixed mindset with a growth mindset
A business leader who has taken Carol’s work to heart is Microsoft CEO Satya Nadella, who worked hard to change the culture at his company to embody a growth mindset. Speaking in a prerecorded video for a course I taught in January 2022, Nadella recalled, I was lucky that I picked a metaphor that spoke to what people wanted. The growth mindset helps them be better at work and at home—a better manager, a better partner. They are able to push themselves to learn and make the organization around them better. That is a powerful thing.
The power of the Stop—Challenge—Choose framework lies in its simplicity. As an aid to reframing, it’s also consistent with insights I gained from studying with Chris Argyris, who conducted research with teams of senior managers in companies. To put its wisdom simply, one could say the fundamental human challenge is this:
It’s hard to learn if you already know
Unfortunately, we are hardwired to feel as if we know—as if we see reality itself rather than a version of reality filtered through our biases, backgrounds, or expertise. But we can unlearn the habit of knowing and reinvigorate our curiosity
Cognitive habits for responding to failures
Stop
What It Means: Pause to disrupt automatic emotional responses to situational stimuli to make it possible to redirect the spontaneous emotional and behavioral responses.
How to Do It: Take a deep breath to prepare to examine your thinking and consider its impact on your ability to respond in a way that (1) protects your longer-term health and (2) gives you more options.
Useful Questions: What is going on right now? What is the big picture? How was I feeling before this happened?

Challenge
What It Means: Consider the content of your spontaneous thoughts to assess their quality and usefulness for achieving your goals.
How to Do It: Verbalize (to yourself) what's going on in your mind in response to this situation, and ask yourself which thoughts (1) reflect objective reality, (2) support your health and effectiveness, and (3) will be likely to elicit a productive response. Identify alternative interpretations of the situation that are based in objective reality and more likely to help you elicit a productive response—that is, deliberately reframe the situation in a way that helps you move forward and feel better.
Useful Questions: What am I telling myself (or believing) that is causing how I am feeling? What objective data support or negate my interpretation? What other interpretation of the situation is possible? Based on all of the information I have, was my interpretation in my best long-term interests?

Choose
What It Means: Say or do something that moves you closer to achieving your goals.
How to Do It: Respond in the way your reframed thinking suggests, so that you say and do things that help you move forward.
Useful Questions: What do I truly want? What is going to best help me achieve my goals?
Choosing learning
Once we’re humble enough to admit we don’t know, we’re ready to approach situations in a new way
Chris identified our cognitive programming (how we think) as a vital lever we can learn to pull to become more learning oriented and effective, and, I’d add, more joyful. The joy comes from realizing that we can break the link between what happens to us and how we respond. To reframe. As Frankl allegedly put it, In our response lies our growth and our freedom.
Chris Argyris called this the uncovering of the non-learning theories-in-use, which protect our egos but get in the way of our being truly effective (especially in difficult conversations with others). Choose learning over knowing
The message is the same. Pause to challenge the automatic thoughts that cause you pain and embarrassment. Next, reframe those thoughts to allow you to choose learning over knowing. To look outward and find energy and joy from seeing what you missed. At the core of the reframing task lie the words we use to express our thoughts, privately and aloud. Am I failing, or am I discovering something new? Do I believe I should have done better—and I’m bad for not having done so—or do I accept what happened and learn as much as I can from it? Am I okay with the discomfort that comes with new experiences? Will I give myself permission to be human? Permission to learn?
Permission to Learn
As the comic strip character Pogo purportedly said, we have met the enemy and he is us. Our distorted, unrealistic expectations for avoiding all failures are indeed the culprit. Mastering the science of failing well must therefore start with looking at ourselves. Self-awareness is the first, and most vital, of the three competencies we need to develop. The other two, situation awareness (covered in the next chapter) and system awareness (which follows immediately after), can only be developed when we give ourselves permission to keep learning.
CHAPTER 6 Contexts and Consequences
We cannot direct the wind, but we can adjust the sails! —Dolly Parton
Practicing the science of failing well requires awareness of two dimensions of context: (1) how much is known and (2) what’s at stake. The first dimension concerns the degree of novelty and uncertainty. The second is about risk—physical, financial, or reputational. Roughly speaking, are the stakes high or low? Stepping on a square that beeps in a classroom exercise would be a good illustration of low stakes. Sending a space shuttle into orbit? High stakes. This is often a subjective assessment; for instance, what might be high stakes financially for me might be low for you. Reflecting on both the uncertainty and the stakes in a situation, subjective or not, is a crucial competency for elite failure practitioners
The Varying Contexts in Our Lives
Will you fail today?
It depends, to a large degree, on the situations in which you will find yourself. The odds of failing differ dramatically based on the level of uncertainty. How much a failure matters differs, too. Is human safety at risk? Could a failure bring serious financial or reputational harm?
This chapter looks at how contextual unawareness leads to avoidable failures in some settings and unwarranted anxiety in others. Contextual awareness, in contrast, allows you to practice vigilance when necessary and to relax when the stakes are low. It’s a Stop—Challenge—Choose opportunity to assess the situation, challenge your automatic beliefs about it, and choose the right mindset
Context is partly shaped by the level of uncertainty
Consistent contexts bring the certainty that novel contexts lack. When procedural knowledge is well developed—as in following the cookie recipe—uncertainty is low and the odds of failure are low. In contrast, in novel contexts, knowledge of how to get the result you want lies somewhere between nonexistent and incomplete, as when you set out to write a book, design a new product, or find a nonbeeping path across an electric maze. Failures are all but guaranteed when uncertainty is high. But those failures don’t have to be painful. They provide valuable information, and contextual awareness makes this easier to appreciate
The consistent contexts in your life trigger no anxiety about whether you’ll be able to achieve a desired result. In these situations you can say, with confidence, I’ve got this. I don’t want to suggest that emptying the dishwasher brings joyful exuberance, but rather that it’s reassuringly familiar, not to mention satisfying when everything is back where it belongs. The problem is that we too readily treat situations as consistent when they are variable or sometimes even novel
The variable contexts in our lives keep us on our toes
Variable contexts bring more uncertainty than truly consistent contexts, but your ability to navigate the situation is rarely in doubt.
Because of the complexity of the world we live in, the majority of situations we encounter day-to-day are variable and demand at least some of our attention. Even situations that seem consistent may be more variable than you think. Maybe you’ve made the soufflé countless times at home, but you don’t know how it will turn out in someone else’s oven when you’re visiting for the weekend
Finally, just as at IDEO, where everyone worked on innovation projects, the novel contexts in your life present possibility without a guarantee of results. Achieving success in these contexts necessarily requires trying something new, and it’s unlikely to work perfectly the first time
Without wandering into novel contexts now and then we are at risk of stagnation, losing out on the chance to try an unfamiliar activity or achieve a new goal. Just as scientists in laboratories do, we must embrace failures in new territory. You cannot avoid them; you might as well welcome the opportunity to learn from them
What’s at Stake?
When practicing situation awareness, the second thing to consider is what’s at stake—financially or physically or for your reputation. A good rule of thumb is to cheerfully accept failures with low consequences and to take measures to prevent high-stakes failures. Situations are defined by a combination of uncertainty and potential consequences. When physical, financial, or reputational harm could occur, the stakes are high
Unloading the dishwasher, cooking, or trying to navigate across a beeping rug are low-stakes situations, where failure is unlikely to have serious consequences. When you drop a dish while unloading the dishwasher, it’s a fairly inconsequential failure in a predictable context. Adopting a no big deal response and quickly moving on—maybe pausing to remind yourself to pay attention when your hands are wet—is healthy
Elite failure practitioners such as Child take advantage of low-stakes situations in novel contexts. At best, you will discover something new. At worst? It's simply a beep, and you adjust course going forward.
Getting into the habit of recoding the risk level in many of our activities, along with the stakes we incur in carrying them out, is a vital, life-enhancing capability. By cultivating this habit, we lighten the emotional load. We have more than enough situations in our lives where vigilance is essential; when it’s not, we can proceed in a more playful and lighthearted way—even when we’re doing things that are important to us (cooking, writing an essay, learning a new language). In consistent contexts with low stakes (folding the laundry, going for a run), a casual, business-as-usual approach is fine. Pausing to consider (or, more typically, reconsider) the stakes allows us to titrate vigilance, mitigating its emotional and cognitive tax
In contrast, when the stakes are high—especially for human safety—you want to take an approach that ranges from mindful execution to cautious action to careful experimentation
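To make that guidance easy to apply, here is a minimal Python sketch that treats the chapter's advice as a lookup from (context, stakes) to a suggested posture. The high-stakes postures come from the text; the low-stakes wording and the mapping itself are my paraphrase, not a table from the book.

```python
# Map the two dimensions of context awareness, how much is known and what's
# at stake, to a suggested posture. The mapping paraphrases the chapter's
# advice; it is an illustration, not a formula from the book.

APPROACH = {
    ("consistent", "low"):  "business as usual; relax and enjoy the routine",
    ("consistent", "high"): "mindful execution; follow known procedures with care",
    ("variable",   "low"):  "stay attentive; adapt as the situation shifts",
    ("variable",   "high"): "cautious action; add checks and extra vigilance",
    ("novel",      "low"):  "play; experiment freely and expect informative failures",
    ("novel",      "high"): "careful experimentation; run small, reversible trials",
}

def suggest(context: str, stakes: str) -> str:
    """Return a suggested posture for a given context and stakes level."""
    return APPROACH[(context, stakes)]

print(suggest("novel", "high"))      # careful experimentation; run small, reversible trials
print(suggest("consistent", "low"))  # business as usual; relax and enjoy the routine
```

The point of the sketch is the pause it encodes: naming the context and the stakes before choosing how vigilant to be.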
Three Dimensions of Consequentiality
Physical
Higher Stakes: Activities with life-or-death consequences, or the potential for grave injury, such as flying an aircraft or conducting surgery.
Lower Stakes: Trying out a new sport where you might suffer sore muscles or small injuries.

Financial
Higher Stakes: Putting a large sum of money into a risky investment.
Lower Stakes: Buying a movie ticket without knowing anything about the film.

Reputational
Higher Stakes: Activities subject to wide public scrutiny for which you may be underprepared or unqualified.
Lower Stakes: Expressing a controversial opinion at a party to someone you don't know well.
Situation Unawareness and Preventable Failures
A lack of situation awareness can spawn a variety of preventable failures—usually due to a cognitive bias called naïve realism. As described by Stanford psychologist Lee Ross, naïve realism gives you an erroneous sense that you see reality itself—not a version of reality filtered through lenses created by your background or expertise.
Situation awareness in failure science means appreciating the level of uncertainty and what it brings. It’s about pausing, however briefly, to consider where you are on the continuum from consistent to novel, so as to proceed with an appropriate approach
Mapping the Failure Landscape
The relationship between context type and failure type has probably already jumped out at you. For example, new contexts and intelligent failures go hand in hand. A 70 percent failure rate (nearly all of the failures intelligent) is not atypical for scientists at the top of their field. In novel contexts, you must experiment to make progress, and intelligent failures come with the territory. Each is a useful discovery. Although authors can’t easily quantify their failure rates, by the time I finish writing this book, more words will have been deleted than retained. In novel territory, this can’t be avoided
As uncertainty increases, the chances of failure increase, and the type of failure tends to differ accordingly.
In predictable contexts, we often generate basic failures because of the temptation to do it in your sleep. Mistakes creep in despite access to foolproof knowledge about what to do to get the result you want. Maybe you forget to set the timer and burn the cookies
Complex failures are especially common in variable contexts
To thrive in the variable contexts in our lives we must be vigilant and resilient
When we venture into new territory—moving to another city, starting relationships, learning a language, creating a recipe—failures are inevitable
As I’m sure you’ve already realized, you can generate an intelligent failure in a consistent context. But you can also experience a basic failure in a novel context. Let’s call these off-diagonal failures. Every other combination is possible, too, as you will see. Marrying the three failure types with the three context types gives us nine failure-context combinations, as shown in Figure 6.2. Along the diagonal are the three iconic failure types in their home turf. Now let’s take a look at six other failure stories to get a sense of the rest of the failure landscape
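As a quick way to visualize the landscape, the short Python sketch below enumerates all nine context-failure combinations and marks the diagonal. The iconic pairings come from this chapter; the rendering itself is mine, not a reproduction of Figure 6.2.

```python
# Enumerate the failure landscape: three context types crossed with three
# failure types. The diagonal pairings (consistent-basic, variable-complex,
# novel-intelligent) are the "home turf" cases described in the chapter.

from itertools import product

contexts = ["consistent", "variable", "novel"]
failure_types = ["basic", "complex", "intelligent"]

for context, failure in product(contexts, failure_types):
    on_diagonal = contexts.index(context) == failure_types.index(failure)
    marker = "  <- iconic pairing" if on_diagonal else "  (off-diagonal)"
    print(f"{context:>10} context x {failure:>11} failure{marker}")
```

Seeing all nine cells makes it easier to spot the off-diagonal stories that follow.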
Expect the Unexpected
While sizing up the uncertainty and the stakes, you might ask yourself, Is this something I’ve done before? Are there experts or guidelines I can use to increase the chances of success?
CHAPTER 7 Appreciating Systems
A bad system will beat a good person every time. —W. Edwards Deming
You probably already know that Silver’s laboratory failure would become the opening chapter in a multibillion-dollar business success story called Post-it notes. But you may not appreciate the degree to which 3M had built a system that dramatically increased the chances of successful innovation
The journey that eventually turned a failed aircraft adhesive into a wildly popular product—and how easily it could have been missed altogether without a special combination of persistence and collaborative happenstance—sheds useful light on the nature of systems. In addition to organizational systems such as 3M, all of us operate in systems in our everyday lives—family systems, ecosystems, and school systems, to name a few
Systems and Synergy
As we saw in chapter 4, systems with interactive complexity and tight coupling are vulnerable to breakdowns. By taking the time to consider how a system works, many complex failures can be avoided. This starts with understanding how a system’s elements interrelate and what vulnerabilities those relationships create
System design is not just for preventing failures. Equally important is the opportunity to design systems thoughtfully to achieve particular goals. For instance, later in this chapter we’ll look at how 3M designed a system to foster innovation, rather than simply announcing innovation as a goal and hoping for more of it
Experiencing Systems
Supply chains are particularly vulnerable to system failures, as we experienced during the COVID-19 pandemic when factory shutdowns and shipping delays in one part of the world affected what people could buy in another. Had more companies made decisions based on the capacity of other players in the system, the disruption might have been far less
People who start to appreciate the systems they inhabit begin to wonder, Where else might I be contributing to the very failures I blame on other people or situations outside my control?
Systems Thinking
Systems thinking is not a panacea, and simply learning about it won’t magically solve problems created by its absence. But with repeated practice, your thinking habits can be changed to build system awareness into your life
Practicing systems thinking starts with consciously expanding your lens from its natural preference for here and now to include elsewhere and later
Two simple questions can help: What might this affect elsewhere? And what might it set in motion later?
Most of us know to be wary of the quick fix—the Band-Aid that plasters over but doesn't solve an underlying problem—yet we still take these tempting shortcuts often, ignoring, or failing to connect ourselves to, the later moment when the problem recurs or even worsens. We're vulnerable to falling into the trap of what Senge calls the fix that fails. This classic system dynamic describes a short-term solution that ends up exacerbating the problem it was intended to fix.
Our mental models are partially to blame. A mental model is a cognitive map that captures your intuitive notions about how something in the external world functions. Its power comes from being taken for granted: you don’t consciously pay attention to your mental models, but they underlie your understanding of how things work and thus shape your responses in largely invisible ways. Most important, mental models encode beliefs about cause and effect
This is neither good nor bad—just descriptive of how your brain works. Mental models are invaluable in helping us make sense of the complex and chaotic world around us so we can navigate it without being paralyzed—unable to make simple decisions—in the face of complexity. But our default mental models don’t usually include system effects, until we learn to pause and challenge some of our automatic thinking
Take the proverbial toddler having a tantrum and demanding candy while you’re at the grocery store. The easiest fix, especially for a frazzled parent, is to simply give the child the candy. But that only works for a short time—until the sugar rush abates and the bad mood returns. Worse, it sets a precedent for rewarding bad behavior, increasing the chances for future demands. The quick fix ignores both the near-term feedback loop (today’s sugar rush) and the longer-term effects (behavioral problems that set in).
In our study of hospital nursing units, we discovered that nurses' responses to process failures fell into two categories. What we called first-order problem-solving was a work-around that completed the task without addressing the causes of the problem.
In contrast, for 7 percent of the process failures, nurses engaged in what we dubbed second-order problem-solving. This could mean simply informing a supervisor or someone in charge of linens about the shortage. Second-order problem-solving got the immediate task done and did something to prevent the problem from recurring.
We can easily understand why busy nurses rarely engaged in second-order problem-solving. But this left them vulnerable to continued frustration because the work-arounds didn’t reduce the frequency of future process failures
You redraw the boundaries of your decision or action when you go beyond here and now
We soon discover additional dynamics playing out over time. Eight additional relevant factors are included in the redrawn, expanded hospital system
For example, many of the nurses we spoke to described a hero feeling from using work-arounds that ensured that patients got the care they deserved. Whether walking down the hall to locate the extra linens or going to the pharmacy to obtain a missing drug, the nurses derived gratification (factor 3) from overcoming the many little hurdles their jobs threw in their path. But this hero feeling lessened their motivation to engage in second-order problem-solving, as depicted by the negative arrow linking gratification to second-order problem-solving (factor 4).
Worse, over time (two slashes on an arrow between elements in a system diagram indicate a delayed effect) the effort and time nurses put into work-arounds contributed to burnout (factor 5). This further diminished their capacity for second-order problem-solving, in turn reducing the effectiveness of such efforts (factor 6) and allowing process failures to continue unabated (factor 7).
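To see how this reinforcing loop plays out over time, here is a toy simulation in Python. Only the loop's structure comes from the study: work-arounds feel gratifying, gratification and burnout crowd out second-order problem-solving, and burnout builds with a delay. The parameters and functional forms are invented purely for illustration.

```python
# Toy simulation of the work-around loop described above. The structure
# mirrors the text (gratification and burnout suppress second-order
# problem-solving, which is the only thing that removes root causes);
# all numbers are invented for illustration.

WEEKS = 20
process_failures = 10.0   # process failures encountered per week
burnout = 0.0             # accumulates slowly: the delayed effect

for week in range(1, WEEKS + 1):
    workarounds = process_failures            # a first-order fix for every failure
    gratification = 0.05 * workarounds        # the "hero feeling" (factor 3)
    # Gratification and burnout both reduce second-order problem-solving (factor 4).
    second_order = max(0.0, 1.0 - gratification - burnout)
    burnout += 0.02 * workarounds             # delayed toll of firefighting (factor 5)
    # Only second-order problem-solving removes root causes (factors 6 and 7).
    process_failures = max(0.0, process_failures - 0.5 * second_order)
    if week % 5 == 0:
        print(f"week {week:2d}: failures/week = {process_failures:4.1f}, burnout = {burnout:4.2f}")

# With these assumptions, failures dip slightly and then stall near their
# starting level while burnout keeps climbing: the busier nurses are with
# work-arounds, the less capacity remains for lasting fixes.
```

However crude, the sketch shows the fix that fails in miniature: the short-term solution feeds the long-term problem.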
Given this problematic system dynamic, what’s a nurse (or nurse manager) to do?
For the toddler throwing a tantrum for candy, the levers are those that help you practice positive redirection and limit-setting, including a regular nap schedule, to help a toddler develop healthy, happy behaviors. These levers exist in the larger system of parenting, not in the moment of the breakdown. Doing this thus starts with redrawing the boundaries of the system—to go beyond simply reacting to problems in the moment and instead step back and anticipate the downstream consequences of decisions that make sense in the here and now
Designing Systems
I suspect that you, like me, have spent time in at least one organization where the incentives encouraged counterproductive behavior
This kind of disconnect happens often. Management practices are designed by experts in one part of a complex organizational system, reflecting logic that makes sense to them. Meanwhile, unintended consequences in another part of the system circle back to thwart the best-laid plans. Let’s say a retail company, in an effort to attract midweek shoppers, decides to move a special promotion from a typically busy Friday to Wednesday. Sounds like a good idea at headquarters, right? But the store manager must then shift schedules from Friday to Wednesday, forcing employees to rearrange their lives, which in turn drives absenteeism and turnover.
Systems thinking lends itself to better system design. It’s possible to design organizational systems—or family schedules—so that many elements reinforce a key priority, say, quality or safety or perhaps innovation. Let’s take a look at some best-in-class systems in each of the categories.
How do you increase the chances that a failed adhesive turns into a brilliant product? With a system designed to bring curious risk-takers together. Encourage and celebrate boundary spanning. Provide resources and slack time. Normalize intelligent failure and celebrate pivots. Declare that you want a significant portion of your company’s revenues (or school’s curricula or family’s activities) to come from new and different products, courses, or experiences. Successful innovation does not come from the lone genius. Importantly, each of these familiar elements of innovation is reinforced by each of the others. The whole is more than the sum of the parts
In designing a system to reduce basic failures and promote continuous improvement, no company rivals Toyota. It’s not incidental that Toyota calls its approach, which has evolved over decades of experimentation, the Toyota Production System, or TPS. Manufacturing experts agree that this system creates far more value than the mere sum of its parts
Large modern hospitals encompass a nearly incalculable number of interconnected processes that intersect with myriad health-care professionals and patients every day. This complexity and variability combine to create the potential for a dizzying array of complex failures
Use Systems Thinking to Change How We Think about Error
Understanding Systems to Better Navigate Failure
Appreciating the dynamics of systems is the last of the three competencies for practicing the science of failing well. After self-awareness and situation awareness is system awareness. Mastering system awareness starts with training yourself to look for wholes rather than zooming in, as we naturally do, on the parts. It’s about expanding your focus, even if briefly, to redraw the boundaries and see a larger whole and the relationships that shape it
Much of our education and work experience has taught us to diagnose and become experts in parts, shortchanging the value of looking at the relationships that tie them together. We can learn to see and appreciate systems and use this knowledge to reduce preventable failures
Don’t forget that appreciating systems helps us see that we are not wholly responsible for all the failures in our vicinity. This is not to let us off the hook for our contributions to failures, but rather to help us see that we are parts of larger systems, with complex relationships, some of which are beyond our ability to predict or control.
Neither systems thinking nor system design is a simple, straightforward skill. Systems offer endless complexity. The boundaries of a system can always be drawn differently. What part of a system you consider of interest is a judgment call, and drawing boundaries is inherently creative.
The point isn’t to capture the correct system boundaries but to undertake the systemic thinking that helps you make a decision more mindfully. That can seem distressing (there is no right answer!) but also empowering (you get to choose!). The choices you make can expand your opportunities for experimenting and learning.
CHAPTER 8 Thriving as a Fallible Human Being
For me, losing a tennis match isn’t failure. It’s research. —Billie Jean King
Embracing Fallibility
How do you thrive as a fallible human being? I first heard Maxie Maultsby, the brilliant psychiatrist, use this term thirty years ago. He’d even abbreviate it—FHB. I smile when I think of Maxie’s earnest desire to help all of us FHBs thrive by learning to think differently. Thriving, he might add, starts with accepting our fallibility
A certain freedom comes from learning to live comfortably with who you are. Fallibility is a part of who we are. Self-acceptance can be seen as brave. It takes courage to be honest with oneself, and it’s a first step in being honest with others. Because failure is a fact of life, failing is not a matter of if but when and how
But thriving as a fallible human being also means learning to fail well: preventing basic failures as often as possible, anticipating complex ones so as to prevent or mitigate them, and cultivating the appetite for more frequent intelligent failures. Learning to recognize and learn from each of the three failure types, and to strengthen each of the three types of awareness, is a lifelong process.
We can learn to live joyfully with our fallibility. Though it may seem counterintuitive, failure can be a gift. One of its gifts is the clarity a failure can bring about which of our abilities need work; another is insight about our true passions. Failing a multivariable calculus exam in college was the result of my inadequate studying. But it forced me to ask myself hard questions about the work I truly loved—and the work I was likely doing to please or impress others. That was a gift, even if it didn't feel like one at the time.
Failure can also be seen as a privilege. As journalist and University of Colorado professor Adam Bradley points out in a New York Times article, One of the greatest underrecognized privileges of whiteness might be the license it gives some to fail without fear. He explains that being a member of a minority culture often means your failures, especially if they become public, are seen as representative of an entire group. Your individual failure reflects badly on everyone else like you
Women, especially women in academic science, also lack the luxury of failing unobtrusively. We are at risk of feeling pressure to succeed at all times lest we spoil other women’s opportunities
Sometimes accepting fallibility means accepting society’s fallibility so as to respond with equanimity to an injustice
The more I study the research on the psychology, sociology, and economics of inequality, the more massive the undertaking of correcting these societal failures feels. At the very least, I argue that as a society we should aspire to creating a world where everyone has an equal license to fail intelligently. That is not the case today. But I believe that we’re ever so slightly closer to that aspiration than we were even just a few years ago. Recognizing our heteronormative lens is an important first step. Nonetheless, I regret not focusing on these challenges earlier
What’s the relationship between failure and regret? At first glance, one might think people would dwell on and regret their biggest failures. But the research suggests otherwise. To better understand regret, bestselling author Daniel Pink collected regrets from more than sixteen thousand people in 105 countries
Pink categorized the regrets into four categories, one of which he calls boldness regrets. These were especially plentiful. People regretted not having been bold enough to take a chance with a business or a long-held dream. They regretted not having been brave enough to ask out a person they were interested in.
Perfectionism, the combination of holding yourself to excessively high standards and engaging in harsh self-criticism, is the subject of considerable research.
Another problem is that those suffering from perfectionism have a hard time trying something new because they can’t tolerate that they might fail. In an ever-changing world this reluctance puts them at risk of falling behind. Perfectionists are also particularly vulnerable to burnout
When parents understand the psychological dangers of the perfection trap and the crucial role failure plays in learning and development, they more easily welcome both failures and successes from children. No child learns to ride a bike without falling over. By making it safe to fail, parents and teachers encourage children to embrace a growth mindset that supports learning. Parents who detect perfectionism in their children can help them reframe failures from shameful or even just disappointing to necessary elements of learning something new. Saying Falling is just part of learning to ride a bike is preferable to Too bad your clothes got dirty when you fell off the bike.
The most important reason to embrace our fallibility is that it frees us up to take more risks. We can choose, more often, to play to win
Another way to fail more often is to pick up a new hobby. When my friend Laura decided to take up ice hockey in her early forties, I was equal parts perplexed and impressed
Our fear of being bad at it can make it difficult to try a new sport, language, or other endeavor—remember Jeffrey almost quit bridge for good
Hobbies present a great arena in which to practice failure. Hobbies are about fun and the stimulation of learning something new rather than about achievement or making a living—a low-stakes context. Also, it is less embarrassing to fail at a new hobby than in your career
In any new context, it’s crucial to pause to consider where to experiment next. What is it that most needs to be learned to get us where we want to go? We can think of the pivot as a way to tell a different story. Instead of We made a plan and then we failed, and here’s the moral of the story, it’s a narrative about change
Whether you pivot a project onto a better path, or pivot yourself into a new role or a better relationship, pivots are integral to navigating the uncertainty that comes with novel contexts. Celebrating pivots is an easy way for managers in a company or parents in a family or partners in a relationship to reinforce acceptance of the fallibility of any person, project, or plan
Mastering the Science of Failing Well
If accepting fallibility is a first step, what else helps us thrive as fallible beings in an imperfect world? Failing well is not an exact science. The manual is still being written and will forever be revised. To begin, when you consciously stretch to try new things, your experiments necessarily bring the risk of failure. This is how you get more comfortable with it. When you take more risks, you will experience more, not less, failure
To do this, it helps to incorporate a few basic failure practices—persistence, reflection, accountability, and apologizing—into your life. Although not intended to be a complete or perfect list, each of these practices can help you build a healthy relationship to failure
How do you know when to persist and when to give up? A rule of thumb to justify persistence is to find a credible argument that the not yet realized value you seek to create is indeed worthy of continued investment of time and resources. To make sure your stubbornness is not misguided or that you are not clinging to an unrealistic dream, you must be willing to test your argument with others in your target audience. Make sure to go to people willing to tell you the truth! Blakely believed in and wanted the product she was developing for herself, a sentiment reinforced when she saw how much her friends and family loved her new design
In life, as in music practice, we have a rich source of failures we can learn from. Instead of looking away from them in denial, it’s better that we dig in and learn from them. Mining near misses can be especially gratifying
Taking accountability for our failures requires a small act of bravery. But an important part of thriving as a fallible human being is noticing and taking responsibility for your contribution to a failure without feeling emotionally devastated by it, or wallowing in self-blame or shame.
A beautiful strength lies in the willingness to say I did do it rather than blaming others, which is our default position
With fallibility comes failure, and with failure comes an opportunity to apologize. A good apology wields almost magical powers in repairing the relationship damage failures cause. According to recent research on forgiveness, thorough apologies increase positivity, empathy, gratitude, and, yes, forgiveness, while reducing negative emotions and even lowering heart rates. But if apologies are so effective, why do we so often avoid them? And are all apologies equally effective?
The quality of our apology matters. Considerable research arrives at a common set of attributes of an effective apology: clearly express remorse, accept responsibility, and offer to make amends or changes going forward.
While offering excuses (It’s not my fault because my alarm didn’t go off) backfires, explaining your actions can sometimes work (I am so sorry I didn’t call—my mother had a fall and I was so frantic to get her to the hospital I simply forgot). A successful apology communicates that you value the relationship and are willing to make amends for your shortcomings (I’m really looking forward to talking with you. When is convenient to reschedule our call?). Ultimately, an apology means accepting and admitting that you have failed
An effective public apology—much like a private one—must demonstrate care for the relationship by expressing remorse, taking responsibility, and making amends
A Healthy Failure Culture
When failures occur, we learn from them with an open mind and a light heart and keep moving forward. Freed from self-protection, we can play to win. This book is about helping you—as an individual—practice the science of failing well, but it’s a lot easier in a healthy failure culture. A few practices can help you build such a culture in the communities that matter to you
A simple but powerful step that follows thoughtful consideration of the stakes you face, along with the level of uncertainty, is to call other people’s attention to what you see
It’s difficult to like (and not be bored by) people who only boast about their accomplishments, especially when these boasts are delivered with dashes of arrogance. Sharing failures makes us more relatable and likable—and human
Instituting failure awards in your company or family brings a cheerful humor to the work of building a healthy failure culture.
Managers around the world have asked me, How do I know if my team has a healthy failure culture? I answer—after checking that the team's work involves uncertainty, novelty, or interdependence—with a question: What percent of what you hear in a given week is good news versus bad, progress versus problems, agreement versus dissent?
Most of them get it immediately. Their eyes get wide—as they realize that what feels good is probably not good from the perspective of a healthy failure culture that has psychological safety for speaking up with problems, concerns, and questions
The Wisdom to Know the Difference
The science of failing well, like any other science, is not always fun. It brings good days and bad
Discernment is also needed for diagnosing situations and systems. How high are the stakes? How should uncertainty be assessed? What relationships matter most for predicting the system’s behavior? Where do you draw the boundaries to identify the system that you want to diagnose or alter? All of these challenges come down to judgment and experience. The more practice you get with the science of failing well, the more you will become comfortable and fluent in using its concepts. This book does not end with an exam on the right kind of wrong that you can pass or fail. It ends with an invitation to practice and thereby help develop the science of failing well.
Most important, discernment matters in developing the self-awareness to confront our failures, the smaller and the larger, the personal and the professional. Acknowledging our shortcomings requires and builds wisdom. Wisdom allows us to know when we’ve done as well as we can, and confronting ourselves will always be the hardest part of failing well
Also, the most liberating