People often think of failure in one of two ways: as something that hinders the pursuit of success, or as something that’s a necessity in obtaining it — as in the Silicon Valley mantra that recommends failing fast and often.
There’s truth to both ideas, but neither offers a complete picture of failure. That’s because there isn’t just one kind of failure, but three.
Here to unpack what those three types are is Amy Edmondson, a professor of leadership at the Harvard Business School and the author of The Right Kind of Wrong: The Science of Failing Well. Today on the show, Amy shares which type of failure is most productive, which types are less fruitful, and how to best use the former, prevent the latter, and learn from failure of every kind. We also talk about how to organize potential failures into a matrix that will help you best approach them. Along the way, we dig into examples, both big and small, of how individuals, organizations, and families can put failure to work for them.
Resources Related to the Podcast
- AoM Podcast #646: How to Win at Losing
- AoM Article: Clausewitz on Overcoming the Annoying Slog of Life
- AoM Podcast #517: What Big-Time Catastrophes Can Teach Us About How to Improve the Systems of Our Lives
- AoM Article: The Power of Checklists
- AoM Article: How Reframing Builds Resilience
Connect With Amy Edmondson
Listen to the Podcast! (And don’t forget to leave us a review!)
Read the Transcript
Brett McKay: Brett McKay here, and welcome to another edition of The Art of Manliness Podcast. People often think of failure in one of two ways, as something that hinders the pursuit of success, or as something that’s a necessity in obtaining it as in the Silicon Valley mantra that recommends failing fast and often. There’s truth to both ideas, but neither offers a complete picture of failure. That’s because there isn’t just one kind of failure but three. Here to unpack what those three types are is Amy Edmondson, a professor of leadership at the Harvard Business School, and the author of The Right Kind of Wrong, The Science of Failing Well. Today on the show, Amy shares which type of failure is most productive, which types are less fruitful, and how to best use the former, prevent the latter, and learn from failure of every kind. We also talk about how to organize potential failures into a matrix that’ll help you best approach them. Along the way, we dig into examples both big and small, of how individuals, organizations, and families can put failure to work for them. After the show’s over, check out our show notes at aom.is/failwell.
Amy Edmondson, welcome to the show.
Amy Edmondson: Thank you so much for having me.
Brett McKay: So you've got a new book out called The Right Kind of Wrong. It's all about how we learn from our mistakes. You've spent your academic career researching failure. How did that happen? Not a lot of people end up researching failure. How did you end up in that field?
Amy Edmondson: Well, I suppose you could say it was a little bit by accident. And let me explain that. I was interested in learning, and even more specifically, I was interested in the problem of organizational learning: that organizations need to keep learning in a changing world. And it turns out that is easier said than done, for a whole host of factors. And one of the factors, one of the major factors that just kept coming up again and again in my research, was our allergy to failure. The reality is things will go wrong, especially in an uncertain and complex world. And if we don't have a healthy response to that, we don't learn. We don't learn and grow as individuals, we don't alter our systems in ways we need to so our organizations are more effective, and so on.
Brett McKay: Well, it seems like in recent years, I'd say in the past 10 or 15 years, there's been this mantra of fail early, fail often that's come outta Silicon Valley, where it seems like they're trying to rehabilitate failure, saying it's okay to fail. But you argue that this mantra, this idea, misses the mark when it comes to the benefits of failure. How so?
Amy Edmondson: So, it's not so much that it misses the mark; it's that it's woefully incomplete. Right? It's not specific enough, or... It's too broad-brush, and it doesn't say, under these conditions, this is not good advice. And just to illustrate with an obvious one, you wouldn't say to your surgeons and operating room personnel, "Fail often, have a great time." Of course not. Right? You would say, "Gosh, can you try to get it exactly right?" And they would know that that was their job. So the problem with the fail fast mantra isn't that it's wrong; it's that it only applies to certain kinds of contexts.
Brett McKay: Right. You wouldn’t want like a nuclear reactor to fail early.
Amy Edmondson: No. No. Not even close. Or even on an automotive assembly line, you don't say fail fast. You say, hey, how about Six Sigma? Right? Let's get it right. We know there might be things that go wrong, but if we can have our quality be at the one-error-per-million-events level, we're really good. And there's no reason we can't get there.
Brett McKay: So what you do in this book is you walk readers through your research that you’ve spent your career looking at on how we can get the benefits of failure and when it’s okay to fail early or fail fast, and when it’s better to, okay, well, we might make mistakes, but let’s try to reduce that. And you categorize failures into three failure archetypes. You have intelligent failures, basic failures, and complex failures. Let’s talk about intelligent failures. What makes a failure intelligent?
Amy Edmondson: Intelligent failures are the right kind of wrong, in my view. A failure is intelligent, and should be applauded and used as often as possible, when, in pursuit of a goal, in new territory, with a hypothesis, as small as possible, you turn out to be wrong. To say that more clearly: a failure is always an undesired outcome. We don't wanna fail, we wanna succeed. But if that undesired outcome happens in new territory, in pursuit of a goal, where you've done your homework, you've thought it through, you have good reason to believe it might work, and you don't expend more resources or take a greater risk than you have to in order to get the knowledge you need to go forward, then it's an intelligent failure.
Brett McKay: And in the book, you highlight how back in the mid-20th century, when doctors were trying to figure out how to do open heart surgery, this was an example of intelligent failure. They were in new territory, the risk was really high, but the payoff was significant, and they did all they could to reduce failure.
Amy Edmondson: To mitigate it.
Brett McKay: Yeah.
Amy Edmondson: Right. And one of the ways they mitigated risk, one of the ways they kept the failures, which must have been devastating emotionally and intellectually to experience, as small as possible, is that they, of course, only operated on people who truly had no other choice. You don't operate on a healthy patient and say, "Hey, this might not work. We've never done anything like this before." No, of course not. They were operating on patients who had very little chance of survival over any kind of long term. And so, in a sense, these patients' only hope was a surgical repair, and yet such a thing hadn't been attempted before. So the odds were not truly high that it would work out well the first time. And I describe this; it's such a visceral example of, yes, a series of very intelligent failures that ultimately led us to the stunning success that open heart surgery is today.
Brett McKay: Yeah. Now it’s just a matter of routine.
Amy Edmondson: Practically. Yep.
Brett McKay: Yeah. But you also argue that, okay, these are the right kind of wrongs, but even when we know we're doing something that's new and there's gonna be a big payoff, we still have that aversion.
Amy Edmondson: Yes.
Brett McKay: So what’s holding us back from making these intelligent failures?
Amy Edmondson: Well, I think that aversion is part of it, and it leads us to be risk averse; the word is captured right in there. I mean, many people, in fact most people, are sufficiently risk averse that they will fail to make progress on desired goals, or fail to experience all sorts of things that they would dearly love to do, 'cause if you want everything to go perfectly, then you will not do anything that has the risk of failure. And if you choose not to do anything where there's a risk of failure, then you're not growing, stretching, venturing into new territory.
Brett McKay: And then the other aspect you talk about too is that there’s that social fear. No one wants to be a loser, no one wants to be a failure. So there’s that element as well.
Amy Edmondson: Absolutely. I think most people can think of a thing they'd be willing to do behind closed doors. I'll take a risk, I'll try something and it might fail, but as long as no one else sees me doing it or knows about it, then I'm okay with it. We don't want the embarrassment or the humiliation of the failure.
Brett McKay: And then you talk about this idea, which I think people might have heard of: if you wanna avoid that social fear, that social stigma, groups need to develop what's called psychological safety. What is psychological safety?
Amy Edmondson: Well, psychological safety describes a climate, an interpersonal climate, where you really don't feel afraid to take risks, like speaking up with an uncertain idea, or disagreeing with someone, or admitting a mistake, or acknowledging a failure, right? And all of those kinds of interpersonal behaviors can be uncomfortable, because you worry that people might think less of you, you might be rejected. So it's an environment where you know, yeah, that's just what we do; it might not be easy, but I believe my team members will welcome this, or I believe my family expects me to be candid about this. That describes psychological safety. It's quite literally an environment where you can take interpersonal risks of the kind that are so necessary for problem solving and innovation and ideation and all of those good things.
Brett McKay: Well, this goes back to a study you did early in your academic career that you thought was a failure, but actually led you down some new research paths. I think early on you were researching teams: what makes a good team? Does a good team culture reduce mistakes? And your study found, oh my gosh, this team that looks awesome. It seems like they're cooperative.
Amy Edmondson: Yep.
Brett McKay: Everyone's fantastic. They actually made more mistakes. And you thought, oh my gosh, my hypothesis was wrong. But I think what you found out was, actually no, what happened is the teams that look like they have good camaraderie, they're talking, et cetera, actually just talk about their mistakes more than the teams that don't have that connection.
Amy Edmondson: It's exactly right. And of course, it took me hours to even think that might be what was happening, to realize that is a possible interpretation of the confusing data. 'Cause the confusing data were suggesting, as you said, that the teams with higher camaraderie, higher quality relationships, were the ones that also had higher error rates. Now, that just seems so wrong on multiple levels, until you start to realize, well, wait a minute, the error rates, how objective are they? How do we get those data? And you begin to realize that really the only way you're getting data on people's errors is if they're telling you about them, or if they're not hiding them, if they're letting them be discovered, which is not easy for people to do. So I began to suspect that these better teams weren't making more mistakes; they were just more open about them. And later on, I called that difference in climate psychological safety, developed a way to measure it, and ultimately found that that measure is very predictive of team performance in a huge variety of industry contexts.
Brett McKay: And we've had guests on the podcast who deal with family psychology, and one thing you often see is that with couples who don't argue, there are probably problems there if they're never having arguments.
Amy Edmondson: Yes.
Brett McKay: The same sort of thing, if you don’t feel comfortable raising concerns with your spouse, that’s a problem.
Amy Edmondson: Right. Right. It can't be that you just have no disagreements, or that you see eye to eye on everything, there's never a conflict, there's never a trade-off, there's never a who's-gonna-pick-up-the-kids-at-daycare-today kind of moment, right? That's probably not possible. So any relationship, any healthy relationship, is going to have disagreements, conflicts, challenges, arguments. And if you're not hearing any of that, if that isn't happening, it might at first glance seem like a good sign. But it probably isn't. It probably means people are either afraid to disrupt the apparent harmony, or maybe one partner, and not the other, feels afraid to be themselves and to speak up openly and honestly.
Brett McKay: Okay. So to recap, an intelligent failure is when you're in new territory. What were the other factors?
Amy Edmondson: Well, it's new territory, in pursuit of a goal, with a hypothesis, as small as possible. So there are four criteria.
Brett McKay: Okay.
Amy Edmondson: And what we’re really talking about, of course, is an experiment.
Brett McKay: Yeah.
Amy Edmondson: I mean, you're acting, but at some level you know that what you're trying to do here, to achieve a result you care about, may or may not work. So it's technically an experiment, and you try not to have experiments that are larger than they need to be when there's uncertainty. Meaning, you don't want to invest more money than you can afford in an uncertain investment. You don't wanna use more patients in a clinical trial than you need to be able to demonstrate efficacy of the treatment.
Brett McKay: That makes sense. So if you're thinking about starting a business, you might not necessarily wanna quit your day job and then mortgage your house.
Amy Edmondson: Right. Right. Yeah.
Brett McKay: And bet it all on this business. There are other small steps you could take to eventually start your own business.
Amy Edmondson: Yes. So mitigate risk by doing it incrementally.
Brett McKay: Yeah.
Amy Edmondson: Doing it… And that might sound too risk averse, but it isn’t because each of those incremental steps involves some risk. But you’re managing risk, which is smart.
Brett McKay: So another type of failure you talk about is a basic failure. So what’s a basic failure and how does it differ from an intelligent failure?
Amy Edmondson: Basic failures are pretty simple. They’re undesired results that were caused by human error. And human error means there was a right way. You could think of it as a recipe. There was a recipe that could and should have been used to get the result you want, but you made a mistake and it led to a failure. So basic failures are simple. That doesn’t mean they’re small. They can be big, they can be small, but technically, or at least theoretically, they’re preventable. When we’re really at our best, when we’re alert, when we’re vigilant, when we’re exercising good teamwork, we can catch and correct most basic failures.
Brett McKay: Okay. So basic failure would be swapping out salt for sugar in a recipe.
Amy Edmondson: Exactly.
Brett McKay: Okay.
Amy Edmondson: Exactly. And the cookies will taste… Or the sauce will taste bad because now it’s got too much sugar in it and not enough salt.
Brett McKay: Right. And yeah. But these basic errors, they can be big. You talk about a Citibank employee who did something wrong with the computer and like transferred, I think it was millions.
Amy Edmondson: $800 million. So you could think of it as checking the wrong box in an online form, which means you're not being vigilant enough, right? If you have a system that can allow you to accidentally transfer essentially the principal of a loan rather than the interest, it's probably not a well designed system. But nonetheless, it was a system that needed to be used with great care, and human error led to this massive economic loss.
Brett McKay: And you also talk about how, unlike an intelligent failure, a basic failure occurs in known territory.
Amy Edmondson: Right.
Brett McKay: So like you know the recipe, you know the protocol for transferring money.
Amy Edmondson: Right.
Brett McKay: It’s just you messed up somewhere along the way. Yeah.
Amy Edmondson: You messed up. Right. And to say that it's in known territory, and it's not as valuable as an intelligent failure, doesn't mean you can't learn from it. We can definitely learn from our basic failures, right? I try very hard to learn from my own basic failures, but the amount of learning is arguably much less than for new failures, intelligent failures, where there just wasn't a recipe, so you had no choice but to create one.
Brett McKay: Yeah. I think what you can learn from a basic failure is something about human nature. Like, when do we tend to not be paying attention? When do we have a tendency to neglect things? And then you can develop systems or protocols to counter that.
Amy Edmondson: Yes. Yes. Self-awareness, and systems or protocols. So an example of a basic failure would be if you text and drive and then get into an accident. And obviously the thing you learn is pretty darn clear here: do not do that. Which I think most of our listeners know already, but we know it still happens as well.
Brett McKay: One of the things people have done over the years to counter these basic failures is the simple checklist, which can get you through a lot of things. This started off with pilots during World War II; they have a checklist they go through. And now in the medical field, they also have very basic, rudimentary checklists. You think, why would I need to go through: did I wash my hands? Did I…
Amy Edmondson: Right.
Brett McKay: But it works.
Amy Edmondson: Busy people have a lot on their mind. And sometimes just having that concrete checklist in front of you will remind you, put you back into the mindset that you need to be careful. And of course, Atul Gawande, the fabulous surgeon and author, wrote this marvelous book, The Checklist Manifesto, about the power of something so simple as a checklist. And it's really all about the prevention of basic failure. But it's also the case that just having a checklist won't itself be enough; checklists have to be used with intent.
Brett McKay: Well, yeah you highlighted a case of a pilot. There was a checklist they were supposed to defrost or not, something like that, but they didn’t. Yeah.
Amy Edmondson: Yes, it's Air Florida. It's a famous and tragic accident, Air Florida Flight 90, back in January of 1982. And the pilot and the co-pilot went through the checklist essentially in their sleep. You know, it's Air Florida. So you can imagine that most of the time when you're flying Air Florida, you're in warm weather, and most of the time when you're saying "anti-ice off," the correct answer is off. And that's exactly what they did. The pilot said, "Anti-ice off." And the co-pilot said, "Off," as if that was the right answer. On a cold, blustery, icy January day in Washington DC, that was in fact decidedly the wrong answer. Not going through the anti-ice routine led to the downing of that flight and the death of 78 people.
Brett McKay: Okay. So a great application of this in everyone's day-to-day life: look at the things you do on a regular basis where you've noticed simple mistakes happening because you're not paying attention, and develop a checklist for that and follow it. And this could be like when you're packing for a trip, right?
Amy Edmondson: Yeah.
Brett McKay: Instead of thinking, oh, I forgot whatever, have a checklist you use every single road trip.
Amy Edmondson: Absolutely. I finally did that myself because there were too many times where I’d get somewhere I realized I forgot socks. [laughter] What are you thinking? You weren’t thinking. You need a little help.
Brett McKay: And that’s why you have the checklist. Okay, so that’s basic failure. The next type of failure is a complex failure. What makes a complex failure complex?
Amy Edmondson: A complex failure is complex because it's multi-causal. The characteristic complex failure has multiple factors that contributed to the failure. It's the interaction between the factors that led to the mess, 'cause any one of these factors on its own would not cause a failure. So they're not like basic failures, where you just need one mistake and boom, there's a failure. There could be mistakes involved, but you don't need mistakes involved. You just need an unfortunate coming together of several factors that leads to a kind of breakdown.
Brett McKay: What’s an example of that, that we’ve seen in recent years?
Amy Edmondson: Well, recent years… I mean, in one of my research studies in a hospital context, there was a young patient, a 12-year-old boy named Matthew. Actually not his real name, but nonetheless, he was mistakenly given, mistakenly meaning by the system, I guess, a potentially fatal dose of morphine. Fortunately, it was caught and counteracted. But in my analysis, I identified like seven different factors, any one of which on its own wouldn't have led to this sort of accidental overdose. The first thing was that the hospital's ICU, where post-surgical patients like Matthew would ordinarily go, was full. So he had to be cared for by people who don't usually care for post-operative patients, so they were less specialized. He happened to be cared for by a brand new nurse who maybe didn't have as good instincts about the processes of care. There was a pump that is used for this pain medication that was in a sort of dark corner, making it harder to read.
There was a printout of the concentration, not an error exactly, but it was folded in a way that made it hard to read. So I don't need to go into all the details, but the point is that any one of those factors on its own wouldn't lead you to give this overdose, right? It was the fact of them all coming together at the same time that created the perfect storm.
Brett McKay: No, this reminded me when I got to this section, reminded me of Clausewitz, the famous military strategist who wrote On War. And he had this idea of friction. And he talked about this. He says, “Everything in war is very… ” He’s talking about war here, but I think you can apply this to anything. He says, “Everything in war is very simple, but the simplest thing is difficult. The difficulties accumulate and end by producing a kind of friction that is inconceivable unless one has experienced war.” So yeah, friction isn’t just one thing, it’s a bunch of stuff coming together. And then he goes on and he says, “Countless minor incidents, the kind you can never really foresee, combine to lower the general level of performance so that one always falls far short of the intended goal.” Yeah. So in war, as in life, it’s not just one thing. It’s like a bunch of different stuff that comes up.
Amy Edmondson: It's many little things. And I opened the chapter on complex failures with a classic and tragic environmental spill back in 1967, the Torrey Canyon, which is the name of the ship. And the captain himself described it as "many little things." And that's exactly the same sentiment. Any one of those little things on its own wouldn't have led to this horrible spill, but the unfortunate coming together of these factors leads to an outcome that nobody wants.
Brett McKay: Right. This can happen. You probably see this in your daily life when you’re late for work.
Amy Edmondson: Yeah.
Brett McKay: It’s like your kid gets sick and then the other kids lost their homework and then the car needs gas.
Amy Edmondson: And then there’s more traffic ’cause it’s raining and…
Brett McKay: Right. And it’s not one of those things. If one of the things happened, you would have been okay, but all of them together combined, it just created, that created the problem.
Amy Edmondson: Right. And the thing about complex failures, 'cause they sound kind of like, oh well, there's nothing we can do about it, is that in fact they offer many little opportunities for course correction, right? Because there are so many different factors, in many cases all you needed to do was notice and alter one of them to prevent the failure.
Brett McKay: We're going to take a quick break for a word from our sponsors. And now back to the show. Well, one thing you talk about in the book and in your research is that complex failures are increasing in our modern age. Why is that?
Amy Edmondson: Well, I think we have so much more complexity in our world. Information technology is maybe first among equals for this reality, which is that we're interconnected in ways that we never have been historically. Hence the phrase going viral: a photo or a tweet or something that you really didn't want to get attention may inadvertently get attention simply because we're so interconnected. It's just so quick. Things can spiral out of control very quickly. I also think we seem to have more complex and unpredictable weather patterns, and more complex and interconnected global supply chains, so that we're more likely to be impacted in our day-to-day life by random events in another part of the world than was ever possible in history.
Brett McKay: Yeah. We experienced, I think everyone experienced to some extent, the supply chain things during COVID. Right?
Amy Edmondson: Right.
Brett McKay: And this is a perfect example. Everything’s so interconnected. You make one small change here, then the distributor has to make a change and the retailer has to make a change and then the consumer has to make a change and then everything just gets mucked up.
Amy Edmondson: A few people can't show up for work because they're sick, and then that leads to shortages here, and something doesn't get delivered, and then that spirals out of control.
Brett McKay: But you say that we're not completely at the mercy of complex failures. You say there are often warning signs that there's a problem.
Amy Edmondson: No.
Brett McKay: How do you know when there’s a warning sign with a complex failure?
Amy Edmondson: You really won't know unless people speak up. Let's go back to psychological safety again, because so many times, in many of the complex failures that I've studied in company and government settings, it is not the case that the bad news or the failure came totally out of the blue. It is nearly always the case that there were one or more people who had a worry, thought something wasn't quite right, but really didn't feel safe to speak up about it in a timely way that might have allowed for corrective action. So psychological safety plays a really important role in our ability to prevent complex failures. And if it sounds like a fool's errand, it really isn't, because there's a whole body of research on high reliability organizations that addresses the question of why some inherently risky and dangerous undertakings, like air traffic control or nuclear power plants, operate safely essentially all the time. Right? And it's through vigilance and mindfulness and incredible teamwork and psychological safety to speak up, and listening to people and thanking them when they raise a concern, even if that concern turned out to be nothing. So there's a set of management and cultural practices that allow us to cope with our complexity and our interconnectedness.
Brett McKay: And also, you talk about how you can have contingency plans in place to kind of mitigate complex failures. Like, I've experienced this with my podcast. Over the years, I've learned that technology is fickle. Sometimes things are working great, and I never know if someone's on a PC or a Mac or whatever. And so I've developed systems where I've got my main recording, then I've got a backup recording, and then I've got a backup of the backup, and it's saved me. It's helped me reduce some of the risk of complex failures.
Amy Edmondson: Exactly. And you’ve only done that because you’ve… You may have learned from a failure, right?
Brett McKay: Yeah. No. Yeah.
Amy Edmondson: But you recognize that, yeah, as much as we know, this is not new territory, right? In this point in history, recording a podcast is not new territory. And yet you understand that several factors could come together in just the wrong way. My computer might be different than the last guest’s computer, exactly as you said. So you figure out in advance that because things might go wrong, I’m gonna think about the backups.
Brett McKay: And also on this idea of complex failures, you talk about how some systems are tightly coupled and some are loosely coupled. What do you mean by that?
Amy Edmondson: Well, this is work by a famous sociologist named Charles Perrow, who passed away a few years ago. And he pointed out that when you have highly interconnected complex systems with tight coupling, you're really at greater risk for complex failures. And tight coupling refers to the following idea: if there's an action in one part of the system, it leads inexorably to a result in another part. So just to have a simple example, you put your ATM card into the slot, and it just pulls it right in, right? And it's too late. You can't get it back until the system decides to give it back to you. So tightly coupled refers to things that, once they're underway, you really can't stop.
Brett McKay: Okay. So an example of that is the Citibank employee who transferred $800 million. Like they did that one thing and it was done.
Amy Edmondson: Right, right. So that was a tightly coupled system. By the way [chuckle] nowadays we have… And this was only a few years ago, but you know, we have the duo. What are the… Help me out here.
Brett McKay: Double opt-in. The… Whenever, yeah, whenever I do transfers with my bank account, it has to be like one person and then another person.
Amy Edmondson: And then they send you a code to your phone and then you have to enter that code. And those things are in there. Sometimes they feel very annoying, but they’re in there to prevent the tight coupling that would just lead to so many more failures than would be desirable for anybody.
Brett McKay: Yeah. One of the takeaways I got was from that tight coupling and loose coupling distinction. So loose coupling is the opposite of tight coupling.
Amy Edmondson: Right.
Brett McKay: So basically, you have some slack, so if you do have a failure, you're able to correct. And it made me think of just adding more margin to your life, right?
Amy Edmondson: Yeah.
Brett McKay: So instead of cutting it to the last minute before you leave, you give yourself 10 extra minutes, 'cause if there's traffic, that'll help accommodate for it.
Amy Edmondson: Brett, I love that, because it's: build it in. We all know this, we all know we're moving too fast from meeting to meeting, and then you're gonna jump into your car and get to that next thing. Deliberately build in slack. At the beginning of the week, or the beginning of the month, or however you do it, build those buffers in. Slack and buffers are essentially the same concept. And very few of us have enough of them to operate as effectively as we would like to. So build in the buffer.
Brett McKay: Alright. So let’s recap what we’ve talked about here. We talked about the three types of failures. We’ve got intelligent failures, and these are essentially experiments. Treat your life like an experiment. You can do this with your job. So maybe there’s some type of new position you wanna go for; well, try an experiment and see what you can do with it. Or, I mean, a date could be an intelligent failure.
Amy Edmondson: Yeah. Absolutely.
Brett McKay: Trying a new hobby could be an intelligent failure. Then there’s a basic failure and that’s just basic mistakes.
Amy Edmondson: Yep.
Brett McKay: And the way you counter that is with a simple checklist. That can often do the trick.
Amy Edmondson: Checklist, vigilance, attention.
Brett McKay: Right. And then we have complex failures. They’re multifaceted. It’s not any one thing; multiple factors come together to create the problem. So: paying attention to warning signs, not being afraid to speak up or point out a potential issue, coming up with contingency plans, and adding buffers.
Amy Edmondson: Love that.
Brett McKay: Okay.
Amy Edmondson: Beautiful. Yeah.
Brett McKay: So that’s the first part of your book. You lay this out and go into more detail, and I hope our listeners go and read the book. But I love that in the second half of the book, you talk about some mindset shifts people need to make when it comes to failure. One thing you talk about is that we need to overcome our tendency to find someone to blame for a failure. So how does letting go of blame help people learn from failures? And then, how do you hold people accountable if you don’t find someone to blame?
Amy Edmondson: [chuckle] Great questions. So first of all, it’s very natural to want to find someone to blame because it lets us off the hook, psychologically and emotionally, right? If something goes wrong and I can instantly reassure myself that, “Hey, it’s not my fault,” then I’ve bolstered my self-esteem. I’ve kept myself comfortable. I don’t have to confront my own contribution to the failure. So it’s just a very natural instinct. I even describe in the book a 2-year-old who, when his father clips the passenger-side mirror while driving too close to the parked cars, immediately hears the bang and says, “I didn’t do anything, Papa,” right? And of course not. He’s sitting in the toddler seat in the back. But this instinct to dodge blame is very, very deep and well-learned. And it’s unhelpful, right? It’s unhelpful for our ability to learn and grow, because, I mean, I think there are very few times when you’re 100% to blame for things, but you always have some kind of contribution that you could look at thoughtfully, and wonder about, and work on doing better next time.
So it’s actually consistent with accountability. Because I think the way many people and many organizations think about accountability is counterproductive, right? They’ll say, “Well, we need to hold someone accountable,” and that’s almost synonymous with punishment: we need to punish someone or else it’ll happen again. Whereas the truth is, when you really punish someone for something going wrong, it just decreases the chances that you hear about things in a timely way. That goes back to psychological safety. The way we need to think about accountability is being willing to fully account for what happened, right? To take a clear-eyed, scientific look at the events that unfolded, and understand what role you played and what role other factors played, so that you can really learn the valuable lessons and improve your practices next time.
Brett McKay: Gotcha. And this isn’t to say, you talk about this in the book too, that if someone is being malicious.
Amy Edmondson: Yeah. Right.
Brett McKay: Then like you need to punish them. Right, yeah.
Amy Edmondson: Yeah. I’m all for blame when people engage in what I’ll call blameworthy acts.
Brett McKay: Right.
Amy Edmondson: It’s just that in our organizations and to a certain extent in our families, those aren’t the norm, right? The norm is just human beings who make mistakes and are sometimes a little thoughtless and sometimes a little busy and all the rest. But very few people are really waking up in the morning and sort of intent on sabotaging each other.
Brett McKay: Right. So, yeah, if someone makes an intelligent failure, you don’t wanna do any blame there for sure.
Amy Edmondson: No.
Brett McKay: To tryna…
Amy Edmondson: No, you want to celebrate it.
Brett McKay: Yeah, you wanna celebrate it. With a basic failure, you know, if they didn’t sleep enough or they were out partying, then you might want to have that conversation. But it could be that something’s just going on at work that made it tough for them to pay attention, something like that, yeah.
Amy Edmondson: Right. I mean, they may have been covering for other people who were out sick, or they may have had a sick child last night and not gotten enough sleep. What you want to do is first understand what happened, and then figure out what kinds of protective measures to put in place to ensure that the same thing doesn’t happen a second time.
Brett McKay: Another thing you talk about is reframing failures. What does reframing failure look like?
Amy Edmondson: Reframing means acknowledging that there’s always a frame, like a picture frame. We’re looking at reality, at the events in our lives, through frames that are largely unconscious, that stem from our background, our expertise, or the various biases we all have. And reframing is learning to stop and challenge how you see a situation. Wonder what you’re missing. Ask yourself, is there another way to see this? So instead of that reflexive instinct to just blame, “Oh, I know what happened there,” it’s the self-discipline to say, “Well, I have a partial view of what happened there, and I’d love to understand it better before I draw any conclusions.”
Brett McKay: See, I like the idea of just thinking about failures like a scientist, right?
Amy Edmondson: Yeah.
Brett McKay: What can I learn from this? I think that’s a good reframe to have. And one of the things you talk about throughout the book is this idea of context, like understanding the context in which a failure happens. How can that help you learn to fail better?
Amy Edmondson: You know, the issue of context brings us back to why the Silicon Valley talk is limited: it only works in certain contexts. The way I describe a context is with two dimensions. One is how much uncertainty there is. How much uncertainty is there in whether or not I can produce a batch of chocolate chip cookies? Very little, right? Unless all the power goes out or something, it’s almost a guarantee that if I follow the recipe, I can produce those cookies. How much uncertainty is there that I can find a new drug that will cure certain kinds of cancer? Very high uncertainty indeed. And that matters in terms of guiding my actions. Number two is, what are the stakes? For me, stakes can be financial, they can be reputational, or they can be human safety. When you’re dealing with high stakes, you proceed far more cautiously, or you should, than when you’re dealing with low stakes. So if you’re dealing with high stakes and high uncertainty, you’re conducting your experiments, because you have to experiment when you don’t have the answers, but those experiments should be very thoughtful and very small.
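The two-by-two Amy describes could be tabulated roughly as follows. The quadrant labels here are my paraphrase of the conversation, not the exact wording of the chart in the book:

```python
def failure_approach(uncertainty: str, stakes: str) -> str:
    """Map the two context dimensions (each 'low' or 'high') to a rough posture."""
    matrix = {
        ("low", "low"):   "routine: checklists and ordinary attention",
        ("low", "high"):  "vigilance: well-understood but dangerous, operate mindfully",
        ("high", "low"):  "play: experiment freely, failures are cheap lessons",
        ("high", "high"): "caution: you must experiment, but keep experiments small and thoughtful",
    }
    return matrix[(uncertainty, stakes)]

assert "small" in failure_approach("high", "high")      # e.g., experimental surgery
assert "mindfully" in failure_approach("low", "high")   # e.g., air traffic control
```

Baking cookies would sit in the low/low cell, and trying a new hobby in the high-uncertainty, low-stakes cell, which is exactly where intelligent failures are cheapest.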
Brett McKay: Okay. I like that. And you have a nice chart to kind of walk people through that. So yeah, if it’s low stakes and it’s something new, it’s like, yeah, have fun experimenting.
Amy Edmondson: Have fun.
Brett McKay: Yeah.
Amy Edmondson: Yeah, exactly. And I’ve literally seen both errors in my research, these sort of psychological errors we make. I’ve seen people be a little reckless in very dangerous situations, and then bad things happen. And I’ve seen people be overly cautious, ’cause they wanna get it right or they wanna look good, in situations where there really isn’t a right answer and the only way to make progress is to get out there and try stuff.
Brett McKay: Gotcha. So a high stakes context, that’s where you really need that mindfulness, like if you’re an air traffic controller.
Amy Edmondson: Right. Super high stakes, but really well-understood territory. So that’s where you’re operating as mindfully and vigilantly as possible.
Brett McKay: Right. And then if you’re in a situation where it’s new, so like you’re doing experimental surgery and the stakes are high, you’re gonna want to do careful experimentation. Right?
Amy Edmondson: Right.
Brett McKay: Yeah. And try to mitigate the risk as much as possible.
Amy Edmondson: Exactly.
Brett McKay: Okay. I love that. And you also recommend four failure practices. What are those and how can they help us fail better?
Amy Edmondson: So the four failure practices that I write about, in the last chapter on how to thrive as a fallible human being... The first is persistence. There will always be obstacles in your path in anything that you are truly hoping to do in your life or in your work, right? If you have stretch goals, there will be obstacles. So persisting, trying again, not being crippled by the inevitable failures that do happen, is one of the practices. The second is reflection. I think all of us can benefit from doing more explicit reflection. That could be formalized in keeping a journal, or it could be weekly reflections with your team at work: what went well, what didn’t go well. Being systematic about learning from our own experiences is a super important practice.
Brett McKay: Well, I know that in Special Forces, they do what’s called after action reports.
Amy Edmondson: Yes.
Brett McKay: So after a mission, they’ll just get together and it’s formal but informal.
Amy Edmondson: Yes.
Brett McKay: And there’s no blame. They just talk about what went right and what went wrong.
Amy Edmondson: It’s very clean, right? It’s: what did we set out to do? What actually happened? What’s the difference, and why? What will we do next time?
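The four after-action questions form a simple template that could be captured as a record; the field names and example below are mine, not the military’s or the book’s:

```python
from dataclasses import dataclass

@dataclass
class AfterActionReview:
    """The four after-action questions, captured as one record."""
    intended: str    # What did we set out to do?
    actual: str      # What actually happened?
    difference: str  # What's the difference, and why?
    next_time: str   # What will we do next time?

# A small everyday example, in the spirit of the conversation:
aar = AfterActionReview(
    intended="Leave by 8:00 to make the 9:00 meeting",
    actual="Left at 8:20 and arrived 15 minutes late",
    difference="No buffer; underestimated traffic",
    next_time="Build a 20-minute buffer into the departure time",
)
assert aar.next_time.startswith("Build")
```

Writing the answers down, even this minimally, is what turns the experience into the explicit narrative Amy describes next.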
Brett McKay: Right.
Amy Edmondson: Very scientific, not emotional. It’s almost storytelling to ourselves: taking our own experience and turning it into an explicit narrative so we can understand it better. Because we all make the mistake of thinking, “Oh, yeah, right, I had that experience, I’ll learn from it.” But we won’t, unless we pause to do the explicit reflection. And the third practice is accountability. That’s very related to reflection, because it’s being willing to confront and be honest about the whole account of what happened, and that willingness to say, “Here’s what I did that may have contributed to that failure. Here’s what I failed to do that might have helped prevent it.” So it’s about being willing to own your role, which can seem scary, but it’s also quite empowering if you think about it.
You’re facing the fact that you do matter, right? That you have an impact, and it’s important to be willing to own it, I guess is what I mean. And then finally, I talk about apologies as a really valuable practice that we can all learn to do more of, as uncertain, fallible people in an uncertain world. Apologies play a very important role in relationships. If something goes wrong, it’s just so powerful to be willing to apologize for it. And that means, of course, being willing to take account of where you contributed. Good apologies, first of all, signal that you care about the relationship, about the other person. They express remorse. They don’t have to be horrifyingly remorseful, but just, “I feel bad that I didn’t call when I said I would.” You express that remorse. You offer to make amends: how can I make it up to you? And you own your part in it. You don’t duck the accountability part.
Brett McKay: And then beyond that, celebrate good failures, the right kind of wrong, in your work and with your kids. If they do something that ended up in a mistake, but they actually learned something from it, be like, “Hey, that’s great. You learned something.”
Amy Edmondson: Yes. I sort of divide them into the individual practices that each of us can do, take accountability, reflect, and so on, and then the collective practices that work in a team or a family or a company. Those include calling attention to the context, being very thoughtful about how much uncertainty there is or what the stakes are, and encouraging failure sharing, which detoxifies failure. If we can more cheerfully tell each other about the things that have gone wrong in our projects or lives, it helps a great deal. And then really reward that honesty. If you’re in a family and a child speaks up quickly about something, “I broke a glass” if they’re a little kid, or “I was at a party I shouldn’t have been at” if they’re an older teenager, being deeply appreciative of that honesty will build the right habits and the right climate for learning in that family.
Brett McKay: Well, Amy, this has been a great conversation. Where can people go to learn more about the book and your work?
Amy Edmondson: Well, the book can be found, Right Kind of Wrong, anywhere books are sold, I hope. And you can go to my website, amycedmondson.com, for information about other papers and other writings and activities.
Brett McKay: Fantastic. Well, Amy Edmondson, thanks for your time. It’s been a pleasure.
Amy Edmondson: Pleasure to talk with you, Brett.
Brett McKay: My guest today was Amy Edmondson. She’s the author of the book Right Kind of Wrong. It’s available on amazon.com and in bookstores everywhere. You can find more information about her work at her website, amycedmondson.com. Also check out our show notes at aom.is/failwell, where you can find links to resources where you can delve deeper into this topic.
Well, that wraps up another edition of the AoM Podcast. Make sure to check out our website at artofmanliness.com, where you can find our podcast archives, as well as thousands of articles we’ve written over the years about pretty much anything you can think of. And if you haven’t done so already, I’d appreciate it if you’d take one minute to give us a review on Apple Podcasts or Spotify. It helps out a lot. If you’ve done that already, thank you. Please consider sharing the show with a friend or family member who you think would get something out of it. As always, thank you for your continued support. And until next time, this is Brett McKay, reminding you not only to listen to the AoM Podcast, but to put what you’ve heard into action.