We live in an age of disruption. Companies that were once stalwarts are overtaken by small, plucky upstarts. Our personal lives can also be disrupted. We lose a job or a business fails.
My guest today says that instead of waiting to be disrupted by outside forces, you’re better off using techniques developed by intelligence agencies and the military to disrupt yourself first. His name is Bryce Hoffman and he’s the author of the book Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. We begin our show discussing what red teaming is and the history of its development, from wargaming by 19th century Prussians to more sophisticated techniques developed by the US military during the war on terrorism. Bryce and I discuss the hidden biases that red teaming is designed to counter, and then get into the specific red teaming techniques you can start using today to challenge your assumptions, stress-test your strategies, identify unseen threats, and make better decisions in both your personal life and your business.
- What is “red teaming”?
- How the failures of 9/11 brought the idea of red teaming to the forefront
- How Prussians developed the idea of war games and the early concept of red teaming
- How red teaming has greatly enhanced America’s defense systems
- The ways that businesses — large and small alike — use these concepts
- What are the biggest errors people and organizations tend to make in their decision making?
- What Adam Smith got wrong with his rational choice theory
- The biases and heuristics that lead us astray
- Using red teaming ideas in your personal life
- What is a key assumptions check?
- The power of red teaming in a group
- What are the four ways of seeing?
- What’s a pre-mortem?
- How do you bring up contrarian views without stepping on toes?
Resources/People/Articles Mentioned in Podcast
- How to Get Better at Making Life-Changing Decisions
- Tools of Titans
- How to Lead an Unstoppable Team
- Red Cell
- The 9/11 Commission Report
- The Iraq troop surge of 2007
- Using Mental Models to Make Better Decisions
- How to Think Like a Poker Player
- Rational Choice Theory
- Thinking, Fast and Slow
- Availability heuristic
- Abilene paradox
- The Science of Insights
- Plans Are Useless, But . . .
Connect With Bryce
Listen to the Podcast! (And don’t forget to leave us a review!)
Listen to the episode on a separate page.
Subscribe to the podcast in the media player of your choice.
Recorded on ClearCast.io
Listen ad-free on Stitcher Premium; get a free month when you use code “manliness” at checkout.
Brilliant Earth is the global leader in ethically sourced fine jewelry, and THE destination for creating your own custom engagement ring. Get a FREE surprise gift when you buy an engagement ring and shop all their selections at BrilliantEarth.com/manliness.
Zicam. Other cold medicines only mask cold symptoms, but Zicam is homeopathic and clinically proven to shorten colds when taken at the first sign. Visit Zicam.com/manliness to receive a $2 coupon on your next purchase.
Proper Cloth. Stop wearing shirts that don’t fit. Start looking your best with a custom fitted shirt. Go to propercloth.com/manliness, and enter gift code “MANLINESS” to save $20 on your first shirt.
Click here to see a full list of our podcast sponsors.
Read the Transcript
Brett McKay here, and welcome to another edition of the Art of Manliness Podcast. We live in an age of disruption. Companies that were once stalwarts are overtaken by small, plucky upstarts. Our personal lives can also be disrupted: we lose a job, or a business fails. My guest today says that instead of waiting to be disrupted by outside forces, you’re better off using techniques developed by intelligence agencies and the military to disrupt yourself first.
His name is Bryce Hoffman. He’s the author of the book Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. We begin our show discussing what red teaming is and the history of its development, from war gaming by 19th century Prussians to more sophisticated techniques developed by the US military during the war on terrorism. Bryce and I discuss the hidden biases that red teaming is designed to counter, and then get into specific red teaming techniques you can start using today to challenge your assumptions, stress-test your strategies, identify unseen threats, and make better decisions in both your personal life and your business.
After the show’s over, check out our show notes at aom.is/redteaming. All right. Bryce Hoffman, welcome to the show.
Good morning, Brett. Thanks for having me on.
So, you are the author of the book Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. Now I’m sure some of our listeners have heard of this concept of red teaming. It came out of the military; we’ll talk a bit more about that. But for those who aren’t familiar with red teaming, what is this concept?
Red teaming, at base, is really a system for confronting hard truths that hold us back from moving forward in the best direction possible. It was developed, as you said, Brett, by the military and the intelligence community as a result of the failures in intelligence that led to the terrorist attacks on 9/11, and, in the case of the US Army, as a result of the really faulty assumptions that the army came to believe were responsible for turning what seemed to be easy victories into long, protracted counterinsurgencies in Iraq and Afghanistan.
So, it was designed intentionally to help these organizations challenge their own thinking, deliberately try to poke holes in their own plans to make their plans better and to make better decisions in the future.
So, red teaming became systematic in the aftermath of 9/11 and during the Iraq and Afghanistan wars. But there’s a history that goes back further than that in the military. When did we start seeing proto red teaming in militaries?
So, the origins of red teaming in the military really go back to the Prussians in the 1790s, after they had been defeated by Napoleon, which, if you think about it, was a really big deal for the Prussians, because they considered themselves to be kind of the most badass country on the face of the earth at that time. And so, the fact that they were defeated, not just by the French, but by a French corporal, was really humiliating for them. And so, they decided they weren’t going to take it, that they were going to wait for the right opportunity and attack Napoleon again, but they knew they’d only get one chance.
So, they did two things that are really important, that kind of set the stage for red teaming. One is they recognized, and this is important, this goes to what I was talking about, Brett, about confronting hard truths: the Prussians realized that none of them was equal to Napoleon, that none of them was as good a general as Napoleon was. But what they then did is say, “Well, you know what? None of us is as good as Napoleon, but Hans over there is just as good at logistics as Napoleon is.
And Fritz is as good at artillery strategy as Napoleon is and Gunther over there is as good at cavalry tactics as Napoleon is.” And so instead of having one general lead their army, they found the best general in each of the key areas that a general needed to be adept in and created a team of generals, what they called the general staff to lead their military. And they told their King, “We’re not going to have one general in charge. We’re going to have this group in charge. And together, we’re going to be as good as Napoleon.” And they were right.
And that concept was so successful that militaries all over the world adopted it, and it’s still in use today in pretty much every country. So, that’s a core red teaming concept, because it goes to this idea that all of us are smarter than any of us, that the wisdom of the group is greater than the wisdom of any individual, unless that individual happens to be Napoleon, or in a business example, perhaps Steve Jobs. So, that was the first thing that happened. The second thing is they recognized they were only getting one chance at this and they’d better make it count.
So, to make sure it did, they prepared their strategy to fight Napoleon, and then they divided themselves into two groups and set up a tabletop exercise with little wooden pieces to represent all the different units and the terrain and so on. And they fought battle after battle on the tabletop, with one half of the Prussian officers playing the Prussians and the other half playing the French, trying to figure out how the French could defeat the Prussian strategy.
Now, if you had Crayola crayons as a kid like I did, you will remember that there was a cool color called Prussian blue in your Crayola set. The reason it’s called that is that the Prussians in the 1790s wore these kind of spiffy blue uniforms that the color is named after. So, they were the blue team, and since they were planning their fight against the revolutionary French, they painted the pieces for the French red and called them the red team. So, that’s where this idea comes from: taking a group of your own people and deliberately trying to defeat your own strategy, to poke holes in your own plan.
They called it Kriegspiel, which in English is war gaming, and that also spread throughout the world and is still used today. So, the type of red teaming that we’re talking about, the formal name for which is decision support red teaming, takes that same approach and uses it not just to figure out how the enemy is going to react, but to consciously assault your own assumptions to make sure they can withstand rigorous scrutiny, because in doing so, you and your organization can make better decisions.
Okay. So the US military obviously continued this tradition of war gaming after the Prussians came up with it, throughout the 19th and 20th centuries. And you said it was after the 9/11 attacks that intelligence agencies and the militaries decided, “We need to take war gaming and do something more, make it more systematic.” So, what’s the story there? Who were the organizations and individuals involved in creating this more systematic approach to red teaming?
So, two things happened in parallel, though at slightly different times. The first thing that happened, Brett, was on September 12th, 2001, literally as they were still pulling people out of the rubble of the World Trade Center and the Pentagon, CIA Director George Tenet activated, or probably more appropriately reactivated, a group within the CIA called the Red Cell. The Red Cell is the CIA’s red team. And Tenet told these folks, he said, “Look, we should have seen this coming.
We knew that there was about to be a terrorist attack on the United States, but we failed to connect the dots in time. And the reason we failed to connect the dots is not that we didn’t have the information, it’s not that we didn’t have the intelligence…” Because if you think back to the 9/11 Commission, some of the revelations that came out of that were pretty fascinating. I mean, things like the director of a flight school in Florida calling the FBI and saying, “Hey, just a heads up here, I’ve got this group of guys from the Middle East who want to learn how to fly jumbo jets, but they don’t want to learn how to land them, and they want to do all their simulator training over New York City.”
Things like that, which the intelligence agencies had but didn’t piece together. And so, Tenet said, “What I want to do is have this group, the Red Cell, take everything that we believe to be true every day. Look at all of our intelligence assessments and try to argue that the opposite is true. Try to argue that we’re wrong. Try to poke holes in what we have concluded based on the intelligence that we have. Not because we are necessarily wrong, but because, A, we could be, and maybe you’ll figure out the correct answer, or B, even if we’re right, by stress-testing our conclusions, you’ll make them stronger.”
So, the Red Cell got to work, and obviously the work of the Red Cell is highly classified. But one thing the CIA has said publicly is that the Red Cell’s work is directly responsible for having prevented a number of major terrorist attacks on the scale of 9/11, if not larger, since 2001. The other thing the Red Cell did, which is more in the public domain, is that a few weeks after this, they started creating a daily document called the Alternative Intelligence Assessment.
As your listeners probably know, every day the president of the United States gets a black book, I believe with about six pages in it, of what’s called the daily intelligence assessment. It’s a breakdown of what’s happened in the world in the past 24 hours, what the CIA believes it means, and what options the president has for responding, based on the conclusions the CIA has drawn from that intelligence.
So, the Alternative Intelligence Assessment was a one-page document slipped into the back of that book every day that said, “You know what, Mr. President? You’ve just read what we believe is going on in the world and what you can do about it. But we might be wrong, and if we’re wrong, here are the other ways you could interpret these events and some other options you might consider.” Now, it’s worth noting, Brett, that both President Bush and President Obama said publicly that they found the Alternative Intelligence Assessments to be among the most important things they looked at every day.
President Trump, about two weeks after he took office, said he found it confusing and asked for it to be removed from his daily briefing. So, the CIA still prepares it; they just don’t give it to the president. That’s one thing that happened, with the CIA. Now, a couple of years later, the US Army had a similar epiphany. See, they thought they’d won the war in Iraq, and then suddenly they found that they were losing it. And that created a real crisis in the leadership of the US military, and the leadership of the US Army in particular, that said, “How did this happen?
How did we so easily win this war and now find ourselves locked in this insurgency that we’re actually losing?” And so, the then-chief of staff of the US Army, General Schoomaker, a former Green Beret and a great American, said, “We’re going to set up a lessons-learned team to figure out how this happened, how we got in this mess, and how to make sure we never get in this mess again.” And what this team concluded very quickly, Brett, was that we had become victims of our own success.
By we, I mean the US military had become victims of our own success. The military had so easily won the war in Iraq in the early 1990s, the first Gulf War, with so little cost, and they’d so easily won the war in the Balkans in the later part of the 1990s, again with so little cost, that they had concluded that, because we had this immense mastery of information, because of spy satellites and drones and all this stuff, and because our weapons were so superior to anyone else’s, we had basically become invincible.
And they really believed that. If you go back and look at the stuff that was written in the Pentagon in the run-up to the invasion of Iraq, people really believed that they were invincible. And then suddenly they found they weren’t. And so, the lessons-learned team made several recommendations, but one of them was to recognize that a lot of the problem was not just that we thought we were invincible, but that a lot of the assumptions we made, because we thought we were invincible, were wrong.
And so, they recommended creating a team at every level of the US Army whose job would be to take every strategy that was developed and try to stress-test it, try to break it apart, try to figure out what could go wrong with it and how to make it better. And they called this a red team. They came up with a formal system of tools and techniques, and they started training senior officers in these tools and techniques so that they could go and do this to make sure this sort of thing didn’t happen again. They even created their own school.
They called it Red Teaming University. Formally, it was named the University of Foreign Military and Cultural Studies, because they didn’t want our enemies to know what it really was. And they set it up at Fort Leavenworth in Kansas to train officers in these techniques. I became the first, and still the only, civilian from outside government to go through the red team leader training program there. That’s how I learned about this.
And did red teaming influence or change strategic decisions the army made in regard to Iraq and Afghanistan?
It really did. And again, a lot of what came out of red teaming is classified, but one example that was very public, and that shows both the real opportunity and the real challenge of red teaming, Brett, was the surge. General Petraeus was in charge of Fort Leavenworth when red teaming was set up, and he was an early advocate for it. So, when he was put in charge of the war in Iraq in the mid-2000s, he used red teaming techniques to come up with the idea of the surge. Red teaming is really about contrarian thinking, about looking at things differently.
So, the president had said, “We want to pull out of Iraq,” and by using red teaming, Petraeus was able to determine that the only way we could pull out of Iraq was to send more troops into Iraq, to get the situation to the point where we could safely pull out, for ourselves and for the Iraqi people. And so, the surge was given the green light, and it went forward and was very successful. It really changed everything pretty quickly. It dramatically reduced the violence in the country, the number of bombings, the number of terrorist attacks.
So, the surge was really working the way it was intended. But the problem is that the politicians in Washington saw that it was working. And even though General Petraeus had said, “We can’t just wait until it starts to work. We’ve got to keep this in place for a period of time to ensure that the situation really stabilizes in Iraq before we pull out,” they said, “No, this is close enough. Let’s pull out now,” and then it started to fall apart again.
And so, Brett, what this really illustrates is one of the challenges of red teaming, which is that if you don’t have the support of senior leadership, if you don’t have buy-in from senior leadership, it doesn’t matter how good the ideas you come up with are; they’re not going to work, because they can’t. And that’s what happened with the surge. So, even though a lot of what has been done with red teaming in the military is classified, the way you know it’s working is that very quickly after the US military started to implement it in the late 2000s, it spread around the world.
The British adopted red teaming, the Canadians adopted it, the Australians, New Zealand, even NATO ultimately adopted it, though they decided the idea of red teams was a little too confrontational, so they called it alternative analysis. But the point is, it was so successful that most of the countries allied with the United States have now set up their own red teaming programs, or red team training programs, and are using this across the world.
So, red teaming has been used by intelligence agencies and militaries to defeat terrorists and win wars. But then you started seeing companies using red teaming techniques as well. When did you start seeing that? What are some examples of private or civilian organizations using red teaming techniques?
Yeah, so Brett, before I decided to give up almost half a year of my life to go through the red team training program at Fort Leavenworth, I wanted to make sure that it wasn’t just me that thought this was a good idea, that others in business saw this as valuable and as something they would want to learn and use. So I talked to several friends of mine who are in senior leadership positions with some of the most disruptive companies in the world, the companies that other companies have really come to fear, because they’re so good at disrupting other industries.
And what I found out was that while none of them were really aware of this type of red teaming that the military and the CIA and others were doing, when I described the tools and techniques involved, they were really similar to some of the ways in which these companies approached their business. So for example, I have a friend who is in a fairly senior position at Amazon, and when I described red teaming and some of its tools to him, he said, “I’ve never heard of red teaming. I don’t think Jeff has ever heard of red teaming.
But what you described is very similar to some of the things that we do.” And he said, “We have an internal process to constantly stress-test our own strategies, to constantly challenge our own assumptions, and to really try to look at different parts of our business and try to disrupt them, to look at them the way a competitor who wanted to disrupt us would, so that we can disrupt ourselves before someone does it for us.” And he said, “A lot of times I go to conferences and I talk with executives from other companies, and they say, ‘How can we be more like Amazon?’
And I always feel like kind of a jerk, because the only thing I can think of to tell them is, ‘Well, you could start over,’ because this is so intrinsic in our DNA. It’s so core to what we are as a company, and it’s been part of our company since Jeff started it.” He said, “This is the first time I’ve heard of a systematic way that you could teach someone to think more like Amazon.” And when I heard that, it was really validating, and I heard it from other companies as well. I heard it from folks at Kleiner Perkins on Sand Hill Road, who said, “Some of these techniques you’re describing are very similar to the way that we vet companies for investments,” that sort of thing.
And when I heard that, Brett, I knew that this was really something that was valuable. It was worth taking the time to learn how to do, to write a book, to share with other people, and to set up a company to teach people how to do this.
So, some companies are doing this on their own, but have there been companies who, once they learned what the military was doing with this more systematic approach to red teaming, said, “Let’s do that in our company”?
Absolutely. A lot of our clients, because of NDAs, I can’t identify by name, but we’ve worked with companies in pretty much every industry, from aerospace and transportation to technology and telecommunications. One company I can talk about briefly is Verizon, which has really made red teaming a central part of its strategic planning process and has really figured out how to use these tools in a very effective way to evaluate every major strategy before it’s approved.
And it has been game changing for them. They have changed the direction of some of their major strategic initiatives as a result of what they’ve learned from their red teams, and it’s been really powerful. Another organization that has used red teaming, one of the first that we trained, was the Development Bank of Japan, which is Japan’s sovereign wealth fund. They wanted to use these tools to look at companies they were thinking of investing in, to make sure they were really putting their money into companies that could use it effectively, and that the ways those companies wanted to use the money would lead to the success they hoped to achieve.
So, there are a lot of companies that have been able to use this since the book came out, and it’s really spreading. There’s a lot of interest, like I say, not just in the United States but around the world. I just got back from a trip to Great Britain, where I talked with a number of companies and organizations over there about how to use red teaming to, like I said, disrupt yourself before someone else disrupts you. But it’s not just companies, not just large organizations, that can do this. Individuals can red team themselves as well, and that’s important to know.
And we’ll talk about that, because that was the big takeaway for me personally as I read this book: I can apply this to myself, to my own life, but also to my own company, my own business. We’ll talk about that here in a bit, but first let’s talk about the power of red teaming. You make the case that red teaming is really powerful in helping us overcome human biases when we make decisions, both as individuals and in groups. So, what are the most common biases or errors in thinking that humans make when they make decisions?
It’s a great question, Brett. And it’s really important to understand that the military and the CIA didn’t just come up with these tools on a whiteboard somewhere. They’re really based on science, primarily the research that’s been done over the past 40 to 45 years in cognitive psychology and human decision making by people like Dr. Daniel Kahneman, the Nobel laureate who wrote Thinking, Fast and Slow, his colleague Amos Tversky, Dr. Gary Klein, and others.
And what these scientists have seen and proven in thousands of experiments is that Adam Smith was wrong. What I mean by that, Brett, is that for the better part of the past 300 years, most economists and most people have thought that Adam Smith’s notion of what he called rational choice theory was the way people made decisions. And what rational choice theory holds, really simply, and it is a simple theory, is that we make the best decisions possible with the information available to us.
And that if we make a bad decision, it’s because we didn’t have enough information to make a better one, or because we were swayed by strong emotions like love or hate, or a really unhealthy obsession with tulips. And so economists thought, “This is how the world works. People do their best, and if they make a bad decision, they would have made a better one if they’d had more information.” But what Tversky and Kahneman and Klein and others have proven is that that’s not how people make decisions. That’s how we wish we made decisions.
In reality, people’s decisions, no matter how smart they are, no matter how well educated, how experienced, or how successful, are shaped by an array of biases, blind spots, and heuristics, which is really just a fancy word for mental shortcuts. And these things, which are kind of hardwired into our brains, skew our decision making in ways we’re just not aware of.
And we can talk about what some of those are. But it’s important to know that these things exist for a reason. These biases, these blind spots, these shortcuts exist because if you were a hunter-gatherer on the African savanna and you stopped to really think deeply about whether the lion approaching you was going to eat you or not, you probably wouldn’t survive to finish that analysis. So, our brains are wired to help us make really quick decisions when we need to.
But the problem is that we use that same approach to deal with today’s problems, which are much more complex and complicated than the ones people were encountering on the African savanna thousands of years ago. And that’s where we run into difficulty.
So, what are some of these heuristics or biases that you see frequently, ones that really cost organizations or individuals when they’re making a decision?
There are so many. One that I think is really endemic in business is sunk cost bias. Sunk cost bias is basically the tendency we all have, when we lose something, when we lose money in particular, to want to recoup our loss. And that desire is so strong that it can often make us do really stupid things that end up costing us more money in the long term. So, for instance, at the simplest level, sunk cost bias is why you see people hitchhiking out of Las Vegas because they don’t have enough money for a Greyhound bus: they’ve lost their money at the gaming tables and they keep betting in the hopes of recouping their losses until they’re left with nothing.
But it also affects companies that should know better. So, you see companies do things like build and open a new factory, and the factory is unprofitable. The factory doesn’t make money, because the workers aren’t productive enough, the equipment isn’t good enough, whatever the reason. And rather than saying, “You know what? We invested $250 million to build this factory. It’s really just been a boondoggle. Let’s cut our losses and move on.”
What do they do? They say, “Well, let’s put $50 million more into this factory and try to boost productivity.” Okay, maybe that works. But if it doesn’t, they keep pouring money in. They say, “Let’s do another $25 million. Let’s invest another $50 million.” And pretty soon, the $250 million factory has cost $700 million and still hasn’t turned a profit. So, that’s one bias that is really dangerous and that red team thinking is designed to overcome.
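The decision rule at work in the factory example can be sketched in a few lines: money already spent is gone either way, so a rational call compares only the incremental investment against the expected future return. This is just an illustrative sketch; the dollar figures and the `should_invest_more` helper are hypothetical, not anything from the book.

```python
# Sunk cost sketch: a forward-looking decision ignores money already spent
# and weighs only the incremental cost against the expected future return.

def should_invest_more(extra_cost: int, expected_future_return: int) -> bool:
    """Decide on further investment using forward-looking numbers only."""
    return expected_future_return > extra_cost

# The factory has already consumed $250M; that money is gone either way,
# so it never appears in the comparison below. (Hypothetical figures.)
sunk = 250_000_000
extra_cost = 50_000_000               # proposed productivity upgrade
expected_future_return = 30_000_000   # realistic forecast of added profit

print(should_invest_more(extra_cost, expected_future_return))  # prints False: cut losses
```

The trap Bryce describes is adding `sunk` to the calculation ("we can't walk away from $250 million"), which can justify pouring in money forever.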
Another one is the availability heuristic. The availability heuristic is really simple: we are much more aware of information we’ve just been given than information we know from the past.
So, if every day you’re seeing news stories about how great some new technology is, you’re much more likely to look at it favorably because of that recent information, despite the fact that you saw much more detailed information two years ago saying this technology had fundamental flaws, because that older information isn’t as readily available in your mind. And there are tons of these. There’s negativity bias, which is that we tend to recall negative or painful experiences much more strongly than positive ones.
So, if, for instance, we were successful with a particular business strategy for three years and then all of a sudden we blew it one time, we’re likely to stop pursuing that strategy because of that one bad experience and ignore the fact that in three of the past four instances it was incredibly successful, because the pain is stronger in our mind than the success. Like I said, it’s hardwired into our brains, and it affects our thinking on complex problems as well as simple ones like that.
So, those are examples of decision heuristics that lead us astray on an individual basis as well as in groups. But you also highlighted that our thinking can change and lead us astray when we get into groups, and one of the ones that stood out to me was the Abilene paradox. What’s the Abilene paradox, and how can it lead us astray?
The Abilene paradox, simply put, Brett, is what happens when we say yes but we mean no. It’s something anyone who’s worked in an organization is probably really familiar with: someone will pose a question, there’ll be a problem the group is trying to solve and no one has a good answer, and then someone will throw out an answer and say, “We could do X.” Now, everyone at the table thinks X is a horrible idea, but they ultimately do it anyway, because no one comes up with a better idea.
And the reason it’s called the Abilene paradox, and no offense to your listeners in Abilene, Texas, is that the psychologist who first identified it several decades ago was sitting at home with his family one weekend. It was a Sunday. It was hot, they had no idea what to do, and they’d been sitting around for several minutes trying to figure out how to kill the afternoon.
No one had a good answer. And then finally someone said, “We could always go to Abilene.” Now, no one in the room wanted to go to Abilene. Everybody apparently hated Abilene. I’ve never been to Abilene, but presumably it’s not a great place to visit on a Sunday afternoon. And yet after a few minutes, somebody else said, “Yeah, we could go to Abilene.” And then the next thing you know, somebody else said, “Yeah, that’s great. Let’s go to Abilene.”
And then everyone’s in the car, they go to Abilene, they have a horrible time. And on the way back, everyone’s grumpy and unhappy. And the conversation takes the turn that you probably expect it to, which is somebody says, “Hey, why did you want to go to Abilene?” And mom says, “I didn’t want to go to Abilene. I only said we should go to Abilene, because dad said we should go to Abilene.” And dad says, “I didn’t want to go to Abilene. I only said we should go to Abilene, because grandpa said we should go to Abilene.”
And grandpa says, “I only said we should go to Abilene, because I couldn’t think of anywhere else to go.” It’s a funny story, but it illustrates a big problem that companies deal with all the time, which is people agreeing to something they don’t really believe in. And that’s very dangerous. Even more dangerous is something people are probably a little more familiar with, which is groupthink. The pathology of every organization is that over time it will start to drink its own Kool-Aid and stop challenging its own beliefs.
And everyone will start aligning their thinking. That’s really dangerous, because as General Patton once said, if everybody’s thinking alike, somebody isn’t thinking. And that’s what red teaming and red team thinking are designed to do: make sure that everyone is not thinking alike, to promote divergent thinking so that you can converge on the best idea regardless of where it comes from in the organization.
By the way, Brett, a lot of the tools involved in red teaming were created by the military because they recognized that they existed in an incredibly strong hierarchical culture, and that the hierarchy of military culture prevented good ideas from being surfaced if they didn’t come from the most senior people in the room. So, a lot of the tools and techniques we teach are really designed to help people anonymously surface their ideas and let people evaluate those ideas independent of who surfaced them, so that the best idea wins regardless of where it came from.
Well, let’s talk about some of these red teaming techniques that organizations and people use. Your book, Red Teaming, is more about how businesses can use red teaming in a systematic, large-scale way. But as you point out in the book, you can also use these things on an ad hoc basis in your own small organization, or with yourself. So, let’s talk about one that stood out to me: the key assumptions check. What is that? What’s the goal of a key assumptions check?
This is a really important technique, the key assumptions check. It’s a little bit of a complicated technique, but simply put, it’s a way of consciously and intentionally challenging your own assumptions and making sure they’re strong enough to base your plan on. Now, it’s important to understand there’s nothing wrong with assumptions. We have to make assumptions in order to make any decision, make any plan, or develop any strategy.
Assumptions are essential to the planning process. The problem is that a lot of people confuse assumptions with facts. Simply put, a fact is something that is objectively true right now and that you can prove. It’s not something we hope will be true in the future. It’s not something we wish were true. It’s really true right now, and you can go out and show someone that it’s true.
So, if I say our company made $150 million last quarter, that’s true, unless our accountants were cooking the books, and you can go and get the financial report and see that it’s true. An assumption, ideally, is something that is not yet true but will be true in the future: a fact that’s not yet true, but will be. So, if I say we’re going to make $150 million next quarter, that’s an assumption. Even if I’ve got the most bulletproof quantitative analysis by the top folks on Wall Street telling me this is exactly how much money I’m going to make, it’s still an assumption, because it hasn’t happened yet.
So, ideally, like I said, assumptions are just facts that are not yet true, and we make our plans based on them. The problem, Brett, is that too often assumptions are really just wishful thinking. And so, a key assumptions check is about identifying the assumptions that underlie your strategy, plan, or decision, and then subjecting them to a series of questions designed to poke and prod them and make sure they don’t pop on close inspection.
Some of those questions are things like: Is this assumption based on biases or preconceived notions? That goes back to the biases and heuristics we just talked about. Is this assumption based on a historical precedent, and if so, is that historical precedent valid? Because a lot of times we make assumptions based on past experiences that aren’t really analogous. Other questions we ask include things like “What has to happen for this assumption to be true?”, which is something people often don’t think about.
And another, equally important one: if this assumption proves true, does it remain true under all conditions? So, those are the types of questions we ask. And the important thing is that it’s not just saying “check your assumptions.” It’s a very systematic process for checking your assumptions. I mentioned the Development Bank of Japan earlier. When I taught this technique to some of their senior leaders and showed them how to do a key assumptions check, one of them said to me during the break, “This is really important, because we have a written process for how we evaluate investment targets.”
And one of the steps in that process is to check the assumptions that the investment plan is based on. He said, “The problem is that the way it works in practice is this: we all sit around a nice conference table in our office in Tokyo, we get to that point on the checklist, and whoever’s running the meeting says, ‘Have we checked the assumptions that this investment plan is based on?’ And we all nod very earnestly at each other. Then we check that box and we move on to the next box on the checklist.”
He said, “The process you’ve taught us doesn’t allow us to escape that task, because you have to go and ask these specific questions.” So, that’s why these tools are as intense as they are: to really make people do the work, not just say they did the work.
So, on a personal level, you can use this if you’re making a decision like, “Should I buy a house?” There are some assumptions behind saying, “Yes, I’ll buy a house”: “I’m assuming I’m going to be able to get home insurance. I’m assuming the mortgage will get approved. I’m assuming I’ll have a job in the future so I can pay the mortgage.” That sort of check helps you make sure you’ve stress-tested the decision and can plan for contingencies where those assumptions don’t hold true.
Absolutely. What you’ve just pointed out is really important, which is that while these tools can be used in a very formal setting, they can also be used informally. Now, let’s be clear on what the difference is, though. If you do what you just said, that’s valuable. You’re much more likely to make a good decision by asking the questions you just asked than if you just said, “Hey, I want to buy a house. Sounds like a good idea. I’ve got enough money in my bank account for a down payment. Sure, let’s do it. I passed my credit check. We’re good to go.”
However, I just want to be clear: it’s not as good as having somebody else look at your assumptions and ask those questions, because, like we talked about, all of us, no matter how smart, well educated, or successful we are, can’t see what we can’t see. The Nobel Prize-winning economist Thomas Schelling said the one thing that no one can do, no matter how smart they are, is come up with a list of things that would never occur to them.
And that’s what you get from red teaming in a group: another set of eyes that looks at the problem, looks at the decision, and helps you see what you can’t see. So, that’s the difference. It’s still valuable, still incredibly valuable, to ask those questions as an individual. It’s even more valuable if you can do it with even a small group.
Well, on a personal level, one thing you can do to red team with someone else, on that house-buying decision for example, is have a personal financial advisor.
Consult them, and they can start saying, “Well, let’s think about these things.” They’ll start picking at it: “Well, let’s check out these things as well.”
Exactly. Find someone you trust to ask you those tough questions and to help you answer them honestly. That’s really what it’s about. But you know what? As we’ve rolled red teaming out around the world, we’ve found that there are companies like Verizon and the Development Bank of Japan and some of the other large companies we’ve worked with who’ve been able to set up red teams in their organizations, train people in these tools and techniques, and then let them really spend several days evaluating important strategic decisions.
And that’s incredibly valuable. But we’ve found a lot of other companies, a lot of other organizations, and a lot of individual leaders who say, “I really want to use these tools and techniques to make better decisions, but I don’t have the ability to set up a team. I don’t have the ability to get half a dozen people trained in these techniques. What can I do?” So, we figured out a way to modify some of these techniques and teach people how to do them individually, or by just grabbing a couple of people, going into a conference room, and using them on a less formal basis.
Like I say, it’s not quite as powerful as the formal process, but it’s still effective and it’s still better than doing nothing. It’s still better than not challenging your own thinking.
Another technique that can be used on an ad hoc basis, and that I thought could be potentially powerful, is the four ways of seeing. What’s that, and what’s the goal there?
That is a really cool technique, and it’s one I think it’s important for people to use. And you’re absolutely right, this is something you can do individually, and effectively. At the simplest level, four ways of seeing is something the US Army created after recognizing that, when putting together plans, they’d often failed to consider how those plans looked from other people’s perspectives.
So, I’ll give you an example. My instructor when I went through Red Team University, Colonel Kevin Benson, was literally the head of the team that planned the invasion of Iraq. So, he’d seen firsthand why we needed these tools. And when we were learning this tool, four ways of seeing, he told our class a story. He said, “I led the team that planned the invasion of Iraq, and we made a lot of assumptions about the Iraqi people, even though we never talked with any of them.”
And he said, “One of the biggest and most damaging assumptions we made, because it was so colossally wrong, was that we believed that inside every Iraqi was,” and these were his exact words, “inside every Iraqi was a little American just dying to get out.” And he said, “What that belief led us to do was decide things like: it’s okay that we’re going to black this country out and take out its power grid. Yes, people are going to lose their air conditioning in the desert in the middle of the summer. Yes, their refrigerators are going to stop working and all their food is going to spoil.
They’re probably not going to be able to get clean drinking water for a while. But they’re going to be so happy that we’ve freed them from the boot of Saddam Hussein that they won’t care, that they’ll be good with that.” He said, “As soon as you look at it that way, though, as soon as you ask, is that really true, you realize it’s just complete BS.” Right? Because anyone who’s familiar with Maslow’s hierarchy of needs knows that self-actualization is at the top of the pyramid, and things like food, water, shelter, and security are at the bottom of the pyramid.
And you can’t get to the top of the pyramid without a strong foundation at the bottom. So, in reality, if you don’t have food and water, you don’t really care much about whether you’re living in a dictatorship or a democracy. So, they created this tool called four ways of seeing to force themselves to look at assumptions, problems, and plans from the perspective of other key stakeholders, like, in the case I just gave, the Iraqi people. And the way it works, Brett, is really simple. You create a quad chart, a typical business quad chart, though it’s important to remember that on this one there’s no right box.
All boxes are equal. In the upper left-hand box, you look at how we view ourselves. Now, “we” could be your company, or it could be you as an individual. In the upper right-hand box, you ask, “How does X view themselves?” X could be anyone, any stakeholder you’re looking at. It could be a competitor, your customers, your trade union, or your boss if you’re doing this to figure out how to get a raise. Then in the lower left-hand quadrant, you write: how do we view them?
And in the lower right-hand quadrant, you write: how do they view us? And you spend some time filling these boxes out. What you get from this, Brett, is a better understanding of other people’s perspectives, their pain points, and their issues. And then you can craft your message, craft your plan, to address some of those if you want to make it more likely to succeed. So, if you’re using this at the simplest level to figure out how best to approach your boss and get a raise, by doing this [inaudible 00:43:47], you might find out, for instance, that your boss is under tremendous pressure to keep costs where they’re at.
However, your boss is also under tremendous pressure to increase sales in your department by 12% this year. So, you’re more likely to succeed if, instead of saying, “Hey, I’m a hard worker, give me more money,” you go to your boss and say, “Hey, look, I recognize that you’re under a lot of pressure to keep costs down, but I also recognize that you’re under a lot of pressure to increase sales. I’ve been working my tail off. I’ve been putting in extra hours to help you do that, and I’m willing to continue to do that. But to do that, I need a little bit more on my end as well.
So, if you will give me what I’m asking for, I’ll commit to you that I will help you achieve that goal next year. And here’s what I’ll do to do that.” So, you see, you’re tailoring your message to meet their needs, you’re speaking to their needs rather than your needs, and that can be really effective. That can be a really powerful technique.
Another technique you highlight in the book that I’ve used personally is a pre-mortem analysis. What’s that?
This is one of my favorite techniques, Brett. It’s a technique developed by Dr. Gary Klein, who I’ve had the pleasure of working with both while researching the book and since then. And it’s basically all about contemplating failure: answering the question “What’s the worst that could happen?” but not stopping there. Not just asking, “What’s the worst that could happen?” but then working backwards and looking at all the steps that would have to happen between that colossally bad failed state you’ve now envisioned and the present day.
So, when we do this technique in practice, we usually put a timeframe out there. We’ll say, for instance: the plan we’re looking at is going to launch on January 1st, 2020. Let’s assume it’s January 1st, 2022, and our plan has failed colossally. It hasn’t just failed to meet its targets; it’s actually caused real damage to our organization. Then you think about what that looks like. And then you work back and ask, “Okay, what are the steps that would have to happen from this moment, from January 1st, 2020 when we say yes to this strategy, to January 1st, 2022 when this colossal failure has occurred?”
The value in doing that is it shows you the things that could lead to a bad outcome earlier in the process, while there’s still time to avoid it. So, if you find, for instance, that one of the things that led to that colossally bad outcome was hiring additional staff and not giving them proper training, two things can happen as a result. One is that you now know you should make sure, before you approve the plan, that it includes adequate training resources for the staff you’re planning on hiring.
You can also put a flag out there in the future and say, “After we’ve hired these folks, let’s find a way of checking in after three months to make sure they’ve got the skills they need to be effective in their new jobs, so that we know they got adequate training.” So, it’s a way of identifying ways you could prevent that failure from happening, not just figuring out what that failure looks like.
So yeah, I’ve used that. You could use this on an individual level. A big decision that people often make is, “Should I quit my job and go all in on my business?” And yeah, you’ve got to have some optimism to do that. But it also helps to think, okay, let me ask this question: the business has failed. What happened to cause the business to fail? Then you start using your imagination, figuring that out, and creating plans to prevent those possible failure points from occurring.
Absolutely. By the way, this exercise is a lot of fun too.
No, it is a lot of fun, but I think one of the dangers with pre-mortem analysis is that if you’re neurotic, it can be unproductive if you’re not careful, because you start thinking like the sky is falling. You let that negativity bias hijack your thinking. But with a pre-mortem, you’re thinking worst-case scenario while also thinking of solutions to the problems you come up with.
Well, you raise a really good point, Brett, which is that when you’re doing your pre-mortem analysis, and really when you’re using any of these red teaming tools, the point is not that you think you have a lousy plan that’s going to fail. The point is that you think you have a good plan, but you want to make it better. And that’s really the point of red teaming: the job of a red teamer is not to make a better plan. The job of a red teamer is to make the plan better.
So, by using these tools, instead of courting disaster, you’re really trying to ensure success, and you have to approach your work that way both as an individual and as a group when you’re red teaming.
That leads to my next question, which is that I think a lot of people and organizations avoid red teaming because it is contrarian in nature and it brings up conflict. So, let’s say you’re doing this within a small group, like your small business or the group you belong to in a larger corporation. How do you bring up the contrarian views that come out of red teaming without them being rejected and without stepping on toes?
Another really good question. One of the things we stress, because it’s so important for successful red teaming, is that when we teach people how to red team, we don’t just teach them these tools and techniques. We teach them how to communicate the results effectively. Because if you don’t, it doesn’t matter how great the insights you come up with are. If the people who are going to be making the decisions can’t hear and act on what you’ve recommended, then your whole red teaming exercise has just been a collective navel-gazing excursion.
So, to avoid that, it’s critical, whether you’re red teaming in a group or as an individual, to approach the work of red teaming in a constructive and collegial manner, and to recognize that there’s a difference between being a skeptic and being a cynic, a difference between being contrarian and being critical. A skeptic doesn’t necessarily believe that the plan is bad; they just want to be shown the proof that it’s good. A contrarian doesn’t want to rip things apart for the sake of ripping things apart; a contrarian wants to look at things from different perspectives to make sure the problem has been examined from every facet. So, when you approach your work as a red teamer, you need to approach it with the mindset of helping the people who developed the plan make that plan better, not from the perspective of showing everyone that you’re smarter or more clever than they are, or that you saw what they missed.
It should be constructive and collegial. And if you’re doing red teaming individually, if you’re going to share your work, don’t be a jerk. That’s kind of our ground rule. It’s real simple: if you’re going to share your red team work, don’t be a jerk. If you’re a jerk about it, nobody will listen to what you’re saying. The first rule I learned when I was going through red teaming training in the Army was rule number one of red teaming: don’t be an asshole.
Because if you go to the group that has asked you to do a red teaming analysis and you say, “You know what? We looked at your plan and it’s really stupid. You guys failed to account for these three things. And by the way, what would happen if X happened? You didn’t think about that, did you? Well, we did, and here’s what would happen,” that will guarantee that, A, no one listens to your red teaming analysis, and B, you never get a chance to do another one.
The way to approach it is to say, “Hey, we looked at your plan, we looked at your strategy, and we think you guys came up with a really good plan at its base, but we see some key areas where it could be made even stronger. Here’s what those are.” So, it’s really about how you present your findings and how you approach your work: be constructive and collegial.
Another thing you pointed out in the book that was useful: let’s say you’re a leader and you have a group of people doing red teaming, coming up with contrarian views and showing weaknesses. That doesn’t mean you have to take their advice and put it into action, right? It’s just more information for you. You’re still the leader; you still get to make the decision.
Absolutely. Red teaming does not take away any decision-making authority from leaders, because red teams don’t make decisions. Red teams provide leaders, provide decision-makers, with additional information so that they can make better decisions. That’s really what red teaming is about, and it’s important to understand that it’s not about saying, “Here’s what you have to do.” It’s about saying, “Here’s another option,” or, “Here’s another way of thinking about this problem that you may want to consider before you make your final decision.”
That’s very different. And that’s about empowering leaders, not about taking away their decision-making authority.
And I think another thing, too, is that sometimes you’re red teaming your red team as well, because I can see a situation where a red team makes a suggestion or shows a leader contrarian information, but they don’t have all the information, right? There are other factors the leader is taking into consideration when making the decision that the red team didn’t even think about.
Absolutely. And that goes back to the point that red teaming is not about coming up with a better plan; it’s about making the plan better. So, you’re simply offering some additional observations, some additional insights. Maybe they’re not valid, maybe they are. The point of a red team is not to be right. It’s to make the organization think more deeply. The Israelis, for instance, have a red teaming organization in their military intelligence directorate called . . . I don’t speak Aramaic, but I’m told it translates into, “on the contrary, the opposite is probably true.”
And one of the things that’s been key to the success of this organization, and I’ve talked with Israeli intelligence officers who explained this to me, is that the people in this group are not rated by how often they are right and the organization is wrong. They’re rated on how much their analysis gets the organization to think more deeply about its own conclusions.
Well, Bryce, it’s been a great conversation. Where can people go to learn more about the book and your work?
So, the book, Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything, is available on Amazon or wherever books are sold. And you can come visit our website, redteamthinking.com. Redteamthinking.com will tell you more about red teaming, give you more resources, and also give you information about upcoming workshops we’re offering if you’re interested in getting trained in some of these tools and techniques.
Fantastic. Well, Bryce Hoffman, thanks for your time. It’s been a pleasure.
Likewise, Brett, really enjoyed the conversation.
My guest today was Bryce Hoffman. He’s the author of the book Red Teaming. It’s available on amazon.com and in bookstores everywhere. You can find out more information about his work at his website, brycehoffman.com. Also check out our show notes at aom.is/redteaming, where you’ll find links to resources that let you delve deeper into this topic. Well, that wraps up another edition of the AOM podcast. Check out our website at artofmanliness.com, where you can find our podcast archives.
There are thousands of articles we’ve written over the years about personal finance, how to be a better husband, a better father, you name it. We’ve probably covered it. And if you haven’t done so already, I’d appreciate it if you’d take one minute to give us a review on iTunes or Stitcher. It helps out a lot. If you’ve done that already, thank you. Please consider sharing the show with a friend or family member who you think would get something out of it.
And if you’d like to enjoy ad-free episodes of the AOM podcast, you can do so with Stitcher Premium. Head to our Stitcher Premium signup and use code MANLINESS for a free month trial, and you can start enjoying ad-free episodes of the AOM podcast. As always, I appreciate the continued support. Until next time, this is Brett McKay reminding you to not only listen to the AOM podcast, but to put what you’ve heard into action.