The following is a conversation between Olivier Sibony, Author of You’re About to Make a Terrible Mistake! How Biases Distort Decision-Making and What You Can Do to Fight Them, and Denver Frederick, the Host of The Business of Giving.
Denver: We make decisions all the time, yet even the smartest and most experienced among us make frequent and predictable errors as a result of all too common cognitive biases. What are some of those biases? And which ones have come into play as government and others make decisions about the COVID-19 pandemic? To find out, it’s a pleasure to have with us, Olivier Sibony, the author of a fascinating new book called You’re About to Make a Terrible Mistake! How Biases Distort Decision-Making and What You Can Do to Fight Them.
Welcome to The Business of Giving, Olivier!
Olivier: Good morning, and thanks for having me!
Denver: Let me begin by asking you: What is a cognitive bias?
Olivier: A cognitive bias is a predictable distortion of our thinking. It’s a predictable error. It’s a way in which we go wrong, which we can anticipate. It’s not a random error. Basically, when we make a mistake and someone could have said in advance, “Hey, you’re about to make a mistake here,” which is the title of my book, that’s a bias.
Denver: Are there some common misconceptions we have about these cognitive biases?
Olivier: Misconceptions about the biases? You mean errors that we make in thinking about those errors? Yes. There are some. One frequent misconception is that we think we can just learn the list of those biases and avoid making them. If only it were that simple!
Denver: If only!
Olivier: If we could just, for instance, read my book and never make those mistakes again. That would be very good. Unfortunately, I don’t promise that because that is not possible. We are going to have those biases, and we’re going to keep having them despite the fact that we know about them. That’s what actually makes them interesting. And so, if we want to avoid their consequences, we need to rethink the way we make decisions, not just be aware of the biases. It helps to start there, but that’s not enough.
Denver: So let’s talk about decisions. I make a decision that turns out pretty good. I end up with the successful outcome that I wanted. Would that qualify as having made a good decision?
Olivier: Not necessarily. In fact, one of the mistakes that we make when we think about our decisions and our results is that we suffer from something called “hindsight bias”: in hindsight, we think that we made the right decision, in your example, because we were successful, or the wrong decision, because we were unsuccessful.
It may be the case that you made all the right calls, and you did the right thing with the information you had, but because you took risks — and it’s good to take risks and we need to take risks — and because the world is an unpredictable place, stuff happened that you couldn’t anticipate.
Say you were opening a restaurant six months ago, and you had done everything you could to make it a success: a fantastic chef, a great menu, a fantastic guest list for your opening nights. And you opened it the night before COVID locked down your city. Well, did you make a bad decision? Obviously, you didn’t have a very good outcome. You’re in deep trouble now, and I feel your pain. But I can’t say you made a bad decision because… what could you have done differently? How would you have known?
Olivier: So we should be careful not to mistake a bad outcome for a bad decision or a good outcome for a good decision. It’s a bit different.
Denver: That’s a great distinction. Well, the COVID-19 pandemic has provided, I don’t know, a case study of sorts to see many of the biases that you discussed in your book.
So, first, what took us so long — and I’m speaking specifically about Europe and America — to realize what we were actually facing at the beginning of the COVID-19 pandemic, and why did the Asian nations just jump on it?
Olivier: It’s actually a very interesting case study. There’s more than one reason for everything, just to be clear, but the one factor that is clearly at play here is what I call the pattern-recognition biases, or what you might call the mental-model biases.
When the Asian countries saw this thing, this bizarre unknown thing, which wasn’t even called SARS-CoV-2 at the time, when they saw it, they thought, “This is unknown, but it looks a lot like SARS. Not as bad as SARS, not as lethal, but it looks like it. Gee, let’s not make the same mistake we’ve made before in underestimating the risk. Let’s be really, really careful. There is no danger of overreacting. There’s a danger of under-reacting.”
European countries, on the other hand, and to a lesser extent, the US… or actually in different regions of the US, you get different responses, so it’s a little more complicated to read. But in Europe, the dominant reaction at the beginning was “This looks like the flu. Yes, it’s a bit more severe than the flu, but basically, it’s a flu. And we’ve had epidemics of flu before.”
And we have overreacted before. There was an epidemic about 10 years ago where we overreacted. It was H5N1 or H1N1, and we went and bought millions of doses of vaccines and started vaccinating the whole population. And thank God, the whole thing fizzled, and it didn’t turn out to be a big pandemic. And the politicians were accused of having overreacted, of being in cahoots with the pharmaceutical industry to vaccinate the entire population. There was a big conspiracy theory around it.
So in retrospect, thinking about this event back six months ago, a lot of people were thinking, “Well, let’s not make that mistake again. Let’s not overreact.” And you could hear not just the politicians, but a lot of doctors here saying, “What’s dangerous about this is the overreaction of the politicians. This is just the flu. Let us take care of it.” So they could have been right, by the way. At the time, we didn’t know if it was closer to SARS or closer to the flu, or as it turned out, somewhere in between.
What’s interesting in this example is that because of different experiences, you look at the same facts and you have very different responses, and that’s very typical of how we make our decisions in general.
Denver: Explain what’s going on when people continue to insist that the virus won’t happen here. Now, we in America thought that when it was ravaging Europe. You said, “Oh, it’s never going to come over here.” And even in this country, Olivier, when it was in New York, the people in Texas were saying, “Well, that’s New York. It’s never going to come down here.” What is going on?
Olivier: And I could tell you, Denver, when it started in Italy in late February-early March, the people in France were saying, “Well, it’s Italy. It won’t happen here.” And yet you would be hard-pressed to think of a country that is more similar to France than Italy, in just about every possible respect. It’s very hard to think of a country that is more comparable.
And yet, when you talk to people at the time, they said, “Well, Italy…It’s different. It’s a little bit messier out there. They are not as well organized as we are. Well, we’re not the Germans, but they are the Italians. And, of course, their health system, it’s OK in the North, but it’s not that great in the South. And you know those Italians, they have all those multi-generational families with the grandparents living under the same roof with the kids. And of course, their population is older.”
And some of those explanations may be partly right. It’s true that the average age in Italy is a bit higher than in France, but the difference is something like one or two years. It doesn’t make the difference between a country that is immune to a pandemic and a country that is ravaged by a pandemic. So the funny… or actually not-so-funny thing here is that when we see someone else facing a situation like this, we immediately look for the reasons why we are different. All those reasons why the Italians are different from the French, or in your case, why Europe is different from the US, or China is different from the US: all those reasons may be true.
I remember reading an eminent intellectual in the US writing that air pollution is so bad in China, that that probably explains why the Chinese have this pandemic. And, by the way, many more people smoke in China than in the US, and that probably makes them more sensitive to pulmonary diseases. All those things may be true. The thing is, if you’re going to make those comparisons, you should also look for the things that make you similar, not only for the things that make you different.
If you start counting all the things that make France similar to Italy, you will find a lot more things than you will find things that make France different from Italy, and that should make you worry that what is happening to Italy will happen to France. But that’s not how we think.
Denver: And I guess if you took that even further, you should look at the things in France that make it more susceptible than Italy. And we’ll never look at those.
Olivier: Never. Absolutely not. Very true. Yes.
Denver: Were there some other biases at play that left us so flat-footed and seemingly unprepared for what was about to happen?
Olivier: Yes. They’re all there. This is a sort of perfect storm of biases. One thing that is very, very hard for people to figure out, even during the second wave that we have in Europe now, is this: when there is a time lag between what you do and the results it produces, and when the underlying phenomenon is exponential, it is very, very hard to see what is coming your way. That is why, a month ago or two weeks ago, you still had lots of people in hospitals… I’m not talking about laypeople; I’m talking about professionals, people in hospitals in France, saying, “Well, so far so good. We’ve got things under control.”
Now, it doesn’t take a lot of computation. It’s well within the intellectual capability of any doctor to work out what happens if you’ve got something that doubles every two weeks, which is the rate at which it was propagating before we locked down again, let alone something that doubles every three days, which is what it was during the first wave.

Even if you are only at one-quarter of your capacity, after two doublings (four weeks at one rate, six days at the other) you are going to be saturated. It’s very simple, but somehow, people don’t make that calculation. They think that if we’re at one-quarter of the capacity, we’re fine. Well, if you are at one-quarter of the capacity, you’re going to be saturated in one month. It’s not a very long time, especially since whatever you do to slow down the transmission of the disease is going to take two or three weeks to produce results. So the speed at which an exponential phenomenon unfolds is completely counterintuitive and defeats everybody’s thinking.
On the social networks, I still read very smart people, bank presidents and CEOs and so on, explaining, “Well, if only we had added more ICU capacity. If only we had been able, during the six months we had to prepare for the second wave, to double the capacity, we wouldn’t have any problems. We could remain open.” Unfortunately, that’s evidently not true. If you don’t slow down the speed at which the disease is propagating, and the doubling time right now is two weeks, then doubling the capacity only buys you two weeks: you will have the same problem again in two weeks. But that is so counterintuitive that people just don’t get it.
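The back-of-the-envelope arithmetic Olivier is describing can be sketched in a few lines of Python. The figures here (one-quarter occupancy, doubling times of two weeks and three days) are the illustrative numbers from the conversation, not epidemiological data:

```python
import math

def time_to_saturation(occupancy_fraction, doubling_time):
    """Time until exponential growth fills the remaining capacity.

    occupancy_fraction: current load as a fraction of capacity (e.g. 0.25)
    doubling_time: time for the load to double; the result uses the same unit
    """
    doublings_left = math.log2(1 / occupancy_fraction)
    return doublings_left * doubling_time

# At one-quarter capacity, doubling every two weeks: saturated in 4 weeks.
print(time_to_saturation(0.25, 2))   # 4.0 (weeks)

# First-wave rate, doubling every three days: saturated in 6 days.
print(time_to_saturation(0.25, 3))   # 6.0 (days)

# Doubling ICU capacity halves the occupancy fraction, which buys
# exactly one extra doubling period (2 weeks), not lasting relief.
print(time_to_saturation(0.125, 2) - time_to_saturation(0.25, 2))  # 2.0 (weeks)
```

The last line is the counterintuitive point: adding capacity shifts the saturation date by one doubling time, whereas slowing transmission changes the doubling time itself, and with it the whole shape of the curve.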
Denver: That’s really a great point because we are wired to think linearly. And even if you can tell someone that this is an exponential problem and they get it, the solution that the bank president is suggesting is a linear solution to an exponential problem. And it’s going to catch you eventually, and it’s going to catch you pretty fast.
Olivier: Absolutely! And yet everybody seems to think, “Why didn’t we double this thing? Why don’t we double capacity?” As if it would have solved the problem!
There’s another one, by the way, which is very striking in this whole pandemic: when you hear that there is danger out there, you need to change your behavior. Put yourself about six weeks ago in Europe: we’re not in the second wave yet, but we see it coming. If we think about the exponential, we see this problem is going to happen. That is the time when we should change our behavior.
What makes us change our behavior? Do we feel the risk? Do we feel the danger? Well, what do we do, then? We look around, and we see everybody behaving as if there is no danger. Now, we tend to have a bit of a herd behavior in situations like this. When everybody is behaving as if there’s no danger, we behave as if there is no danger, which given that this is a contagious disease, creates the danger.
Oddly enough, now that the streets are empty and deserted, it feels vaguely scary, and you hurry back home. When in fact, when you’re on the street, you are at no risk at all because there is no one there. But somehow the herd behavior makes it very easy to remain in lockdown when you’re in lockdown and makes it very hard to behave prudently when you’re not in lockdown, which is another tricky balance to strike for the people who try to manage this public behavior.
Denver: Yes, it’s sort of an imitation bias. Look at the situation we had here in this country when the White House held an event in the Rose Garden, which turned out to be a super-spreader event. People who had worn masks every single day up until that event took their masks off that day because they looked at the social pressure around them and said, “No one’s wearing a mask. I’m not going to be the only one. Probably the president doesn’t want me to,” and a whole bunch of them got COVID.
But it was that imitation bias of, as you’ve just suggested, you look around, you see what everybody else is doing, and you tend to fall into line.
Olivier: That’s a great example because you’ve got another factor there, which is the social pressure, and in this case, the hierarchical pressure. If the president is your boss, and the president has been on the record and has made a big political point of not wearing masks, you send a message. You make a statement by wearing a mask in the company of people who are not, and vice versa.
In organizations in general, not just in politics, that’s another big source of error: when you basically line up with the opinion of the boss despite the fact that you may have your misgivings about what the boss is saying.
Denver: Maybe you can help with this one, Olivier, and that is what is at work here with some of the irrational beliefs and conspiracy theories that people have about this pandemic?
As an example, there have been some nurses who have reported that their patients who are about to die utter with their literal last breath that this whole pandemic is a hoax. What is going on?
Olivier: Well, that’s an extreme example. I hadn’t heard that one.
Denver: We had it on the news last night, and these nurses were just devastated.
Olivier: I don’t know, frankly, what can lead someone to be saying that with their literal dying breath. I suppose it’s mostly denial that is at work there. Now, before we get to that dire extremity, why do people believe all sorts of crazy stuff in general? And especially about this pandemic. But it could be about whether there’s been massive voting fraud or just about any topic in the news. How can people believe in conspiracy theories of all kinds?
I think the obvious driver here is the fact that people get their information, get their news, from a very restricted set of sources, which is self-selected either through the media that they choose to follow or, more frequently, through the social media that they choose to connect with, where they are surrounded by people who are like them in every possible respect.
This is partly intentional because people have friends who are like them, and partly unintentional because the social networks connect them with people who think like them and present them with the posts and the opinions of people who think like them. And so you can be exposed almost exclusively to “information” — in quotes information because it’s not information — to claims that support your false beliefs.
Now, I can’t blame someone who is exposed only to fake information for believing that fake information. That is actually not a cognitive bias. It’s not an error to tend to believe everything that you are faced with when you are not faced with contradicting information. That is fairly predictable and fairly rational. So, as long as we don’t change the system by which we are exposed to news, if we don’t actually go out of our way to look for information that we don’t like, we are in trouble.
It used to be easy because we used to be exposed to networks and newspapers that had basic journalistic rules that said, “You need to give equal air time to various points of view–“
Denver: Yes. The fairness doctrine. Right.
Olivier: The fairness doctrine and so on. That used to go without saying; you didn’t have to make an effort for it to happen. Now, if you don’t make that effort, you will be locked inside a bubble of what could be fine information and what could be completely fake.
Denver: Yes. So many of us are living in echo chambers, and then groupthink takes over. And I think there is also this need for people who want to feel part of a community or a part of that “in” group that you were talking about before, so it just reinforces all of that.
When the future is so uncertain, filled with so many unknown variables, many people continue to go ahead and make these bold predictions with a tremendous degree of certitude, even if their last three predictions have been completely wrong. It doesn’t slow them down at all. What is happening there?
Olivier: There are several things. From the perspective of the people who are making those predictions, what you see is overconfidence, which is nothing unusual. It’s more extreme in periods of uncertainty, because people should be less confident, and uncertainty typically does not make them less confident. It tends to make them more confident.
The interesting question though is: Why do we listen to that? Why do we keep listening to people who have been proven wrong many times? Why do we still see on TV and listen to people who were, back in March, telling us that, “Oh, this is just the flu. It’s going to go away in three weeks.”
Denver: Sometimes I feel that. Why do I listen to the weatherman? He’s been wrong the last four days in a row, but I continue to listen to what he has to say.
Olivier: Well, the weatherman is not that bad. The weatherman gets it right. The weatherman has made a lot of progress in the past few decades.
But we crave that sort of certainty. We love to listen to someone who tells us, “Oh, don’t worry. It’s going to go away,” or someone who, in fact, tells us, “Worry a lot. Be afraid, be very afraid. It’s going to be terrible. Hunker down.” Because at least we know what to expect. We hate uncertainty. We hate ambiguity. We’re prepared to pay to avoid ambiguity. And we would much prefer to listen to someone who asserts with a great deal of certainty something that has a high probability of being false than to listen to someone who says, “You know what? I don’t know. I don’t have a clue.”
Denver: Interesting. Yes. We want to have a sense of control, even if it’s not correct. We feel more comfortable thinking that we know what the future is going to bring.
Olivier: Exactly. And we want to trust people who give us a sense of knowing where they’re headed. We want to have leaders who tell us, “This is what the future is going to look like, and we’re preparing for it. Trust me and follow me.” That’s a lot more reassuring, at least at the beginning, than having leaders who tell you, “Well, I don’t know what the future is going to be like.”
So we like to follow people who tell us what the future is going to be like until, of course, the future turns out not to be what they had predicted, which is when they say, “Oh, they weren’t the right leaders after all.” And that’s when they’re in trouble.
Denver: Circling back to the beginning of our conversation about trying to overcome your biases, and as difficult as that may be, if not impossible, even after reading your book, let me ask the question nevertheless: How can a person improve their decision-making process to avoid these biases, at least to the degree that they possibly can?
Olivier: Well, a person alone can only do a limited number of things, unless that person is, as most people are, part of an organization of some kind, part of a decision-making process of some kind. And you don’t have to be a CEO: if you’re the leader of a team, and you make decisions for that team and together with that team, you’ve got to work with the team to make your decisions.
Now, to be clear, that doesn’t mean that you are delegating the decisions to them. That doesn’t mean that you’re abdicating your responsibility. That doesn’t mean that the decision is going to be made by a committee or by a vote or by consensus. What it means is that you’ve got to make a special effort to elicit the opinions of the people you have on that team, and to make sure that you hear their voices and their judgments.
And you probably think you do; you probably think that you’re a great boss and that people are not afraid of speaking up. Every boss I know thinks that, and many of them are wrong. They underestimate the extent to which they trigger groupthink in their team. One very simple example. When you run a meeting with your team, how do you ask people to give you their views? If you do what most people do, which is to go around the table and to ask people to speak in sequence and to tell you what they think, you are engineering groupthink.
By the way, the first person who speaks probably agrees with you and is going to say what you want to hear or what you’ve telegraphed that you want to hear, because you may have signaled unwittingly or not what you thought the right answer was. The second person, after hearing the first person, is going to be influenced a little bit, and the third person is going to be influenced by the first two. And you’re going to come up with this great consensus–
Denver: Oh, absolutely!
Olivier: –which builds the enthusiasm of the entire team for implementing the decision. What you will not have heard are the three important reasons why some of the members of your team would have disagreed because you didn’t have a good process to hear their voices.
So in things as simple as the way you have people speak in a meeting, you need to think hard about the process for making them speak up. One thing you might do for instance, in a situation like this is simply ask them to write down what they think before they speak up. Now, it doesn’t take a lot of time. It takes a couple of minutes, but it makes a big difference to how people are going to be willing to express themselves.
Or you could do what the psychologist Gary Klein calls a “premortem,” where you imagine that you are in the future and this decision has failed, and you’re doing a postmortem of the decision and you ask people to speak up about what went wrong in this imagined future where you failed.
Or you could ask people to tell you the reasons why they think it’s a good idea, whatever you’re proposing, and the reasons why they think it’s a bad idea, and ask everybody to give you a nuanced balance-sheet view of the proposal.
I’m just giving you two or three examples here to illustrate that there’s a large number of ways or practical techniques you can use to improve the quality of your deliberation and your debates, and that is going to make a big difference to the quality of your decisions. But it won’t happen just because you’re a nice guy.
Denver: Right. You do a wonderful job in the second half of the book talking about decision-making architecture. And in so many of the meetings that I’ve been in, the CEO says, “Let me share with you what my thinking is on this,” and then basically sets the anchor right there. So nobody else is going to go more than three degrees to the left or right of that anchor, but it comes across as “I want to get everybody’s opinion.”
Well, once you do that, there aren’t many opinions that are going to stray too far. So you’re almost looking for the outlier opinion, the one who’s the most against consensus, and have them speak first because that’s how you get to the better decisions, I guess.
Olivier: At least if you want to get to the right decision. What the CEO is doing in the example, in the very typical example that you’re describing, Denver, is actually not trying to hear people’s views on the decision. He’s trying to get people to voice their support for the decision so that he gets their commitments to its implementation.
That is necessary, too. There’s a time for that. If you are at the time before that time, when you are trying to decide whether it’s the right decision, you want to get them to speak up. And if you confuse one thing with the other, you run the risk of confidently and unanimously making a very bad decision.
Denver: I’ve got friends who work in data analytics departments, and they will tell me that leadership will come to them saying, “Give me some information on this to support my decision.” So the decision is leading the data; the data is not leading to the decision, because “we’re going to do what we’re going to do anyway, but I need some cover.”
Olivier: Of course, and we all know that if you want to support a decision or whatever it is, you will always find the data to support it. You just need to be selective in what data you look at.
Denver: Exactly right. Finally, Olivier, there’ve been so many decisions that have been made in connection with this pandemic in every country across the world. What would be some of the lessons that executives can take from this that will help inform their leadership?
Olivier: I have been struck by how different styles of leadership have played out in this crisis. And interestingly, there is some research now showing that female leaders have been more effective than male leaders when you compare country to country, and even in the US, state to state. And at the risk of stereotyping here, I think there’s a greater willingness among female leaders to position themselves as positive, caring, committed to success, but not necessarily all-knowing and not necessarily all-powerful, and to some degree, vulnerable and willing to admit that they cannot tell what the future is going to be like.
I was especially impressed by one US governor actually, Gina Raimondo, who was saying in an interview at the beginning of the pandemic, “I’m going to make mistakes. All I can tell you is that we’re going to make the least bad of the possible decisions. They’re all bad. And all I can promise is that when we realize that we have made a mistake, we will change.” Now, that is a language of honesty that I haven’t seen in many, many leaders anywhere and especially not in Europe, and not, by the way, in businesses either. It’s not just political leaders. I haven’t seen in many leaders that sort of openness.
The beauty of speaking like this, by the way, is that you will be wrong. And so if you’ve told people that you will be wrong, and that when you’re wrong, you will change your mind, when you do change your mind, it won’t come across as you being indecisive or flip-flopping or having been wrong the first time. It will be what you promised. You promised that you would change as the facts change, and you’re doing what you promised.
I think that style of leadership, which acknowledges uncertainty, which recognizes ambiguity, and which is willing to admit the inevitable vulnerability that it implies, is the sort of leadership that we’re going to need more of in the future.
Denver: I think you’re absolutely right. And you know what it also suggests to me, Olivier, because I think that’s such a wonderful point: when a leader takes that tack, let’s say in an organization, it says to me, as one of the staff people there, “You need me; you don’t have this all figured out.” So instead of just being impressed by the plan, it’s “You need my help.” And all of a sudden, I’ve become an engaged employee trying to help the leader who doesn’t have the answer to everything. And that can change the dynamic dramatically.
Olivier: Absolutely. So not only will it make you more willing to contribute your opinion and to help, it will also make you more committed to the outcome. Because once you’ve been heard, once the leader listens to you and once you’ve been heard, even if your point of view is not the one that actually wins the day, you will be more committed to the outcome because you are part of the outcome.
Denver: The book is You’re About to Make a Terrible Mistake! How Biases Distort Decision-Making and What You Can Do to Fight Them. You would be making a terrible mistake if you didn’t get your hands on it. Thanks, Olivier. It was such a pleasure to have you on the show.
Olivier: Thank you, Denver.
Listen to more The Business of Giving episodes for free here. Subscribe to our podcast channel on Spotify to get notified of new episodes. You can also follow us on Twitter, Instagram, and on Facebook.