The bias I'm talking about here (I'm not quite sure what to call it) is the readiness to assume bias where possibly none exists. Or, more generally, the overestimation of the magnitude of a bias, or the attribution to bias of a phenomenon that can be explained more directly. I'm thinking specifically of
I'll echo the people above who say that it is often genuinely possible to retrospectively identify mistakes, hindsight bias notwithstanding. September 11 is a clear example of this, not only because there is evidence that the right people were warning about something like this at the time and were ignored, but also because the Bush administration's record on everything else is equally horrible, making it highly plausible that they failed here too. If an American city is destroyed by a nuke some time soon, it will be because the government is currently doing nothing to prevent it, and anyone who happens to still be alive to point that out after the fact will be right.
One last thing - I don't mean to imply that the Clinton administration would have prevented 9/11, but rather that there were people of equal standing as "qualified professionals" who considered Al Qaeda a bigger threat than the Bush administration did.
Clearly, there is such a thing as hindsight bias. The question is how much of it we have in a particular instance, when we're judging somebody else's judgment.
And this is very hard because we don't know what they knew. We don't know what they should have known.
Our data is utterly inadequate to make this determination. Consider that we have no evidence against the idea that the Bush administration or Mossad or whoever was complicit in the attacks. Our main reason to think they weren't is that we are nice people who don't want to think badly of them. If we suppose that they might have been complicit, or even that they might play CYA beyond what's strictly legal, then we have very little data about the topic that isn't suspect.
So I think maybe, as a general rule, the concept of hindsight bias is most useful when applied to people we believe are being honest when they tell us what happened, or when we consider our own mistakes and can honestly remember what we were thinking at the time, and then decide whether we should have seen then the things we see now.
Regarding 9/11 specifically: According to Al Franken's book "Lies and the Lying Liars Who Tell Them", Clinton thought of Osama bin Laden's group as a serious problem (they did cause the embassy bombings and such) and considered it important to deal with them. Bush's administration, acting on the grand principle of "do the opposite of whatever Clinton did", actually reduced anti-Al-Qaeda efforts prior to 9/11.
Considering that the Bush administration has been such a colossal failure that they couldn't have done worse if they were deliberately trying to hurt the United States, I don't give them any credit for knowing how to do their jobs right. It's gotten to the point where if they say the sun is shining, I'll start looking for an umbrella.
Related discussion at Volokh Conspiracy
Incidentally, one case where I do accept "intelligence failure" is the 1973 Yom Kippur War, because you had other agencies screaming *one* warning at the top of their lungs.
Eliezer,
1. I am not accusing the CIA, etc., of having made unforgivable mistakes. Not only am I not an expert in this area, I'm not even particularly well informed. What I'm saying is that it could itself be a bias to dismiss more informed people's criticisms by simply saying "hindsight bias." People, and institutions, do make mistakes, and sometimes these are indeed avoidable. I think it's useful to recognize the existence of hindsight bias, and to think about the possibility in this case too, but we also have to be careful to keep it from becoming an all-around excuse that can cover any and all failures.
2. I'm certainly not saying I would've done better. I'm saying that we shouldn't immediately dismiss knowledgeable people who say that the CIA etc. could've done better with better procedures (or if they'd better followed existing procedures). Similarly, if I make a serious statistical mistake which another statistician points out, you could agree with that second statistician without having to claim that you personally would've done better.
3. To advance the discussion, it might help to have agreement on what counts as an avoidable mistake. From one (circular-reasoning) perspective, no mistake is avoidable, or it would've been avoided! On that view, hindsight bias doesn't even come into play, since there is no such thing as an avoidable mistake in the first place. I assume you're not taking such an extreme position.
Anyway, my point is that hindsight bias will tend to make us overestimate the extent to which mistakes are avoidable, but that doesn't mean that no mistakes are avoidable. And to make the case about a particular example, such as the 9/11 attacks, I think you have to address the particularities of the case. I don't think it's appropriate to preemptively let the intelligence agencies off the hook for prior warnings of terrorist attacks, mafia activity, and nuclear material for sale, just because hindsight bias exists.
We really don't know about CIA successes. Anything we know about is a less-than-success precisely because we know about it. Was Chernobyl a CIA plot? It was likely one of the biggest single events that dismantled the USSR, but how would we ever know about a CIA role? A lot of Americans would be upset if we found out it was true.
Similarly, there's unclarity about CIA failures. There was Project Camelot, where the CIA paid for polling in a Latin American country and the citizens were outraged. A former Czech intelligence officer later claimed he'd been involved in that: the Czech spy agency printed up a collection of extremely insulting questionnaires and mailed them out with a Project Camelot return address, and they were proud to have gotten such results so cheaply. He was a defector when he wrote the book; was he telling the truth, or was he doing damage control for the CIA?
There's bias in our data, which would tend to bias our conclusions. Here's what scares me: if the CIA were actually doing ruthless and effective work with ineffective oversight, how would they want the public to view them? Compare that to how the public does view them. It matches scarily well. But then, if they're a bunch of bozos and the public knows it, that fits too.
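The evidential point in the last paragraph can be put in Bayesian terms: if the public image we observe is about as likely under "ruthless, effective, and image-managing" as under "bunch of bozos," then observing that image barely moves us either way. Here is a minimal sketch in Python; the prior and both likelihoods are numbers made up purely for illustration, not estimates of anything.

```python
def posterior_effective(prior: float, p_img_eff: float, p_img_boz: float) -> float:
    """Posterior P(effective | observed public image) by Bayes' rule."""
    return (p_img_eff * prior) / (
        p_img_eff * prior + p_img_boz * (1 - prior)
    )

# Identical likelihoods: the observed image is no evidence at all,
# so the posterior just equals the prior.
print(posterior_effective(0.5, 0.8, 0.8))  # 0.5

# Contrast case: if bozos rarely ended up with this public image,
# observing it would actually be evidence of an effective CIA.
print(posterior_effective(0.5, 0.8, 0.2))  # 0.8
```

The scary part is exactly the first case: both hypotheses predict the same public image, so the image we see cannot distinguish between them.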
I don't expect the CIA/NSA/FBI to do a good job, because they are government agencies that never go out of business or suffer other ill consequences when they do a poor job. Nobody knows what the NSA has actually done in the past (that's why it's called "No Such Agency"), but the CIA and FBI have a rather checkered history. William F. Buckley used to work for the CIA, and this is what he had to say about an assassination attempt on Indonesian dictator Sukarno: "The recent assassination attempt on Sukarno has all the earmarks of a CIA operation. Everyone in the room was killed - except Sukarno." And Will Rogers said of the FBI after their failed raid on Little Bohemia, "Someday when the FBI is shooting some innocent bystanders they may hit Dillinger by accident."
If you accuse the CIA/NSA/FBI of having made unforgivable mistakes on 9/11, what you are really saying is "I would have done better." Why should I believe this claim when the trained professionals did not, in fact, do better? Even if I discounted hindsight bias, my next hypothesis would be the Lake Wobegon effect.
On September 10th, 2001, you wouldn't even have been thinking that the CIA/NSA/FBI was the most important place for you to focus your attention. You would have been thinking about the Department of Education.
Are the studies useful? Let me add two scenarios to the Jury Assessing Negligence scenario. Suppose you have to evaluate someone's competence: you're a manager deciding whether to promote (or fire) a good (or bad) employee, or you're a voter deciding which politician to vote for.
In either case, you might look at a mistake of theirs that was obvious in retrospect and say, "no one competent would make such a gaffe; surely one of their peers would not have made the same mistake." Knowing about hindsight bias makes me a little more forgiving than I would be if I did not know of it. So, I find knowledge (and reminders) of hindsight bias *a little bit* useful. Am I over-estimating the strength of the hindsight bias when I do this, and over-adjusting? From Eliezer's article:
"A third experimental group was told the outcome and also explicitly instructed to avoid hindsight bias, which made no difference..."
So, I'm more likely to under-adjust than over-adjust. Let me make this empirical prediction: a follow-up study where the juries are made up of People who Consider Themselves Ultra-Serious About Reducing Bias (PCTUSRB) would produce similar under-adjustment results, and this would be robust even if they were informed beforehand that previous PCTUSRB juries had also under-adjusted. (But if the prospect of over-adjusting really bothers you, adjusting a teeny tiny little bit for important decisions is still a superior strategy to not adjusting at all.)
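To make the parenthetical claim above concrete, here is a minimal simulation sketch. The size of the hindsight inflation, the size of the small correction, and the distribution of true foresight probabilities are all assumptions invented for illustration, not numbers from the studies Eliezer cites.

```python
import random

random.seed(0)
HINDSIGHT_BIAS = 0.20    # assumed inflation of estimates once the outcome is known
SMALL_ADJUSTMENT = 0.05  # a "teeny tiny" correction, deliberately smaller than the bias

def mean_abs_error(adjustment: float, trials: int = 10_000) -> float:
    """Average error of hindsight-inflated estimates after subtracting a fixed correction."""
    total = 0.0
    for _ in range(trials):
        true_p = random.uniform(0.05, 0.5)           # foresight probability of the bad outcome
        biased = min(1.0, true_p + HINDSIGHT_BIAS)   # what the juror reports in hindsight
        corrected = max(0.0, biased - adjustment)    # the juror's deliberate adjustment
        total += abs(corrected - true_p)
    return total / trials

print(mean_abs_error(0.0))               # no adjustment: mean error ~ 0.20
print(mean_abs_error(SMALL_ADJUSTMENT))  # tiny adjustment: mean error ~ 0.15
```

Under these made-up assumptions, under-adjusting still strictly beats not adjusting at all; over-adjustment only becomes a worry once the correction exceeds the actual bias.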
Note: when I say this is a *little bit* useful, that is actually high praise. Since the vast majority of blog posts you or I read on the Internet turn out to be *not at all useful* in the Big Picture, any blog where even 10% of the posts are *a little bit useful* to me gets immediately added to my RSS feed. Next "open topics" session, I'll write up some unsolicited advice about what type of "Overcoming Bias" post I would consider "a lot" useful.
Informal logic was the study of "fallacies." The thinking was that if we could identify patterns of incorrect reasoning, then by knowing about them we would avoid bad reasoning. The project was a spectacular failure - we can multiply fallacies with ease, and identify and categorize them in any number of ways, but to what end?
We don't study the mistakes people make in simple arithmetic and categorize these mistakes as types of bias.
Why should we do so when the object of study is reason?
And on the hindsight of sneak attacks, a good place to start thinking about what went wrong is Tom Schelling's foreword to Roberta Wohlstetter's Pearl Harbor: Warning and Decision.