Before the Iraq invasion, President Bush did not say, "I think that there is a 60 percent chance that Saddam has an active WMD program." Al Gore does not say, "I think there is a 2 percent chance that if we do nothing there will be an environmental catastrophe that will end life as we know it."
I wonder if there is not a sociobiological explanation for the tendency of speakers to overstate their certainty. One could argue that in a small hunter-gatherer tribe, in a crisis, a decisive response offered the best chance of survival. Indeed, almost any decisive action, whether fighting or fleeing, is preferable to indecision. Groups that followed decisive leaders had a greater chance of survival. Leaders either did not see alternatives or believed that verbalizing alternatives would be read as indecisiveness. As a result, we are more "comfortable" with a leader (politician or doctor) who talks in terms of certainty, even when we understand intellectually that there is no certainty.
"Question Authority" - I haven't found the original source.
"Don't pamper yourself, question your own beliefs as relentlessly as you question authority." - William Swift
I have found that the best way to keep from falling for any sort of false information is to be skeptical, and to be more skeptical when I would prefer the information to be true. Always be most skeptical of what you most want to be true.
With regard to medicine, the evidence-based medicine movement is an attempt to improve physician decision making. The term emerged in the early 1990s, and Googling "evidence-based medicine" now yields about 1.2 million hits. With regard to calculus, statistics is probably more useful as a pre-med requirement, though on the other hand one typical undergrad biostats class likely doesn't make anyone a Bayesian. To improve clinical practice, specific methods are being developed to summarize the best evidence for physicians. Google "Cochrane Collaboration" to find the mother lode of these efforts.
"Respectful Insolence" is a blog by an academic surgeon who covers this topic. Time magazine ran an article on EBM, "Are Doctors Just Playing Hunches?" by Christine Gorman (Feb. 15, 2007).
And yes, they often are. Medicine involves tinkering with very complex biological systems that have many idiosyncrasies.
Barkley Rosser: I put the odds at 20% that the person responsible for the anthrax has been identified, caught, and released.
FWIW, in one sector of climate science it has recently become very fashionable to adopt a position of maximal uncertainty. This has the politically convenient effect of assigning high probability to a future climate-change-induced disaster and thus any reasonably risk-averse cost-benefit analysis supports the taking of strong action.
It's a short step from the vacuous but emotive "we can't rule it out" to assigning a 5-10% probability...
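The arithmetic behind that short step is worth making explicit. A minimal sketch, with entirely hypothetical numbers (not estimates from any actual climate analysis), of how a small probability attached to a catastrophic outcome can dominate an expected-cost comparison:

```python
# Hypothetical expected-cost comparison. All figures are illustrative
# placeholders, chosen only to show the structure of the argument.

p_catastrophe = 0.05        # probability assigned to the disaster scenario
cost_catastrophe = 100e12   # assumed cost of the disaster (dollars)
cost_mitigation = 1e12      # assumed up-front cost of "strong action"

# Expected loss if we do nothing vs. if mitigation averts the disaster.
expected_loss_do_nothing = p_catastrophe * cost_catastrophe
expected_loss_mitigate = cost_mitigation

print(expected_loss_do_nothing)  # 5e12: five times the mitigation cost
print(expected_loss_mitigate)    # 1e12
```

Move the assumed probability from 5% down toward "we can't rule it out" (effectively zero) and the comparison flips, which is why the choice of tail probability does all the work in such an analysis.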
TGGP and Matthew are right - a constant and known degree of overconfidence should not interfere with information transmission - the problem is variation in overconfidence.
shouldn't people then assume that people claiming certainty are exaggerating, and discount those statements more than those of people who claim a high but not 100% certainty?
I do this, both when I feel their opinion is probably true and when I judge it probably incorrect.
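One illustrative way to formalize this kind of discounting (a toy model of my own, not anything proposed in the thread) is to shrink the speaker's stated probability toward 0.5 in log-odds space, applying a harsher shrinkage factor to speakers who claim outright certainty:

```python
import math

def discount(stated_p):
    """Shrink a stated probability toward 0.5 in log-odds space.

    Toy calibration numbers: speakers claiming outright certainty are
    assumed to be the most overconfident, so they are shrunk hardest.
    """
    # Clip claims of certainty, which would otherwise be undiscountable
    # (log-odds of 0 or 1 are infinite).
    p = min(max(stated_p, 0.01), 0.99)
    calibration = 0.25 if stated_p >= 0.99 or stated_p <= 0.01 else 0.5
    log_odds = math.log(p / (1 - p))
    return 1 / (1 + math.exp(-calibration * log_odds))

print(discount(1.0))  # ~0.76: a claim of certainty is discounted hardest
print(discount(0.9))  # ~0.75: a merely high claim loses less ground
print(discount(0.5))  # 0.5: an admission of ignorance is left alone
```

Under these (made-up) calibration factors, "I am certain" ends up barely more persuasive than "I'm 90% sure," which is exactly the behavior the question describes.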
True believers and true disbelievers are equally impermeable to new information.
You are basically correct about the WMD issue. However, I cannot resist pointing out that the WMD over which there was actual disagreement were chemical weapons, with the possibility of a bioweapons lab. It was fully agreed, however, that Saddam had no delivery capability for chemical weapons. Therefore, although there was perhaps a slight case over bioweapons (and whoever sent that anthrax back in 2001 still has neither been identified nor caught, not that it amounted to much), there was no reason whatsoever to regard whatever chemical weapons Saddam might have had as an actual threat to US national security. Even if they had existed, there was no justification at all for invading Iraq to go after them.
I had a similar experience with a doctor who was pushing beta blockers. I have high cholesterol and he, without asking, wrote me a prescription for Lexapro or something. I asked why, and he told me it would reduce my chance of a heart attack by 40%. I asked what my chance of a heart attack currently stood at. 50%? 70%? He didn't know, and wouldn't venture a guess. He had simply arrived at the certainty that a reduction in probability, without knowing the probability itself, meant the decision was a no-brainer.
He was also irate when I declined to take the drug.
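The gap in the doctor's reasoning, relative versus absolute risk reduction, takes only a couple of lines of arithmetic to expose. The baseline risks below are hypothetical, chosen purely to illustrate the point:

```python
# A "40% relative risk reduction" means very different things in absolute
# terms depending on the baseline risk, which the doctor could not supply.
relative_reduction = 0.40

for baseline_risk in (0.50, 0.05, 0.005):  # hypothetical baseline risks
    absolute_reduction = baseline_risk * relative_reduction
    print(f"baseline {baseline_risk:.1%} -> "
          f"absolute reduction {absolute_reduction:.2%}")
# baseline 50.0% -> absolute reduction 20.00%
# baseline 5.0%  -> absolute reduction 2.00%
# baseline 0.5%  -> absolute reduction 0.20%
```

At a 50% baseline the drug averts one heart attack per five patients treated; at a 0.5% baseline, one per five hundred. Without the baseline, "40% reduction" is not a no-brainer at all.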
Rather than saying people don't think like Bayesians, isn't it more likely that if you were trying to pull somebody else's Bayesian analysis closer to your own, you should project a level of certainty that overshoots, since they will discount your opinion somewhat? Perhaps this game is one of the reasons people discount other people's opinions in the first place. Also, both Bush and the doctor were selling something, not necessarily trying to convince you that they were correct.
Although I am digressing slightly from your main point, I think it bizarre that (AFAIK) pre-med programs require calculus rather than basic stats. A Tufte- and Tukey-style exploratory data analysis course would help encourage statistical thinking, and an intro stats course could further develop the ability to think in terms of odds, probabilities, etc.
However, although that would help the doctor in dealing with you, doctors in particular need to project certainty. There is research evidence suggesting that a doctor's confidence in the efficacy of the treatment he is advocating improves its efficacy: apparently, if the doctor doesn't believe strongly in his treatment, we won't respond as fully. Curious, no? And consistent with your hunch that we generally want certainty.
Other evidence consistent with a desire for certainty is the demand for forecasts by experts despite the accumulating evidence that experts cannot out-forecast the well-informed lay observer. (On this, see Philip Tetlock's recent book: Expert Political Judgment: How Good Is It? How Can We Know?)