Author Archives: Arnold Kling

Bias Toward Certainty

Before the Iraq invasion, President Bush did not say, "I think that there is a 60 percent chance that Saddam has an active WMD program."

Al Gore does not say, "I think there is a 2 percent chance that if we do nothing there will be an environmental catastrophe that will end life as we know it."

Instead, they speak in the language of certainty.  I assume that as political leaders they know a lot better than I do how to speak to the general population.  So I infer that, relative to me, the public has a bias toward certainty.

Another piece of evidence is an anecdote I cite about my (former) doctor.  Several years ago, he said I needed a test.  I did some research and some Bayes’ Theorem calculations, and I faxed him a note saying that I did not think the test was worth it.  He became irate.
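For concreteness, here is a minimal sketch of the kind of calculation involved.  The prevalence, sensitivity, and false-positive numbers below are made up for illustration; they are not the figures from my own case.

# Bayes' Theorem for a diagnostic test (illustrative, hypothetical numbers only).
# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)

prior = 0.01            # assumed prevalence: 1% of patients like this have the condition
sensitivity = 0.90      # assumed P(test positive | disease)
false_positive = 0.10   # assumed P(test positive | no disease)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"P(disease | positive test) = {posterior:.1%}")  # about 8.3% with these numbers

With numbers in that range, even a positive result leaves the diagnosis far from certain, which is the sort of result that can make a test look not worth its cost and risk.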

I think that one reason that our health care system works the way it does is that it does not occur to anyone to say, "OK, I can live with that level of uncertainty."  Instead, we must have the MRI, or the CT scan, or whatever.  Even, as in my case, when the patient is willing to live with uncertainty, the doctor has a problem with it.

Another way that bias toward certainty shows up is in the way we handle disagreement.  People don’t say that there were differences within the intelligence community about the probability distribution for Saddam having WMD.  They say that Bush manipulated the intelligence.  And they are right, in the sense that he tried to make it sound certain. 

My point is that what comes naturally to a lot of people on this blog–thinking in Bayesian terms–is in fact very unnatural in general.  It seems as though outside of the realm of sports betting, people don’t like to think in terms of chance.  Maybe there are realms where even those of us who are more Bayesian than most are victims of bias toward certainty.


Crackpot People and Crackpot Ideas

I listened to Brian Doherty talk about his book on the history of libertarianism.  One point he makes is that in the forties and the fifties, libertarians were mostly crackpots.  He suggests that this is likely to be the case for any dissident idea.

This suggests that different ideas are going to occupy different niches.  For example, suppose that there is a large niche for anti-capitalist ideas.  The actual ideas occupying that niche may be different in different time periods, but something always fills that niche.

There may be different niches for pessimistic ideas and optimistic ideas.

When there is a popular idea and a crackpot idea, which is more likely to be right?  Rather than framing this in terms of a probability distribution, it may be useful to think in terms of an ecological model.  What sort of false ideas are likely to occupy particular niches, including the niche of popular opinion?  What sort of false ideas are likely to survive by finding crackpots to host them?

Belief in anthropogenic global warming is becoming popular.  Skepticism is becoming crackpot.  What is the probability that the global warming partisans will turn out to be the crackpots?  How does that probability depend on the niche that the global warming idea occupies?

I know that the ecological metaphor has been used in this context, with the term "meme," but I admit I have never read the literature, so I don’t know if the connection between bias and survival of memes has been addressed there.


Rationalization

Steven Pinker writes,

Take the famous cognitive-dissonance experiments. When an experimenter got people to endure electric shocks in a sham experiment on learning, those who were given a good rationale ("It will help scientists understand learning") rated the shocks as more painful than the ones given a feeble rationale ("We’re curious.") Presumably, it’s because the second group would have felt foolish to have suffered for no good reason. Yet when these people were asked why they agreed to be shocked, they offered bogus reasons of their own in all sincerity, like "I used to mess around with radios and got used to electric shocks."

…The brain’s spin doctoring is displayed even more dramatically in neurological conditions in which the healthy parts of the brain explain away the foibles of the damaged parts (which are invisible to the self because they are part of the self). A patient who fails to experience a visceral click of recognition when he sees his wife but who acknowledges that she looks and acts just like her deduces that she is an amazingly well-trained impostor. A patient who believes he is at home and is shown the hospital elevator says without missing a beat, "You wouldn’t believe what it cost us to have that installed."

I think readers of this blog will enjoy Pinker’s entire essay.


Socially Influenced Beliefs

The discussion of the atheistic tendencies of professors leads me to posit the following.

1.  Suppose that we say that beliefs are primarily influenced by social considerations.  You believe X because you want to earn the friendship/respect of people around you.  So, if you are around God-fearers, your social instinct is to believe in God.  If you are around atheists, your social instinct is to be atheist.

2.  Suppose that you are inclined to believe X.  If your reference group does not believe X, then you will pay attention to evidence against X and reconsider your position.  However, if the group also believes X, you will want to search for evidence in favor of X and to be skeptical about evidence against X.  That is, we try pretty hard to align our thinking to conform to that of our reference group.

3.  Even our belief in mathematical and scientific propositions has a social component to it.   

4.  Academic intellectuals learn something of how to question beliefs in a rational way.  This makes them a bit less inclined to fall for popular superstitions.

5.  However, even academic intellectuals are leery of questioning beliefs within their own reference group.  So it is possible for a group of academics to get stuck in an equilibrium in which they believe a dubious proposition.  One hopes that eventually someone comes along and questions the conventional wisdom in such a way as to disturb that equilibrium.

6.  The atheism of academics looks like an equilibrium.  I think it is a sound one.  However, other equilibrium beliefs among academics strike me as more problematic.  That is, a huge majority of academics may hold some political views, and I do not share those views.


Avoiding Truth

In this essay I describe two strategies for avoiding truth in forming political opinions.

The great mass of people form their political beliefs with little regard for facts or logic. However, the elites also have a strategy for avoiding truth. Elites form their political beliefs dogmatically, using their cleverness to organize facts to fit preconceived prejudices. The masses’ strategy for avoiding truth is to make a low investment in understanding; the elites’ strategy is to make a large investment in selectively choosing which facts and arguments to emphasize or ignore.

I am particularly interested in the high-investment strategy, also known as confirmation bias.  Is it true that for the most part we organize data to fit our priors?  If so, why?  How can we tell whether we are confronting data honestly or through the lens of confirmation bias?
