Avoiding Truth

In this essay I describe two strategies for avoiding truth in forming political opinions.

The great mass of people form their political beliefs with little regard for facts or logic. However, the elites also have a strategy for avoiding truth. Elites form their political beliefs dogmatically, using their cleverness to organize facts to fit preconceived prejudices. The masses’ strategy for avoiding truth is to make a low investment in understanding; the elites’ strategy is to make a large investment in selectively choosing which facts and arguments to emphasize or ignore.

I am particularly interested in the high-investment strategy, also known as confirmation bias. Is it true that for the most part we organize data to fit our priors? If so, why? How can we tell whether we are confronting data honestly or with confirmation bias?

  • David Murphy

    Can I suggest that part of the issue is the lack of clarity as to the order used to measure the results of political decisions? (I’d say ‘total order’ rather than ‘order’ except that there seem to be situations of incomparable goodness, at least to some observers.)

    We cannot really discuss confirmation bias, or indeed any other kind of bias, without saying what that bias favours. At least if I tell you ‘this is what I mean by a good outcome in transport/education/foreign policy/whatever’ then you can analyse my policy proposals with respect to that notion of goodness — while of course potentially disagreeing with it. But if I offer you policy proposals in the abstract, we may well end up confusing disagreements about methods with those about the comparison of outcomes.

  • Information Processing

A thought-provoking short essay on epistemology in politics has been posted on Tech Central Station by Arnold Kling.

Arnold, I don’t know whether any of the extensive literature on confirmation bias addresses the question of whether the bias applies to most of our beliefs. Either way, I think we know some of the reasons why we are vulnerable. Dennett makes an interesting point in his ‘Breaking the Spell’ when he says that perhaps acquiring a faith is like falling in love; indeed, perhaps it actually is falling in love. People don’t just have moral beliefs the way they have perceptual beliefs. Moral beliefs are often treasured possessions, and for that reason to be nurtured and protected from harm.

    You might say that these are inappropriate attitudes to beliefs, but it seems to be a fact of our psychology that we have this kind of attitude to certain beliefs. This might be because we are only approximations to rational agents, and since for a rational agent belief necessarily has certain kinds of relations to action, our approximation to those correct relations also allows in other inappropriate relations to action.

    Given the nature of belief, what is nurturing and protecting a belief from harm? Presumably it is nurtured by being surrounded with supportive facts and protected from harm by rebutting claims that tend to show it to be false. Hence confirmation bias and selective scepticism. So I suppose one way of telling whether we are confronting data honestly is to ask ourselves whether it is data that bears on a treasured moral belief. If it is, we know we are likely to encompass it in a biased way.

Arnold: I think it is very hard for us to avoid confirmation bias. As a society, then, we create a number of institutions to counteract or nullify it: the First Amendment, the adversarial legal system with trial by jury, double-blind testing, the economics publication process (ideas → conversations with colleagues → brown-bag seminars → working papers → conference and other seminars → peer review → publication), political debates, etc. Perhaps one could argue that societies that progress do so by creating such rules and institutions.

An additional thought. You seem to be arguing that elites follow what Philip Tetlock has characterized as hedgehog strategies. But are there not elites who pursue fox strategies? I suspect that Tetlock would argue that we are born foxes or hedgehogs; as long as there are enough foxes to tip the balance, we may be OK.

  • Adrian, it is not obvious that all our institutions reduce confirmation bias; some of them may increase it. It is important to evaluate our institutions carefully on this issue, and to seek improvements.