Which biases matter most? Let’s prioritise the worst!

As part of our self-improvement program at the Centre for Effective Altruism I decided to present a lecture on cognitive biases and how to overcome them. Trying to put this together reminded me of a problem I have long had with the self-improvement literature on biases, along with those for health, safety and nutrition: they don’t prioritise. Kahneman’s book Thinking Fast and Slow represents an excellent summary of the literature on biases and heuristics, but risks overwhelming or demoralising the reader with the number of errors they need to avoid. Other sources are even less helpful at highlighting which biases are most destructive.

You might say ‘avoid them all’, but it turns out that clever and effort-consuming strategies are required to overcome most biases; mere awareness is rarely enough. As a result, it may not be worth the effort in many cases. Even if it were usually worth it, most folks will only ever put a limited effort into reducing their cognitive biases, so we should guide their attention towards the strategies which offer the biggest ‘benefit to cost ratio’ first.

There is a bias underlying this scattershot approach to overcoming bias: we are inclined to allocate equal time or value to each category or instance of something we are presented with, even when those categories are arbitrary, or at least a poor guide to importance. Expressions of this bias include:

  • Allocating equal or similar migrant places or development aid funding to different countries out of ‘fairness’, even if they vary in size, need, etc.
  • Making a decision by weighing the number, or length, of ‘pro’ and ‘con’ arguments on each side.
  • Offering similar attention or research funding to different categories of cancer (breast, pancreas, lung), even though some kill ten times as many people as others.
  • Providing equal funding for a given project to every geographic district, even if the boundaries of those districts were not drawn with reference to need for the project.

Fortunately, I don’t think we need to tackle most of the scores of cognitive biases out there to significantly improve our rationality. My guess is that some kind of Pareto or ‘80-20’ principle applies, in which case a minority of our biases are doing most of the damage. We just have to work out which ones! Unfortunately, as far as I can tell this hasn’t yet been attempted by anyone, even the Centre for Applied Rationality, and there are a lot to sift through. So, I’d appreciate your help to produce a shortlist. You can have input through the comments below, or by voting on this Google form. I’ll gradually cut out options which don’t attract any votes.

Ultimately, we are seeking biases that have a large and harmful impact on our decisions. Some correlated characteristics I would suggest, which could be combined into a rough score (see the sketch after this list), are that a bias:

  • potentially influences your thinking on many things
  • is likely to change your beliefs a great deal
  • doesn’t have many redeeming ‘heuristic’ features
  • disproportionately influences major choices
  • has a large effect substantiated by many studies, and so is less likely the result of publication bias.
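
As a rough illustration only, these characteristics could be folded into a crude priority score by weighting each and summing. In the sketch below the weights, the 1-5 ratings, and the two example biases are all placeholders rather than actual estimates:

    # Illustrative weights for the characteristics above (assumed, not estimated).
    weights = {
        "breadth": 0.30,        # influences thinking on many things
        "belief_shift": 0.25,   # likely to change beliefs a great deal
        "few_upsides": 0.15,    # lacks redeeming 'heuristic' features
        "major_choices": 0.20,  # disproportionately influences major choices
        "evidence": 0.10,       # large, well-replicated effect
    }

    # Placeholder 1-5 ratings for two example biases (not real estimates).
    candidates = {
        "confirmation bias": {"breadth": 5, "belief_shift": 5, "few_upsides": 4,
                              "major_choices": 4, "evidence": 5},
        "sunk cost fallacy": {"breadth": 3, "belief_shift": 2, "few_upsides": 4,
                              "major_choices": 5, "evidence": 4},
    }

    def priority(ratings):
        """Weighted sum of a bias's ratings across the characteristics."""
        return sum(weights[c] * ratings[c] for c in weights)

    # Rank the candidate biases from highest to lowest priority score.
    for name in sorted(candidates, key=lambda n: priority(candidates[n]), reverse=True):
        print(f"{name}: {priority(candidates[name]):.2f}")

Even a crude score like this forces the trade-offs between breadth, effect size and evidence quality to be made explicit, which is half the point of the exercise.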

We face the problem that more expansive categories can make a bias look like it has a larger impact (e.g. ‘cancer’ would look really bad but none of ‘pancreatic cancer’, ‘breast cancer’, etc would stand out individually). For our purposes it would be ideal to group and rate categories of biases after breaking them down by ‘which intervention would neutralise this.’ I don’t know of such a categorisation and don’t have time to make one now. I don’t expect that this problem will be too severe for a first cut.

  • Christian Kleineidam

    This post is quite ironic.
    In Effective Altruism you don’t focus on identifying the worst problems in the world. You focus on identifying the problems where you get the most bang for your buck.

    You don’t ask: “Is AIDS, malaria or nuclear weapon proliferation the biggest problem?”

    Similarly, you shouldn’t focus on the biases that do the most damage, but on those interventions where the least amount of invested resources brings the highest return.
    The problem is that we really don’t know which methods are the most effective at reducing the biases that people have.

    • robertwiblin

      I appreciate this, but working out the scale of the damage is part of the process. Furthermore, I think the damage done by biases is much more variable than the cost of reducing them, so we should put more attention into that part of the ‘benefit/cost’ equation.

      “The problem is that we really don’t know which methods are the most effective at reducing the biases that people have.”

      Kahneman has given me the impression that it is difficult but there are known methods.

    • robertwiblin

      “You don’t ask: ‘Is AIDS, malaria or nuclear weapon proliferation the biggest problem?’”

      Actually I think that is the key question if you are using approaches to those problems with a high fixed cost but low marginal cost (e.g. medical research, political advocacy).

      • Christian Kleineidam

        I don’t think that giving a lecture for self-improvement purposes is a high fixed cost, low marginal cost activity.

  • Nick Beckstead

    I think Aaron Swartz’s take on this question is worth considering. http://www.aaronsw.com/weblog/optimalbias

  • Andy McKenzie

    I asked a related question this past summer: http://lesswrong.com/lw/csf/which_cognitive_biases_should_we_trust_in/

    I actually did talk to a few people from CFAR about it — they suggested basically the same ideas that I proposed in that post. 

    I’m slightly less interested in impact than I am in veracity because I estimate that it’s difficult to know ahead of time which biases would actually be detrimental. Though, I’m certainly open to arguments on this front. 

    I went through your list and voted on the ones that I thought were useful / reasonably well validated last summer, but note that some of mine aren’t on your list, including duration neglect and empathy gap. 

    Unless you have a reason against it, it would probably be easier if your list were in alphabetical order. 

    If I could have everyone in the world do 10 spaced repetition flashcards over the next few years for one bias, it would be outgroup homogeneity bias. 

  • Jess Riedel

    I think you’ll need to draw a distinction between the effects of biases on empirical beliefs versus on normative beliefs.  The former is a tractable question.  It seems reasonably straightforward to compile evidence on which biases have largest impact.  The latter quickly becomes entangled in issues surrounding ethical uncertainty.  But if you already believe that potential future people should be valued like existing people, then the question basically boils down to “what biases cause us to discount future people?” since you can smash all other concerns with the x-risk hammer.

  • Daublin

    I suspect most of our biases are quite good for most of the decisions we have to make.

    To the extent that’s not true, I’d suspect *socially motivated* biases are the worst. Helpfully, those are also things where a blog on rationality could really help.

    For example, Katja posted about how non-conformity is punished whenever you follow the evidence but turn out to be wrong after all: people assume you were being evil, rather than following the evidence.

    • Romeo Stevens

      Good along which metric?  Happiness?  Then we might as well give up on most things since it isn’t that variable.

  • Guest

    The question is not bias but a lack of knowledge when applying theory/strategy to deal with questions. Judgement and decision making are influenced by bias, but mostly it is a lack of ideas about how to deal with the problem that is the root cause, rather than bias.

  • stevesailer

    The dominant bias of our age is “Who? Whom?” thinking.

    • Peter David Jones

      Feel free to expand on that.

  • Anon

    This one is pretty useful to know about from a mental health & happiness perspective: http://en.wikipedia.org/wiki/Negativity_bias

  • Alexander Kruel

    I would claim that the biases which matter most are possibly new kinds of biases introduced by approximations of uncomputable methods of rationality.

    Consider what would happen if we built an artificial general intelligence according to our current understanding of rationality. It wouldn’t work. Our current methods seem to be biased in new and unexpected ways. Pascal’s mugging and the Lifespan Dilemma are just two examples of how an agent built according to our current understanding of rationality could fail. So how could we expect to do better when using those methods ourselves? We shouldn’t. We should expect to run into many more problems, as we are not able to use those methods consistently and correctly.

    Ideal AIs might be perfectly rational, but computationally limited artificial systems will have completely new sets of biases. As a simple example, take my digicam, which can detect faces. It sometimes recognizes faces where there are none, just like humans do, yet on very different occasions. Or take the answers of IBM Watson: some were wrong, but in completely new ways. That’s a real danger in my opinion.

    Again, our current theories are not enough to build an artificial general intelligence that would be reliable in helping us to achieve our values, even if those values could be thoroughly defined or were computable in principle.

    Which raises the following question: if we are unable to build – and wouldn’t trust even if we could – a superhuman agent equipped with our current grasp of rationality to be reliable in extrapolating our volition, how can we trust ourselves to arrive at correct answers given what we know?

  • Christopher

    Prioritising biases raises the question of whether such a ranking reflects the cultural bias of the person deciding on the priority. I presume you aren’t suggesting cognitive biases exist inside a culturally free zone (American exceptionalism is an interesting bias itself). An American would likely come up with a different list from someone in China, Tibet, England, Thailand or Norway.

    • Sebastian h

      I suspect our biggest biases are universal: i.e. we tend to trust and believe what better-looking people say more than what ugly people say. There is a cultural component to pretty/ugly, but the consequences of the judgment are widely shared.

  • Arch1

    I suspect a very important bias to attack is the one against ideas conflicting with one’s own.  This has a bunch of loosely related manifestations (such as the belief that faith is a virtue).  It is kind of a keystone meta-bias which makes it difficult to go after the others.

  • Albert Ling

    The simple ordering of biases is biased, since people are gonna click first on the ones at the top of the list, and then some will probably get tired and stop clicking the checkboxes. You should randomize the order of biases on that questionnaire so that every time you refresh the page they are in a different order!

    • robertwiblin

      I thought of this, Albert, but Google Forms doesn’t yet allow randomisation that I can see. My alternative is to identify the trend after the fact and push up the ones at the bottom to make the line flatter.
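
      For concreteness, here is a minimal sketch of that correction (the vote counts are made up for illustration): fit a linear trend of votes against list position, then subtract the trend so options near the bottom aren’t penalised.

        # Hypothetical vote counts, in the order the options appeared on the form.
        votes_by_position = [90, 84, 77, 70, 66, 60, 55, 52, 48, 45]

        n = len(votes_by_position)
        positions = list(range(n))

        # Ordinary least squares slope of votes on list position.
        mean_p = sum(positions) / n
        mean_v = sum(votes_by_position) / n
        slope = (sum((p - mean_p) * (v - mean_v)
                     for p, v in zip(positions, votes_by_position))
                 / sum((p - mean_p) ** 2 for p in positions))

        # Flatten the line: shift each count to what it would be at the mean position.
        adjusted = [v - slope * (p - mean_p)
                    for p, v in zip(positions, votes_by_position)]
        print(adjusted)

      Of course this only corrects the average positional trend; randomising the order for each respondent would still be the cleaner fix.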

  • Pablo

    A “bias” that is not included in the Wikipedia list which you relied upon to create the Google form but which has had a more damaging effect in my life than all but a few of those listed biases is the what-the-hell effect.  Note that, to some degree, this bias has continued to cause damage even after I became aware of it, confirming your point that “to overcome most biases, mere awareness is rarely enough.”
    Other than that, here’s perhaps a valuable heuristic for identifying the biases one should try to get rid of first.  There are a few biases that either cause other biases or impede debiasing.  Since these biases affect many other biases, they seem to be good candidates for being particularly damaging.  So the heuristic is: focus on these “meta” biases.  Here are three such biases:

    Naive realism

    Bias blind spot

    Sophistication effect

    • rw

       The “What the Hell” effect sounds a lot like the Broken Window theory on a personal level. Buildings with graffiti get graffitied more.

      Or maybe Irrational escalation?

      The entirely different broken window fallacy (that destruction and waste can have a positive economic effect) would also have been nice to address, but that’s probably a subset of some other bias. The focusing effect, perhaps? I’m not sure.

  • Nancy Lebovitz

    The worst bias might not be the most important one to work on if there are other biases which are easier and whose correction opens the way to working on more difficult but more important biases.

    • robertwiblin

      Agreed. We should factor in this benefit.

  • Jake Witmer

    The bias in favor of unlimited government is the most damaging, stupid, and easy-to-correct-with-education bias of all.  This chart may help:
    http://hawaii.edu/powerkills/VIS.TEARS.ALL.AROUND.HTM

    • Peter David Jones

      What bias? There’s a certain amount of unlimited, tyrannical government about, but that doesn’t mean it’s there because it’s popular and people vote for it.

  • Louie Helm

    I attempted to prioritize the biases into a rough order of importance here: http://rationalpoker.com/2011/07/30/23-cognitive-mistakes-that-make-people-play-bad-poker/

    PS – Don’t be distracted by the fact that I published this on my poker blog. It’s fully general advice.


  • quiet

    Hindsight Bias appears on the list twice. 

    • quiet

      Also, Bandwagon Effect includes ‘herd behavior’ in its description, while Herd Instinct is listed as a separate option.

  • srdiamond

    “Kahneman’s book Thinking Fast and Slow represents an excellent summary of the literature on biases and heuristics, but risks overwhelming or demoralising the reader with the number of errors they need to avoid.”

    I don’t know that Kahneman “risks” anything: he didn’t intend to write a self-help book. But Kahneman does reveal which bias he thinks most harmful: the failure to realize that things are important to you only while you’re thinking of them.

    My humble impression is that the endowment effect is a particularly destructive bias, having repercussions for far-flung subjects from death to clutter. (My take on the endowment effect in relation to the writing process: http://tinyurl.com/9sw54v8 )

    But from a remedial perspective, you would need to address the numerous consequences separately (unless you can come up with something very new). This goes to Christian Kleineidam’s important point.

  • David

    I would think the ranking depends a lot on context, learning about the gotchas in various things you’re trying to pursue. The problem with a more general analysis is that evolution has already done that! That’s why we have the biases that we do. Generally speaking, not being eaten by a predator is more important than getting a math problem right. If you’re a mathematician living in 2013 though, maybe not so much :) Same goes for our bias towards understanding things socially, though I think people underestimate how important that is to our lives.

  • Trevor Blake

    Most damaging to me, most damaging to others, or most damaging to everyone overall – most damaging to whom? The answer changes when the question is clarified. Confirmation bias in a person versus in a group carries different levels of likely damage.

  • Tenoke

    You should prune the identical and near-identical answers such as the fundamental attribution error and the actor-observer bias.

  • Jim Stone

    My number one tip would be to learn to recognize when you WANT, for some reason (financial, convenience, reputation, status, group-pride, etc.), one hypothesis to be true. 

    When that’s the case, take extra care to correct for bias by 1) not prematurely truncating your search for alternative hypotheses, 2) looking at the issue from a variety of perspectives, 3) setting up experiments that are well-insulated against subjective judgment, and 4) getting outside opinions, etc.

  • Ryan Wise

    I’d say ‘stereotyping’, but it seems to be one bias people are already working to correct. There is something similar, though, which is less well addressed, where an outgroup of people is viewed as homogeneous. View any group closely enough and there is internecine strife. But we rarely see that from a distance. Americans talk about Chinese culture, for example, but not Nanjing culture or Sichuan culture.

  • Stephen Rice

    Surely the only biases that matter are the ones affecting your decision at that point, and if it were easy to identify the biases you were affected by, life would be a lot easier than it is.
    I think it’s dangerous to see each item on the Wikipedia list of cognitive biases as a separate, individually solvable problem. “Today I will work on my availability heuristic, tomorrow I’ll do confirmation bias.” It would be lovely if it worked like that, but it’s a more insidious problem.

  • Quantitrader

    Can you tell me where you got your image? I would like to buy it and use it on my website.
