The Smart Are MORE Biased To Think They Are LESS Biased

I seem to know a lot of smart contrarians who think that standard human biases justify their contrarian position. They argue:

Yes, my view on this subject is in contrast to a consensus among academic and other credentialed experts on this subject. But the fact is that human thoughts are subject to many standard biases, and those biases have misled most others to this mistaken consensus position. For example, biases A, B, and C would tend to make people think what they do on this subject, even if that view were not true. I, in contrast, have avoided these biases, both because I know about them (see, I can name them), and because I am so much smarter than these other folks. (Have you seen my test scores?) And this is why I can justifiably disagree with an expert consensus on this subject.

The problem is that not only are smart folks no less prone to many biases; if anything, they succumb more easily to the bias of thinking that they are less biased than others:

The so-called bias blind spot arises when people report that thinking biases are more prevalent in others than in themselves. … We found that none of these bias blind spots were attenuated by measures of cognitive sophistication such as cognitive ability or thinking dispositions related to bias. If anything, a larger bias blind spot was associated with higher cognitive ability. Additional analyses indicated that being free of the bias blind spot does not help a person avoid the actual classic cognitive biases. …

Most cognitive biases in the heuristics and biases literature are negatively correlated with cognitive sophistication, whether the latter is indexed by development, by cognitive ability, or by thinking dispositions. This was not true for any of the bias blind spots studied here. As opposed to the social emphasis in past work on the bias blind spot, we examined bias blind spots connected to some of the most well-known effects from the heuristics and biases literature: outcome bias, base-rate neglect, framing bias, conjunction fallacy, anchoring bias, and myside bias. We found that none of these bias blind spot effects displayed a negative correlation with measures of cognitive ability (SAT total, CRT) or with measures of thinking dispositions (need for cognition, actively open-minded thinking). If anything, the correlations went in the other direction.

We explored the obvious explanation for the indications of a positive correlation between cognitive ability and the magnitude of the bias blind spot in our data. That explanation is the not unreasonable one that more cognitively sophisticated people might indeed show lower cognitive biases—so that it would be correct for them to view themselves as less biased than their peers. However, … we found very little evidence that these classic biases were attenuated by cognitive ability. More intelligent people were not actually less biased—a finding that would have justified their displaying a larger bias blind spot. …

Thus, the bias blind spot joins a small group of other effects such as myside bias and noncausal base-rate neglect in being unmitigated by increases in intelligence. That cognitive sophistication does not mitigate the bias blind spot is consistent with the idea that the mechanisms that cause the bias are quite fundamental and not easily controlled strategically— that they reflect what is termed Type 1 processing in dual-process theory. (more)

Added 12 June: The New Yorker talks about this paper:

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” … All four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.”

  • John Maxwell IV

    Have there been any attempts to measure the presence of cognitive bias in researchers who study cognitive bias?

    I would be pretty surprised if there’s a large population of thinkers who think they pay attention to base rates but in fact neglect base rates.

    • Bobby

      Yes, the part of the study quoted said “More intelligent people were not actually less biased”. It did not say that “people aware of biases who work to overcome them are not less biased”.  So the study didn’t contradict your introductory story.

  • colin

    So who is less biased?

    • Humanpl

      ALGORITHM in its area of application

  • Ari Timonen

    I agree, but defending expert consensus is easy in fields like climate science, where the experts are readily found and accepted. Such a consensus does not always exist, though. When you have questions of mixed beliefs, values, and local knowledge, finding the experts, never mind the consensus, is not that easy.

    Also, sometimes people are just exposed to different information, or the information exists in a form (e.g. musical knowledge) that isn’t easy to transfer. And for signaling or other game-theoretic reasons, people may not want to reveal it.

    I don’t think people who, say, short-sell companies think the current experts (i.e. the market) are biased that much; rather, they have inside knowledge that others don’t. But then again, short sellers are accountable for their beliefs, unlike political pundits etc.

    For what it’s worth, I do respect expert opinion, and advise others to do so almost everywhere experts can be found.

  • What if I hold all these contrarian views, but at the same time freely admit that I’m probably wrong because of all of the ways that my brain is likely to be tricking me into thinking that I’m smarter/less biased/more honest/whatever than everyone else?  That would make me better than everyone, right?

    • V V

       No, that would mean that your beliefs are inconsistent.

      • Hilariously, the same consistency of view that helped the idea of global warming become mainstream is now commonly used against any detractors with consistent views.  If anything, inconsistency of views within a group should be celebrated, not hammered into a more definable stereotype.

  • V V

    “Most cognitive biases in the heuristics and biases literature are negatively correlated with cognitive sophistication”

    “However, … we found very little evidence that these classic biases were
    attenuated by cognitive ability. More intelligent people were not
    actually less biased”

    Isn’t that a contradiction?

    • Good catch.  Was the study able to reproduce the (presumably) previously established relationship between degree of most biases and cognitive sophistication or not?

    • J O

      Ah.  I must not have been paying attention, I didn’t even notice.

      I too would like to know what the deal is.

    • Douglas Knight

      I think they are saying that their results contradict earlier results, not that they endorse contradictory positions.

      • Josh Burroughs

        Not so sure about that. In the conclusion: “Thus, the bias blind spot joins a small group of other effects such as myside bias and noncausal base-rate neglect in being unmitigated by increases in intelligence.”

        A better definition of bias blind-spot would be “the belief, in a person with median or above median levels of bias, that others exhibit more bias than they”. This does not seem to be what was measured.

    • Science is to be used to prove oneself wrong.  Any other use opens the door for cognitive bias.

  • V V

     “I seem to know a lot of smart contrarians who think that standard human biases justify their contrarian position.”

    Such as cryonics and singularitarianism?

    • officer_fred

       I think the error consists in thinking that BECAUSE biased people are against you, THEREFORE you are correct. When actually you should just decide things based on evidence. So the real reason to get your head frozen is not to impress people with your weirdness, but simply because you want to survive. Reversed stupidity is not intelligence, Hitler thought the sky is blue, etc., etc.

      • V V

         Just wanting to survive isn’t enough. In order for cryonics to be an appropriate decision, you need evidence that it significantly increases your expected lifespan.

        Typical arguments in favour of cryonics that you find on the Internet tend to be in the form: “People say it’s insane, but they are fools affected by conformity bias and availability heuristics, while I’m much smarter than them ha ha therefore they’ll go to Hell nothingness while I’ll live forever in Heaven Singularity with the Almighty God Friendly AI”.

      • Rationalist

        The arguments I have seen most widely promoted are object-level arguments about the amount of information that can be extracted from the connectome of your brain, the idea that if you make full use of the available information then it is surprising how much you can deduce. 

        The argument that you give is the response to “if the argument about information in the connectome is correct, then why isn’t everyone doing it?”

      • V V

         The arguments I’ve seen just assume that whatever is preserved by cryonics is enough to reconstruct the individual self, but this is typically not argued for. In fact, I’ve actually seen arguments against that.

        But even if the cryonic process preserves the self, it doesn’t automatically follow that frozen people will have their lifespans extended. A number of events, some arguably improbable, must occur in order to make that happen.

      • John

        “amount of information that can be extracted from the connectome of your brain, the idea that if you make full use of the available information then it is surprising how much you can deduce”

        And then again, these arguments typically say close to nothing about whether:
        a) an exact cellular-level copy of yourself is yourself;
        b) assuming preservation of identity on the conceptual level, the eventual switch of the hardware on which you are supposed to “run” (I hate to use that terminology, but it is appropriate for information geeks), from flesh to silicon or whatever, will not alter you beyond recognition (you know, the medium is the message).

      • This doesn’t sound typical to me. Can you point to an example of an argument taking this form?

      • V V

         “As the case of cryonics testifies, the fear of thinking really different is stronger than the fear of death.  Hunter-gatherers had to be ready to face death on a routine basis, hunting large mammals, or just walking around in a world that contained predators.  They needed that courage in order to live.  Courage to defy the tribe’s standard ways of thinking, to entertain thoughts that seem truly weird—well, that probably didn’t serve its bearers as well.  We don’t reason this out explicitly; that’s not how evolutionary psychology works.  We human beings are just built in such fashion that many more of us go skydiving than sign up for cryonics.”

        All the cryonics material on LW/OB seems to be on the same lines. I don’t read the cryonics-specific websites, but I suppose that if there were better arguments in favour of cryonics, they would have been echoed on these blogs.

      • The strong arguments are indeed presented much more exhaustively on cryonics specific websites. LW/OB are more oriented towards trying to fill in the gaps, e.g. why the heck people keep ignoring this.

      • oldoddjobs

        It’s easier to convince yourself that you are “following the evidence” than it is to follow the evidence. Also, it just kicks the can down the road – what constitutes correctly “following” will also be disputed.

  • This basically means: drop those IQ tests and instead use tests for cognitive bias. Afterwards you hold competitions in being unbiased.

    • Rationalist

      The problem is, there aren’t any. It might be a good idea for someone to create some. 

  • CouponClipper

    So, given a choice between adopting the consensus view of a group of individuals who I suspect are, on average, more intelligent than me or a group who I suspect are, on average, less intelligent than me… what should I do?

  • The case you’re thinking of does seem particularly exempt from the study’s domain, though. In particular, the contrarians are actively using their knowledge of bias, rather than claiming that it helps in a general way.

    Though I suppose the counterargument is that bias arguments would also lead to us generating such a reasonable excuse for our beliefs, though in fact we hold them for ignoble reasons.

    But while that last point is evidence for Robin’s conclusion, it doesn’t bear as strongly on whether this study is applicable to it.

  • Rationalist

    “And this is why I can justifiably disagree with an expert consensus”

    If you want to make the argument that you can’t disagree with credentialed experts using only an object-level argument, i.e. you need some cast-iron meta-level reason that they are wrong (they are biased, for example) then you end up having to believe all sorts of stupid crap. 

    For example priests are credentialed experts on religion, so that means that anyone following your rationale has to be a theist. 

    A cryonics proponent (for example) might say that cryobiologists are in the same relation to cryonics as priests are in relation to the existence of God: namely they have a social structure and a system of rituals and a consensus about the issue, but they are really talking nonsense.

    • Rationalist

      Of course if you start disagreeing with big groups of credentialed experts, you have to be *good* at telling the difference between correct and incorrect object-level arguments. Otherwise you end up like the timecube guy.

  • This accords with my own long-held view — and so is obviously correct! The more intelligent a person is, the more expertly they can rationalize their beliefs. Surely one major mark of true posthumans will be that this no longer applies.

    • HiO

       Homo Sapiens is the exception: the current crop of Homo sui amor is drawing at Monsanto DNA straws to dominate the Homo omnia amor.

  • gwern0

    What’s the point of justifying a controversial opinion on a subtle topic without any fulltext? How can anyone possibly judge or interpret this study without the actual article? From the snippets, I can’t even tell what data they may be discussing!

    • gwern0


      So, they actually did some measuring; this isn’t just a review of older work.

      One interesting quote:

      > Adults with more cognitive ability are aware of their intellectual status and expect to outperform others on most cognitive tasks. Because these cognitive biases are presented to them as essentially cognitive tasks, they expect to outperform on them as well. However, these classic biases happen to be ones without associations with cognitive ability. Not all classic cognitive biases show such a dissociation, however (Stanovich & West, 2008b), and these individuals with higher cognitive ability would have been correct in their assessment that they were more free of the bias. In short, our tasks created what might be termed a hostile environment (see Stanovich, 2009) for higher ability people to do self-assessments. Nonetheless, the finding of a lack of calibration in the domain of cognitive bias is important to reveal, however, because the heuristics and biases literature samples the space of cognitive tasks that are least connected to cognitive ability as measured by traditional intelligence tests (Stanovich, 2009, 2011).

  • Wildmike31

    The most telling part of this piece is the “if anything” in “if anything, the correlations went in the other direction”. Basically, they got a negative result and are trying to spin it into a pseudo-conclusion that will draw some attention.

  • FYI, Andrew Critch (Minicamp 2011) taught a math-department course on cognitive biases.  His students became correctly calibrated after a single round of feedback on the overconfidence bias, i.e., asked to name ten 90% confidence intervals, the true value was outside the bounds on an average of 1 item in 10.  This after just 1 round of feedback to illustrate their previous overconfidence.  So far as I know, this result is unprecedented and makes me less confident that other biases are genuinely unfixable for smart people.  I’ll nag Nisan about trying to replicate it at the June Rationality Minicamp; we actually had an overconfidence unit in May already, but failed to gather this data – though I did get exactly 1 exception out of ten 90% intervals on the part of his exercise I could do before running off elsewhere.
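
    The scoring rule behind that kind of exercise is simple enough to sketch. Here is a minimal illustration (all questions and numbers are invented, not Critch's actual materials), assuming each answer is a claimed 90% interval plus the true value:

```python
# Sketch of scoring a 90%-confidence-interval calibration exercise.
# Each answer is (low, high, true_value); all data below is invented.

def hit_rate(answers):
    """Fraction of intervals that contain the true value.
    A well-calibrated respondent should land near 0.90."""
    hits = sum(1 for low, high, truth in answers if low <= truth <= high)
    return hits / len(answers)

# Ten invented (low, high, truth) answers; exactly one miss.
answers = [
    (150, 300, 276),     # e.g. height of the Eiffel Tower's top floor, m
    (1900, 1930, 1912),  # e.g. year the Titanic sank
    (5, 15, 8),
    (100, 200, 170),
    (30, 60, 42),
    (1000, 5000, 3476),
    (0, 50, 71),         # the miss: truth falls outside the interval
    (2, 10, 6),
    (50, 90, 88),
    (10, 40, 25),
]

print(f"hit rate: {hit_rate(answers):.0%}")  # -> hit rate: 90%
```

    An overconfident respondent shows up as a hit rate well below 90%; Critch's students reportedly reached roughly one miss in ten after a single round of feedback.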

    • To be clear, I’m not claiming that it is impossible to train yourself to avoid certain cognitive biases, at least in some situations. I’m saying that your being smart is by itself no evidence that you have done so.

      • daedalus2u

        Yes, exactly. I went to MIT, and freshman year the most common statement uttered by many MIT freshmen was “that idea is counterintuitive”.

        Most of the freshmen at MIT were accustomed to being one of the smartest people in their classes, since kindergarten. Suddenly, half of them were below average! To spare themselves the narcissistic injury that comes from acknowledging that one is not as smart as one thinks one is, one has to justify why a particular idea is not immediately and intuitively understood. Obviously what is needed is to displace the difficulty in understanding onto the problem, which is difficult to understand because it is “counterintuitive”, not onto the “crappy intuition that was unable to correctly predict the correct answer to the problem”.

        Of course if you consider the problem to be counterintuitive, then you never bother to change your intuition so that it better corresponds with reality. This makes intuition pretty useless. On the other hand, if you do actively change your intuition when it is shown to be wrong, then over time that intuition can get to be pretty good. One does not get good intuition by being a contrarian. One gets good intuition by accurately perceiving when one is wrong and then actively changing one’s cognitive structures to prevent that error in the future.

        If one avoids perceiving errors because one is trying to signal allegiance to the person holding the erroneous idea, then one can never acquire good intuition. Any organization that puts loyalty ahead of being factually correct is doomed to be wrong and to get more wrong over time.

      • HiO

        one’s counter-intuitive judgment is biased toward the judge. A contemporaneous smoke screen useful in a failure to cogitate by the full-of-self ratiopaths.

      • > I’m saying that your being smart is by itself no evidence that you have done so.

        This is wrong. Being smart is evidence of being resistant to almost all biases; see Stanovich’s own work. What is surprising about biases is that the correlations of intelligence with unbiasedness on scores of biases are relatively low compared to the usual correlations of intelligence with cognitive tasks.

  • Pingback: Simoleon Sense » Blog Archive » Weekly Roundup 176: A Curated Linkfest For The Smartest People On The Web!

  • Mark M

    Hey, just because I’m biased doesn’t mean I’m wrong!  My biases tell me the truth without wasting my time with all that messy thinking.

    You can be sure you’re on the wrong path if you think you’re right just because you’re smart.

  • Pingback: More evidence that intelligence does not necessarily correlate to critical thinking ability. | The Thinker

  • Dimitri Klimenko

    If just knowing about biases isn’t good enough, then I think what is really needed is a tool for hammering the point much deeper into your brain (i.e. that *you* are biased, not just that *people* are biased).
    It seems like a systematic demonstration that can be either administered by yourself with some kind of tool (likely software), or in a small group, might be reasonably effective.

    Does anyone know of good exercises of this nature? If not, does a software-based bias self-demonstration tool seem like a good/viable idea?
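
    The core of such a tool could be quite small. As a sketch of what a self-administered demonstration might look like (the question, anchor values, and response data below are all invented for illustration), here is the classic anchoring setup: show an irrelevant random number before asking for an estimate, then compare the two anchor groups:

```python
# Minimal sketch of a self-administered anchoring-bias demonstration.
# Question, anchors, and responses are invented for illustration.
import random
from statistics import mean

QUESTION = "What percentage of African countries are members of the UN?"

def make_trial(rng):
    """Return (anchor, prompt): an irrelevant anchor is shown to the
    respondent before they give their own estimate."""
    anchor = rng.choice([10, 65])  # low vs. high anchor
    prompt = f"Is it higher or lower than {anchor}%? {QUESTION}"
    return anchor, prompt

def anchoring_effect(low_group, high_group):
    """Difference of mean estimates between high- and low-anchor groups.
    A clearly positive value demonstrates the bias to the respondent."""
    return mean(high_group) - mean(low_group)

# Generate one trial (anchor chosen at random):
rng = random.Random(42)
anchor, prompt = make_trial(rng)

# Invented example responses from two groups of respondents:
low_anchor_estimates = [25, 30, 20, 35]
high_anchor_estimates = [45, 50, 40, 55]
print(anchoring_effect(low_anchor_estimates, high_anchor_estimates))  # -> 20.0
```

    The point of a tool like this is that the respondent sees their own estimates shift with the anchor, which demonstrates the bias on *them* rather than on “people” in general.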

  • Asf

    The majority is always right, obviously. Because groups of people can’t be more biased than single people.

  • Pingback: Econhacks » Blog Archive Links of the Week 6/15 » Econhacks

  • Pingback: 10 Weekend Reads | The Big Picture

  • Pingback: Trade Confidently and Safely...

  • That is because people think of themselves in relation to others.  They adopt titles, groups, maybe even “non-conformity”, but how many adopt actual morals and apply them to their own actions before others’?  The idea of a bias blind spot should immediately highlight the possibility that you are being biased about whether someone else knows he is biased, while he does the same thing to you.  If anything, psychology should be applied to the self before others.

  • Pingback: Who would benefit from a “Council of Psychological Advisors?” « Latent Paradigm

  • Pingback: Overcoming Bias : Prefer Contrarian Questions

  • Pingback: Castles and Tents |

  • How can someone be meaningfully assessed as being relatively intelligent except by a tendency to measurably answer questions correctly?

    And, how can a bias be meaningfully determined except by a tendency to measurably answer questions incorrectly?

    The conclusions of this research boil down to “a = not a.” You don’t need to know anything about their data or methodology to understand that they’ve made a mistake somewhere.

  • Pingback: Knowledge vs Action: Three Road Blocks Keeping You From Better Fitness - Binary Reveux

  • Pingback: Why Stefan Molyneux is Wrong about Libertarians