41 Comments

How can someone be meaningfully assessed as relatively intelligent except by a measurable tendency to answer questions correctly?

And how can a bias be meaningfully determined except by a measurable tendency to answer questions incorrectly?

The conclusions of this research boil down to "a = not a." You don't need to know anything about their data or methodology to understand that they've made a mistake somewhere.


> I'm saying that your being smart is by itself no evidence that you have done so.

This is wrong. Being smart is evidence of being resistant to almost all biases; see Stanovich's own work. What is surprising about biases is that, across scores of them, the correlations of intelligence with unbiasedness are relatively low compared to the usual correlations of intelligence with performance on cognitive tasks.


Homo sapiens is the exception: the current crop of Homo sui amor is grasping at Monsanto DNA straws to dominate the Homo omnia amor.


One's counter-intuitive judgment is biased toward the judge: a contemporaneous smokescreen, useful to full-of-self ratiopaths in their failure to cogitate.


The majority is always right, obviously. Because groups of people can't be more biased than single people.


If just knowing about biases isn't good enough, then I think what is really needed is a tool for hammering the point much deeper into your brain (i.e. that *you* are biased, not just that *people* are biased). A systematic demonstration, administered either by yourself with some kind of tool (likely software) or in a small group, might be reasonably effective.

Does anyone know of good exercises of this nature? If not, does a software-based bias self-demonstration tool seem like a good/viable idea?
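
As a concrete starting point, here is a minimal sketch of what one round of such a tool might look like, in Python. The numbers are the classic Casscells base-rate problem; everything else (the function names, prompts, and structure) is just my invention, not an existing tool:

```python
# Hypothetical sketch of one round of a bias self-demonstration tool.
# The point: record YOUR answer first, then reveal the normative answer,
# so the gap is demonstrably your own, not "people's".

def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), by Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

def run_round():
    print("1 in 1000 people have condition X. The test catches every true")
    print("case, but also flags 5% of healthy people as positive.")
    guess = float(input("A random person tests positive. Chance they have X, in %? "))
    answer = 100 * bayes_posterior(prior=0.001, sensitivity=1.0,
                                   false_positive_rate=0.05)  # roughly 2%
    print(f"Your gut: {guess:.1f}%.  Bayes: {answer:.1f}%.")
    print(f"That {abs(guess - answer):.1f}-point gap is yours, not 'people's'.")

if __name__ == "__main__":
    run_round()
```

Log the gap across many such rounds and show a running tally; seeing one's own error history, rather than a textbook's, is presumably what would do the hammering.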


Fulltext: http://dl.dropbox.com/u/531...

So, they actually did some measuring; this isn't just a review of older work.

One interesting quote:

> Adults with more cognitive ability are aware of their intellectual status and expect to outperform others on most cognitive tasks. Because these cognitive biases are presented to them as essentially cognitive tasks, they expect to outperform on them as well. However, these classic biases happen to be ones without associations with cognitive ability. Not all classic cognitive biases show such a dissociation, however (Stanovich & West, 2008b), and these individuals with higher cognitive ability would have been correct in their assessment that they were more free of the bias. In short, our tasks created what might be termed a hostile environment (see Stanovich, 2009) for higher ability people to do self-assessments. Nonetheless, the finding of a lack of calibration in the domain of cognitive bias is important to reveal, however, because the heuristics and biases literature samples the space of cognitive tasks that are least connected to cognitive ability as measured by traditional intelligence tests (Stanovich, 2009, 2011).

To enlarge, because other commenters don't seem to understand this: "smarter people on almost all biases are indeed less biased than stupider people, however, in a few very carefully chosen biases, we can manage to make smart people look biased because we pretend to not know the previous results." This does *not* bear the interpretation that Hanson wants to put on it.

See the discussion in http://lesswrong.com/lw/d1u...


Hey, just because I'm biased doesn't mean I'm wrong!  My biases tell me the truth without wasting my time with all that messy thinking.

You can be sure you're on the wrong path if you think you're right just because you're smart.


Yes, exactly. I went to MIT, and freshman year the most common statement uttered by many MIT freshmen was "that idea is counterintuitive".

Most of the freshmen at MIT were accustomed to being one of the smartest people in their classes, since kindergarten. Suddenly, half of them were below average! To spare themselves the narcissistic injury that comes from acknowledging that one is not as smart as one thinks one is, one has to justify why a particular idea is not immediately and intuitively understood. Obviously what is needed is to displace the difficulty in understanding onto the problem, which is difficult to understand because it is “counterintuitive”, not onto the “crappy intuition that was unable to predict the correct answer to the problem”.

Of course if you consider the problem to be counterintuitive, then you never bother to change your intuition such that it better corresponds with reality. This makes intuition pretty useless. On the other hand, if you do actively change your intuition when it is shown to be wrong, then over time that intuition can get to be pretty good. One does not get good intuition by being a contrarian. One gets good intuition by accurately perceiving when one is wrong and then actively changing one's cognitive structures to prevent that error in the future.

If one avoids perceiving errors because one is trying to signal allegiance to the person holding the erroneous idea, then one can never acquire good intuition. Any organization that puts loyalty ahead of being factually correct is doomed to be wrong and to get more wrong over time.


References?


To be clear, I'm not claiming that it is impossible to train yourself to avoid certain cognitive biases, at least in some situations. I'm saying that your being smart is by itself no evidence that you have done so.


FYI, Andrew Critch (Minicamp 2011) taught a math-department course on cognitive biases.  His students became correctly calibrated after a single round of feedback on the overconfidence bias, i.e., asked to name ten 90% confidence intervals, the true value was outside the bounds on an average of 1 item in 10.  This after just 1 round of feedback to illustrate their previous overconfidence.  So far as I know, this result is unprecedented and makes me less confident that other biases are genuinely unfixable for smart people.  I'll nag Nisan about trying to replicate it at the June Rationality Minicamp; we actually had an overconfidence unit in May already, but failed to gather this data - though I did get exactly 1 exception out of ten 90% intervals on the part of his exercise I could do before running off elsewhere.
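
For anyone wanting to replicate this informally, here is a minimal sketch of the scoring loop in Python. This is my own reconstruction, not Critch's actual materials, and the two trivia items (with widely cited true values) are placeholders for a real ten-item round:

```python
# Sketch of the ten-interval calibration drill described above;
# a reconstruction under stated assumptions, not the original exercise.

QUESTIONS = [
    # (question, true_value); a real round would have ten such items
    ("Length of the Nile, in km?", 6650),
    ("Year Bell patented the telephone?", 1876),
]

def elicit_interval(question):
    """Ask for bounds the subject is 90% sure contain the true value."""
    lo = float(input(f"{question}\n  lower bound: "))
    hi = float(input("  upper bound: "))
    return lo, hi

def run_round(questions):
    hits = 0
    for question, truth in questions:
        lo, hi = elicit_interval(question)
        hits += int(lo <= truth <= hi)
    n = len(questions)
    # A calibrated subject's 90% intervals should contain the truth
    # about 0.9 * n times, i.e. miss roughly 1 item in 10.
    print(f"\n{hits}/{n} intervals contained the true value "
          f"(calibrated target: ~{round(0.9 * n)}/{n}).")

if __name__ == "__main__":
    run_round(QUESTIONS)
```

Run one round, show the miss count, then run a second round with fresh questions; per the result above, a single feedback cycle may be enough to pull smart subjects to roughly the 1-in-10 miss rate.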


The strong arguments are indeed presented much more exhaustively on cryonics specific websites. LW/OB are more oriented towards trying to fill in the gaps, e.g. why the heck people keep ignoring this.


The most telling part of this piece is the "if anything" in "if anything, the correlations went in the other direction". Basically, they got a negative result and are trying to spin it into a pseudo-conclusion that will draw some attention.


 "As the case of cryonics testifies, the fear of thinking really different is stronger than the fear of death.  Hunter-gatherers had to be ready to face death on a routine basis, hunting large mammals, or just walking around in a world that contained predators.  They needed that courage in order to live.  Courage to defy the tribe's standard ways of thinking, to entertain thoughts that seem truly weird—well, that probably didn't serve its bearers as well.  We don't reason this out explicitly; that's not how evolutionary psychology works.  We human beings are just built in such fashion that many more of us go skydiving than sign up for cryonics."

http://lesswrong.com/lw/1mc...

All the cryonics material on LW/OB seems to be on the same lines. I don't read the cryonics-specific websites, but I suppose that if there were better arguments in favour of cryonics, they would have been echoed on these blogs.


What's the point of justifying a controversial opinion on a subtle topic without any fulltext? How can anyone possibly judge or interpret this study without the actual article? From the snippets, I can't even tell what data they may be discussing!
