
I disagree with the above comment that research into biases has not been conducted, and even more so with the idea that education on bias is absent from our education system (I think it is a standard college course for most undergraduates). As for "debiasing" -- well, the act of "debiasing" described above is really no different from a bias, in that it indoctrinates a person into your particular world-view. Your particular world-view may be rooted in the techniques of critical thinking, which is why it is listed among the premises above that must be accepted for the argument to make sense. I'd disagree with that premise. So-called "critical thinking" is very much a bias.

Empirically speaking, logical reasoners are some of the most biased people I know, particularly in their unwillingness to consider any topics or possibilities that aren't reasoned according to their rules of critical thinking (which are taught to them in a system that is very much an indoctrination). In fact, many logical reasoners prefer simply not to SPEAK with people who aren't capable (say, through lack of training or "indoctrination") of forming "logic-game" style arguments.

Many of the rules of logic simply turn conversation and exploration into a "game" of sorts, where whoever can reason better and prove their case better "wins".

Being more logical does NOT make a person right. Being UNBIASED (which, it could be argued, is a state that does not exist) does not make a person right.

Nor does "winning" an argument by proving the logical flaws of your "opponent" make you right, although many lawyers might argue the opposite.

Critical thinking and logical reasoning are wonderful tools, but when you speak of them in terms of "debiasing" and "education" you commit exactly what you are supposedly trying to avoid. Critical thinking should most certainly not be a religion, but its adherents are rarely any different from the "fanatics" they are trying to cure.

In practical terms, a fella dislikes talking to smarmy "you're biased and I'm right" types as much as he does door-to-door Seventh-day Adventists.

All that being said, I'm all for "debiasing" (i.e., brainwashing) experiments that indoctrinate people into the system of critical thinking. Sounds like fun. Parties are better when you have folks to offset the religious nutcases. Put them in a pit for gladiator combat for the amusement of the masses. We should just avoid kidding ourselves about what's being done. Go forth and multiply your flock, o prophet. Teeheehee!

-MRAPS

I like critical thinking books better than religious texts. They're usually shorter.


In the fall of '08, I will begin teaching high school social studies in Fairfax County. For my master's degree, I need to conduct some kind of research with my students. If you think there is a worthwhile experiment that could be done with only a few teachers and classrooms of students (who are already of high school age), I would be happy to help if I can. I will be student teaching in the spring, but my ability to implement any kind of experiment then will be more limited.


Douglas Knight: part of the problem with 'uncontroversial things' like reading stems precisely from various forms of bias in the decision-making bodies (not to mention voters, politicians, etc.). Starting with bias rather than 'easier things' might prove more difficult, but I think even small positive results will have far-reaching positive effects.


Tom McCabe, yes, people who've studied H&B at all probably won't make explicit conjunction errors if you tell them it's a test of bias, but my understanding is that such people are still highly susceptible to anchoring (When did Einstein first come to the US? 1200?) and statistically visible conjunction fallacies (asking different groups of people different questions--though I have to wonder if this should also be filed under anchoring).
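For concreteness, here is a minimal sketch of that between-groups design, where one group rates the probability of a statement and another rates the probability of the statement plus an extra detail; all the ratings, group sizes, and labels below are invented for illustration:

```python
# Sketch of a between-groups conjunction test: group A rates P(statement),
# group B rates P(statement AND extra detail). The fallacy is "statistically
# visible" if group B's ratings run reliably higher. All data is made up.
from scipy.stats import mannwhitneyu

group_a = [0.30, 0.25, 0.40, 0.20, 0.35, 0.30, 0.45, 0.25]  # P("bank teller")
group_b = [0.55, 0.60, 0.45, 0.50, 0.65, 0.40, 0.55, 0.60]  # P("teller AND feminist")

# One-sided nonparametric test: is the conjunction rated more probable
# than its own conjunct across groups?
stat, p = mannwhitneyu(group_b, group_a, alternative="greater")
print(f"U = {stat}, p = {p:.4f}")
if p < 0.05:
    print("Conjunction rated more probable than its conjunct: fallacy visible.")
```

Because no subject ever sees both versions, nobody can notice they're being tested for the conjunction fallacy, which is the whole point of the between-groups framing.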

An easy research program would be to explore the debiasing effect of simply studying H&B. My impression is that this is not being pursued because H&B is seen as descriptive, not normative. It also seems to me that there is more work that takes the position that H&B describes humans working optimally (and not even because of bounded rationality, but really optimally!) than work that takes the position that humans can do better.


This is ambitious to the point of being silly. It's hard enough to get people to do experiments teaching easily measured uncontroversial things, like reading, and even harder to get establishments to pay attention to the results of those experiments.


"*How* to measure them is another big problem."

For naive subjects, you can just hand them a set of fifty or so experiments listed in *Judgment Under Uncertainty* and see how well their answers match reality. For people who have studied H&B *at all*, this won't work, because the minute you tell them "This is a test of cognitive bias", they'll start to think about the well-established ways in which they might be biased, which wouldn't happen in a real-world situation.
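As one concrete (and hypothetical) way to score "how well their answers match reality" on questions with known answers, you could use a Brier score over the subject's stated probabilities; the answers and outcomes below are placeholders:

```python
# Scoring stated probabilities against known true/false outcomes with a
# Brier score: lower is better, and a constant 50% guesser scores 0.25.
# The subject's answers and the "reality" column are placeholders.
def brier_score(probabilities, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

subject_answers = [0.9, 0.8, 0.95, 0.6, 0.99]  # stated confidence per question
reality         = [1,   0,   1,    1,   0]     # what was actually true
print(f"Brier score: {brier_score(subject_answers, reality):.3f}")
```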


Maybe we need to first develop a bias scale, a way of measuring how biased a thinker is. This would be useful to other aspects of the psychology of bias (e.g. what personality traits promote it, do people become generally more biased when stressed, etc.) and act as a way of testing whether a program works. I see a big problem in that there are numerous biases, and making a scale for them all might require an excessively complex test, but tests measuring just a few of the major biases might be useful. *How* to measure them is another big problem.
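As a sketch of what a scale covering just a few of the major biases might look like, here is one way to standardize a handful of subtest scores and average them into a composite; the subscale names and all numbers are invented:

```python
# Combining a few per-bias subtests into one composite "bias scale":
# standardize each subscale so they're comparable, then average per subject.
# Subscale names and raw scores are hypothetical placeholders.
from statistics import mean, stdev

subscales = {  # higher raw score = more biased on that subtest
    "anchoring":      [12, 15, 9, 20, 14],
    "conjunction":    [3, 5, 2, 6, 4],
    "overconfidence": [0.30, 0.10, 0.45, 0.25, 0.20],
}

def zscores(xs):
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

standardized = {name: zscores(xs) for name, xs in subscales.items()}
composite = [mean(subject) for subject in zip(*standardized.values())]
print("Composite bias scores:", [round(c, 2) for c in composite])
```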

There have been some attempts at teaching critical thinking, with varying results (see e.g. http://www.arts.monash.edu/... at the end), but these have apparently been framed as traditional logic, argumentation analysis, etc., and not as directly reducing major biases.

Would there be a market for courses reducing bias? At least some people obviously think it is an Important Thing, so there is some market for it. And in some professions, like police work or certain kinds of expertise, there might be a demand that practitioners reduce their biases before being hired.


While reading the quoted Lilienfeld article, I became concerned that terms like "bias" are being used almost as shibboleths.

Apply a little General Semantics here. Is the bias[1] that seems necessary for fanaticism the same as the bias[2] that a course in critical thinking could hope to cure? I don't think so.

In particular, of the 3 biases he specifies, 2 don't fit what he's saying.

Bias #1 ("dissenters are evil") seems not at all amenable to being educated out of. This is both because it will apply to any such educators and because this model of bias #1 has cause and effect backwards or at least sidewards. It seems to come from associated social effects at least as much as they come from it.

#2 ("I'm less biased than others") has an intrinsic problem so serious that the writer is committing it himself as he writes. I'm not accusing him of hypocrisy, I'm accusing him of a hopelessly circular point #2.

Bias #3, agreed.

On a slightly different note, I'm reminded of observations in Pascal Boyer's *Religion Explained*. He concludes in an early chapter that religious beliefs are not generated by a lowered threshold of skepticism; rather, the belief comes first, and the credulity follows when problems with the belief need to be rationalized away. I'm paraphrasing him from memory, and I recommend the book.


"Tom, I'm sure Scott has very particular procedures in mind for constructing "well-validated attitudinal and behavioural measures of ideological fanaticism"; it seems unfair to call this "word salad.""

You can make the best proposal in the world, e.g., "Stop the Nazis from shooting innocent children", sound like this. Translated into word salad, it would read something like:

"We should implement a clear, effective, and comprehensive plan for ensuring that the present circumstances do not impede the mental and physical development of the youth in the labor camp system."


Tom, I'm sure Scott has very particular procedures in mind for constructing "well-validated attitudinal and behavioural measures of ideological fanaticism"; it seems unfair to call this "word salad."


"So, the most important psychological experiment never done would (1) begin with the construction of a comprehensive evidence-based educational programme of debiasing children and adolescents in multiple countries against malignant biases, (2) randomly assign some students to receive this program and others to receive standard educational curricula, and (3) measure the long-term effects of this debiasing program on well-validated attitudinal and behavioural measures of ideological fanaticism."

Beautiful Engfish. This kind of language pervades the entire school system, because there's no readily available metric for education, and so nobody can clearly define the curriculum. If you don't know what your end goal is, and you don't know what steps you should take to get there, you're forced to spout out sentences like:

"graduates, having completed a rigorous academic program, are lifelong learners who think critically and communicate effectively.""graduates live their faith through their actions, recognize the interconnectedness of our world, and act on their obligation to right injustice.""graduates accept responsibility for their own education and actions and in the spirit of Christian gentleman treat people with respect."

This kind of word salad can mean anything; that's the whole point, because if you're never specific, nobody can argue with you or sue you. You might as well say that people should be good and not bad, and be done with it.


This deserves a post of its own:

Before you launch massive experiments in school curricula for children, an intermediate step would be to develop training programs for adults, and, even more importantly, some way of verifying that they worked. Typical current experiments in debiasing are a one-hour lecture and a two-week follow-up.

The most important thing would be creative ways of verifying that people had decreased in bias over a multi-month, multi-year period. Evidence-based Bayescraft, so that the whole project doesn't flush itself down the tubes like the psychoanalysis fad. If you can verify rationality, you can run as many experiments as required to figure out how to teach it.

The number one obstacle here is the means of verification: above all, that the measures of rationality are correlated with real-life measures of effectiveness. (As has been so thoroughly established for IQ, contrary to popular misconceptions.) Find ways to measure personal rationality and validate the measurements, and everything else will follow from that.
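A minimal sketch of that validation step, assuming you already have a candidate rationality measure and some real-life effectiveness proxy to check it against; both columns of numbers below are invented:

```python
# Validating a candidate rationality measure the way IQ was validated:
# check that it correlates with real-life measures of effectiveness.
# Both data columns are invented for illustration.
from scipy.stats import pearsonr

rationality_score = [52, 61, 45, 70, 58, 66, 49, 63]          # candidate measure
life_outcome      = [3.1, 4.0, 2.8, 4.5, 3.5, 4.2, 2.9, 3.8]  # effectiveness proxy

r, p = pearsonr(rationality_score, life_outcome)
print(f"r = {r:.2f}, p = {p:.3f}")
# A measure that fails this check isn't worth building curricula around.
```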


The greatest obstacle to conducting this experiment ... is the surprising paucity of research on effective debiasing strategies.

IMO the greatest obstacles to any such test would be political. Imagine the reaction the instant the debiasing strategies were linked with truths contrary to the prevailing ideologies almost anywhere.


People are mostly biased into increased certainty. There might be people who say "I don't want to look at the evidence, I want to maintain my belief that it's all 50:50 and nothing affects it", but I think in general that isn't as much of the problem. Uncertainty is disturbing and unpleasant; it's easier when you think you know.

When you admit to uncertainty it gets harder to take decisive action. There's the temptation to go get more data, to find out the things you don't know and then act. "What if we're wrong? What's the worst that can happen?" If you think seriously about what's the worst that can happen, it's hard to leave your house. But if you're sure what will happen you can just march out and do things, and then when something unexpected comes along you can deal with it.

And so we get "Who could have imagined that Saddam didn't have a nuclear program? The whole world thought he did." And "Who could have imagined that Saddam would lie about the size of his oil reserves?" If we'd thought seriously about such things we probably wouldn't have invaded. We'd have dithered.

[Cue picture of buzzards with the caption "I'm tired of waiting. I'm going to go kill something."]

People prefer their biases. Biases provide excitement; they allow a life of adventure. Imagine an opera about unbiased people. Boring. Imagine an action-adventure film about unbiased people. Boooring.

There's no market. Lots of people like to imagine themselves as conquering warriors and proud princesses and such, and the only place in the story for the unbiased is as the boring old advisors who give such prudent advice that, if it got followed, there wouldn't even be a war to win. Imagine the *Iliad* with unbiased people. Booooring.


Michael, they would only be justified in believing that they are much less biased than others if they in fact had evidence supporting that conclusion.


If such a program worked, wouldn't its graduates still believe "that they are objective so others who disagree must be foolish, irrational, or evil" (or at least badly misinformed) and that "they are not biased but others are"? They would just now be correct in holding those beliefs. The real test of any such program would be how rapidly its graduates could reach agreement upon meeting one another.
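As a crude toy model of that agreement test (not genuine Aumann-style updating; the update rule and all numbers are invented), you could count how many rounds of mutual updating two graduates need before their estimates converge:

```python
# Toy "speed of agreement" measure: two agents announce estimates and
# partially update toward each other each round; count rounds until they
# agree within a tolerance. A caricature, not real Bayesian agreement.
def rounds_to_agreement(est_a, est_b, weight=0.5, tol=0.01, max_rounds=100):
    for rounds in range(1, max_rounds + 1):
        est_a, est_b = (est_a + weight * (est_b - est_a),
                        est_b + weight * (est_a - est_b))
        if abs(est_a - est_b) < tol:
            return rounds
    return max_rounds

print(rounds_to_agreement(0.9, 0.2))              # eager updaters agree fast
print(rounds_to_agreement(0.9, 0.2, weight=0.1))  # stubborn ones take longer
```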

Really, isn't this what higher education has always nominally intended to do? If it doesn't already do it well, it must either be hard to do well or hard to set up incentives to do well.
