I have so far found most discussions and debates about the correction of cognitive "biases" very confusing, including most of the posts on this blog. Why? Because I find the very idea of a cognitive bias confusing any time I really start to think about it. A bias is a bias only relative to some standard. The cognitive shortcuts and blind spots identified in the heuristics and biases literature may look like "failure" when laid against some idealized conception of rationality, but why should we care about such conceptions of rationality anyway? A hip hop dancer is making constant "mistakes" from the perspective of the formal norms of ballet, but why on Earth would you judge hip hop from the perspective of ballet? You wouldn’t. I’m making a "mistake," in some sense, by failing to have abs like a Spartan in 300. But so what? And in the absence of normatively binding reasons to conduct ourselves cognitively according to the principles of idealized Rationality, cognitive "biases" may not be biases at all. Indeed, they may well be optimal relative to some other standards we have reasons to care about.
I have become convinced, from reading contemporary cognitive science, neuroscience, and Hayek, that Reason is no part of our biological endowment, and that Rationality is an "unnatural," culturally-transmitted set of cognitive norms. As I’ve analogized before, Rationality is to the mind as ballet is to the body. Failure to adhere to the standards of the canons of rationality — decision theory, game theory, formal logic, Bayes's Rule, etc. — is a failure to cognize balletically, but is not a mistake unless one was trying to cognize balletically, or was trying to accomplish something that requires that kind of highly polished cognition as an instrument. Reason, it turns out, is damn good for a huge number of things, has made our lives unimaginably better, and deserves nothing but hymns of praise.
Nevertheless, I think it is important to acknowledge that the project of this blog — the project of debiasing-as-making-Rational — is about adjusting our cognitive behavior to live up to a particular set of cultural ideals, not about living up to our "nature" as putatively rational beings. People guilty of cognitive biases are failures in the sense that people who don’t have the discipline to hold down a job are failures. Shame! But if we look at this kind of "failure" from an outside perspective, as detached but interested consumers and critics of our own cultural norms, then we’ve got to ask: so what? The cultural project of debiasing is about cultivating norms that prevent people from shifting to or feeling comfortable in the "so what?" perspective. What can we say in favor of this cultural project?
Libertarian debiasers favor technological and market debiasing techniques. Paternalist debiasers favor elite-managed policy debiasing techniques. Both, I think, need to face up more fully to the particular cultural construction of Rationality standing behind the desire for these techniques, and the lack of intrinsic normative oomph therein. It simply isn’t obvious that this cultural ideal about the refinement and deployment of our native capacities is one reasonable people cannot reasonably (or even Rationally) reject. So I’m "biased." So we’re all "biased." So what?
If there is some cost to debiasing, then maybe I don’t want to buy very much of it. Why should I? What’s in it for me? It seems that the answer might be: not much, unless enough other people coordinate on the cognitive norm. If the individual advantages of being less "biased" are contingent on many other people debiasing first (or at the same time), what kind of problem is that? I think it’s a rather diffuse and maddening <em>cultural</em> problem (if I’m convinced it counts as a problem at all). There is little demand in our democratic society for political representatives inclined to appoint competent debiasing bureaucrats, and so unless there is some kind of bloodless coup of behavioral economists, we’re not going to get any. Paternalist hopes are misplaced in the absence of a cultural shift, at least among elites. Libertarian debiasers hoping for new institutional technologies, like betting markets for ideas, face a similar kind of political problem. Legislators and/or bureaucrats have to act to make these kinds of markets legal. Why would they want to do that? Where does the demand come from? We probably need a bit of cultural ferment before we get there.
So, how do we catalyze this cultural ferment? Blogging? Actually, I think that’s partly the way. New technologies that do not require political approval, but which dramatically decrease the cost of communication among elites, can create openings for the transmission of new norms. So I think we’re actually doing something useful here. And if new technologies create increasing economic returns for certain kinds of debiased individuals (can anyone name such technologies?), we should expect to see more of them, and we should expect those individuals to demand cultural norms that help them rationalize, justify, and psychologically sustain their economic interest in debiasing. If there is reason to be bullish on the adoption of the cultural norms of Reason, then we have reason to be bullish on the eventual adoption of debiasing institutions, paternalist or libertarian.
For the first time on this forum, I find myself agreeing almost entirely with a blogger's original post. Just thought I'd encourage Wilkinson and others who would follow his lead that he has struck a chord.
Wilkinson, I addressed many of these issues in Why Truth? and What's a Bias? If you find fault with my reasoning there, feel free to post it in a comment here. Incidentally, I aspire to dramatize and romanticize rationality because I believe Rationality is a dramatic and romantic endeavor, one of the great high melodies sung in the unfolding epic poem of Humankind.
I agree with Glen that truth is a subgoal of nearly any goal that requires cognition, up to and including walking across a room. When you successfully locate and sit down on a chair, you are committing an act of truthfinding no less than believing that humans evolved by natural selection. The politics and arguments and surrounding verbal bibblebabble are more complicated in the second case, but the math is the same.
Robin, we've differed on this before, but I still object to your calling people who deliberatively endorse truth yet engage in self-deluding behavior "hypocrites". Traditionally, a "hypocrite" is someone who verbally advocates a morality which they do not privately believe. Many people claim to believe in a morality, and internally believe that they believe in it, yet commit some acts not in accordance with it; these people are traditionally called "sinners". People who say they believe in truth (honestly, with no knowing intent to deceive you) and then self-deceive are sinners, not hypocrites.