Pretty interesting observation.

There are basically two ways to deal with an observed bias. First, try to be aware of it and counter it. But I agree with the late Tversky that this is a lot easier said than done.

The second method is simply to avoid making the decisions where you have reason to believe your biases are too great to overcome.

For example, suppose I tend to run red lights under certain circumstances, endangering myself and others. I could try to pay more attention to spotting red lights, or I could simply stop driving in those circumstances.

Our entire body of due diligence law, which protects consumers from fraud, assumes that only the first method is valid: viz., giving investors, purchasers, or consumers more information or knowledge to counter their biases. Many times, the second option is better: just don't make the decisions you know you are lousy at.

Alexis and HH, Good points. And HH, thanks for the advice!

"HH, wouldn't closing my stance make me pull it less not more (I'm a rightie)?"

It depends on your upper body position once you adjust the footwork. Most people who change their stance don't adjust their upper body position: your upper body remains perpendicular to the front edge of home plate, even if your feet are at a more closed angle. This means that in your stance you've already "started" your swing, and the bat has a shorter path to the ball. Your swing basically gets faster, and you pull the ball more. [I've seen people lose power doing this, but I doubt that'll happen to you if you're already a push hitter.] If you're the type of person who would rotate their upper body as well [facing home plate at the same angle as your feet], then you'd probably pull the ball less.

In fact, thinking about the stance has made me think about how to successfully correct bias. Presumably, when we identify a bias, we can also identify the optimal outcomes we would reach in the absence of said bias. In the batting case, that would mean pulling the ball more. Changing the stance is essentially finding a way to bring about the desired outcome whilst still giving in to the bias: you can swing late and still achieve the desired outcome. I think commitment devices and other early actions ["stance changes"] that produce the desired outcome regardless of whether the bias is corrected or not are probably a lower-cost and more realistic way for most people to reduce the effects of bias. In effect, tying yourself to the mast is probably more likely to succeed than learning to ignore the sirens' song. ...As long as you can be unbiased enough to tie yourself to the mast.

That said, I have to agree with Alexis: one can train oneself to avoid biases, though at first it's difficult and it takes time. It's like driving a stickshift: it owns your whole brain at first, but after a while it's second nature. [I now notice false dichotomies that people use almost instinctively.]

David said:

"But I do think my point is slightly different. It's ... that already-identified biases are hard to overcome because ... the nature of the task you're doing may be such that you can't simultaneously worry about correcting bias and competently perform the other elements of the task."

My comment addressed the specific point you are making -- that focusing on avoiding a particular bias disrupts the necessary balance with other "elements of the task", such as other kinds of bias-avoidance and the primary reasoning task. I suggested that this problem is normal and that maybe the solution is normal too. In many learning situations, a new skill disrupts old skills until it is internalized. After internalization, it can be integrated more smoothly with old skills because it requires less conscious attention; or, as you put it, you're no longer "worrying about correcting bias" simultaneously with performing the task.

So my hypothesis is that the problem is solved via conscious exercises focused on bias-awareness, leading to an unconscious, internalized awareness of that bias. Is this true? One way to test it, as I said, is to consider whether repeated efforts to sensitize yourself to a bias have generally resulted in oversensitivity or in a more balanced weighting of that bias. No one followed up on my comment, but this still strikes me as an empirical question that would move the issue forward.

Personally, it *has* been true for me as regards certain kinds of hindsight bias. Has it been true for you in any cases you remember? Has the opposite been true?

frelkins said:

"Or at the very least, take your current project plan and run Monte Carlo VaR on it. Some people like Palisades' MC tool, but frankly I've found a market is more accurate."

I had to do a little work to unpack the acronyms and typos there. VaR is "Value at Risk". See http://en.wikipedia.org/wik.... "Palisades'" should have been "Palisade's", since there are semi-relevant companies named both Palisade and Palisades. The URL is http://www.palisade.com/, not http://www.palisadesresearc.... "MC" is Monte Carlo.
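
Since VaR comes up twice in this thread, a minimal sketch of what it actually computes may help. This uses made-up, normally distributed profit-and-loss scenarios; the distribution, numbers, and variable names are purely illustrative and have nothing to do with Palisade's tool or any real market:

```python
import numpy as np

# Hypothetical profit/loss outcomes across 100,000 simulated scenarios.
# The normal distribution is just a stand-in; real P&L is rarely this tame.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=10.0, scale=50.0, size=100_000)

# 95% VaR: the loss exceeded in only 5% of scenarios, i.e. the negated
# 5th percentile of the P&L distribution.
var_95 = -np.percentile(pnl, 5)
print(f"95% VaR: {var_95:.1f}")
```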

The above comment is me. I'm commenting from my brother's computer in Israel and that's screwing things up a bit.

The various quotes make it clear that I'm not the first one to have had something like this insight. But I do think my point is slightly different. It's not just that already-identified biases are hard to overcome because of the direct difficulty of overcoming them, but additionally that they're hard to overcome because the nature of the task you're doing may be such that you can't simultaneously worry about correcting bias and competently perform the other elements of the task.

HH, wouldn't closing my stance make me pull it less not more (I'm a rightie)?

So, right here, as pointed out by the latter John Maxwell, is perhaps a bias that we might correct. We might employ technology to make ourselves better.

>But most of the false statements in comments on this blog draw no response, so lack of responses is at best very weak evidence of the truth of a comment.

That's why we need a better commenting system like Disqus so that individual threads will be easier to keep track of, thus encouraging replies.

By the way, I'm not the same John Maxwell that wrote the comment retired urologist quoted. I think it was my dad: he also reads this blog, and we both have the same name.

And then there's Jon Elster (Explaining Social Behavior (Cambridge, 2007), p. 128, n. 7):

Knowing that one may be subject to bias is one thing; being able to correct it is another. Studies show that deliberate attempts to debias one's judgment are of little value, since one easily falls into the traps of insufficient correction, unnecessary correction, or overcorrection. One may learn to distrust one's judgment, but it is harder to improve it. If one were able to, there might be no need to.

Back when I was taking a class from Tversky, he made a similar analogy. He noted that psychologists have a huge catalog of ways the human perceptual system is inaccurate. But simply knowing that things look farther away on hazy days doesn't magically make you a better judge of distance. He said it's exactly the same with cognitive biases.

Daniel Kahneman, Tversky's long-time collaborator, made a similar point in a recent interview:

Does your research offer insights to individuals with regard to how they might change their own behavior?

Yes, potentially it does, but they are not insights that are very easy to use. You can be aware of mistakes you make, [but] it's very difficult to learn to avoid them, because System One operates automatically. Occasionally when a decision is very important, then you can stop to think, and you have to recognize that you can stop to think. And then there is still the extra stage, which is a very painful and difficult stage, [where] even if your analysis tells you to do something, you don't want to do it. To impose the discipline of rationality on your desires is extremely difficult. I've just lived something like this in deciding whether or not to write a book: I just want to do it. When I analyze the pros and cons it's absurd for me to do this, but I'm going to do it, I think. So, you know, it's very difficult to impose discipline on yourself.

David, I keep intending to alter the focus of my writing in the way that you suggest, but I've found it hard enough that I haven't been satisfied with the results.

Richard, you might want to think about whether you're responding to your boss's preference for overconfidence.

This post could lead to this site becoming a much more valuable resource.

Back when I was taking a class from Tversky, he made a similar analogy. He noted that psychologists have a huge catalog of ways the human perceptual system is inaccurate. But simply knowing that things look farther away on hazy days doesn't magically make you a better judge of distance. He said it's exactly the same with cognitive biases.

To overcome bias, you need something analogous to rulers and rangefinders.

@ Richard & Tom P.

"I am currently trying to overcome an overconfidence bias in my programming job, specifically that my estimates for time to complete a project are too low."

I solved this problem by starting a prediction market. I suggest you do too. Or at the very least, take your current project plan and run Monte Carlo VaR on it. Some people like Palisades' MC tool, but frankly I've found a market is more accurate.
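
For concreteness, here's a minimal sketch of that kind of schedule Monte Carlo, assuming each task's duration is modeled as a triangular (optimistic / most likely / pessimistic) distribution. The task list and numbers are made up, and this is generic simulation code, not any particular vendor's tool:

```python
import numpy as np

rng = np.random.default_rng(42)

# (optimistic, most likely, pessimistic) duration in days for each
# sequential task -- purely illustrative numbers.
tasks = [(2, 4, 10), (5, 8, 20), (1, 2, 6), (3, 5, 15)]

# Simulate total project duration across many trials.
n_trials = 100_000
total = np.zeros(n_trials)
for lo, mode, hi in tasks:
    total += rng.triangular(lo, mode, hi, size=n_trials)

naive = sum(mode for _, mode, _ in tasks)
print(f"Sum of 'most likely' estimates: {naive} days")
print(f"Median simulated duration:      {np.median(total):.1f} days")
print(f"85th-percentile duration:       {np.percentile(total, 85):.1f} days")
```

The gap between the sum of the "most likely" estimates and the 85th-percentile figure is the overconfidence in question: with right-skewed task distributions, the naive total is an underestimate almost by construction.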

His comment drew no response, possibly because it is true.

But most of the false statements in comments on this blog draw no response, so lack of responses is at best very weak evidence of the truth of a comment.
