Correcting Biases Once You’ve Identified Them

Most of the discussion on this blog seems to focus on figuring out how to identify biases.  We implicitly assume that this is the hard part; that biases can be really sneaky and hard to ferret out, but that once you’ve identified a bias, correcting it is pretty straightforward and mechanical.  If you’ve figured out that you have a bias that causes you to systematically overestimate the probability of a particular kind of event happening by .2, you simply subtract .2 from future estimates (or whatever).  But it seems to me that actually correcting a bias can be pretty hard even once it’s been identified.  For example, I have a tendency to swing a bit too late at a (slow-pitch) softball.  I’m sure this bias could be at least partially corrected with effort, but it is definitely not simply a matter of saying to myself: "swing .5 seconds sooner than you feel like you should swing."  That just can’t be done in real time without screwing up the other mechanics of the swing.
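
For concreteness, the “straightforward and mechanical” correction imagined above is, in code, just this (a toy sketch; the 0.2 offset and the clamping are illustrative):

    def corrected_probability(raw_estimate, known_bias=0.2):
        # Naive mechanical debiasing: subtract the known overestimate,
        # clamping to the valid probability range [0, 1]. The softball
        # example suggests real corrections are rarely this clean.
        return min(1.0, max(0.0, raw_estimate - known_bias))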

I think this is also a problem for more consequential matters. In real decision-making situations, where there are elements of the problem that need attention besides the (already identified) bias, it is not going to be a trivial matter to fix the bias without screwing up some other part of the problem even worse. I’m not sure this is the right way to put it, but it seems like OB engineering is a separate and important discipline, distinct from OB science.

  • http://goodmorningeconomics.wordpress.com jsalvati

    I totally agree. Sometimes I see myself making a mistake stemming from bias while I’m in the process of making it and still make it.

  • http://users.ox.ac.uk/~ball2568/ Pablo Stafforini

    Worse: identifying a bias can sometimes cause one to be more biased. See Robin’s (superbly titled) post, In Bias, Meta is Max.

    (I wonder, though, whether these findings apply even to those actively and self-consciously trying to overcome bias.)

  • http://www.alexisgallagher.com Alexis Gallagher

    I love your analogy with hitting a softball. It reminds me of my work in teaching theatrical improvisation, which may offer some analogies with learning to overcome bias.

    Good improvisation requires a suite of skills — acting, objectwork, literal listening, subtext listening, etc. Weak improvisers are usually missing one of these skills completely. When you first give someone an exercise that focuses only on their underdeveloped skill, it feels unnatural to them. It upsets their balance, makes them worse at their developed skills, and damages their overall performance. This is because all these skills need to be fairly automatic to work in tandem with each other. But this phase is just growing pains. If people keep focusing on their weak points, the new skills become internalized, get integrated with older skills, and the overall result eventually improves.

    The analogy with messing up your “mechanics” trying to hit a softball is clear. How does this all apply to overcoming bias?

    One obvious parallel is the idea of an exercise focused on a particular skill. This is already a lot like the questions with which Robin ends an article, questions designed to focus your awareness on a particular bias — an underdeveloped skill. This is great. Every bias can have some question or assignment — a bias-sensitization exercise — that develops your awareness of this bias.

    But there are two more parallels here. First, there is the idea that when you first learn a new skill (i.e., a particular bias awareness), you go through a phase where you are awkward and over-conscious in using it and it messes up your other skills (i.e., you see the new bias everywhere, over-weighting it versus other factors). Second, there is the idea that you work through this phase by internalizing the skill, by learning it so thoroughly that it becomes like a reflex.

    Does this translate to biases? That is, if you spend enough time doing a particular bias-sensitization exercise, does that ultimately make you over-sensitive to that bias so that you see it everywhere? Or does that ultimately let you recognize it more easily, more naturally, and then weigh it more correctly vis-a-vis other biases?

    I would bet on the second because I would bet that, like hitting a softball, even explicit reasoning depends a lot on coordinating implicit or automatic reasoning skills. “Judgment” and “intuition” both seem like names for reaching conclusions by intuitively balancing a large number of largely unconscious guidelines and arguments.

  • Richard

    The quick band-aid fix certainly isn’t working for me. I am currently trying to overcome an overconfidence bias in my programming job, specifically that my estimates for time to complete a project are too low. I have tried multiplying the amount of time that “feels” right by 1.5, but that hasn’t helped at all.

    What I suspect is happening is that I know I’m going to multiply by 1.5, but the amount of time in question still feels right, so I multiply by 2/3 first and tell myself that that’s what “feels” right. Then I multiply back out by 1.5 per the plan (2/3 × 1.5 = 1, so the two adjustments cancel exactly) and I’ve accomplished nothing.

    What I want to do at this point is get around that by multiplying by 1.5 a second time, because that would be easy, but I suspect it would be no more useful than doing it the once. Actually correcting the bias will probably require just that: figuring out why the wrong time feels right and changing my thought process so it doesn’t.

  • http://drchip.wordpress.com/ retired urologist

    Thank you, D.J. Balan. You’ve revealed the elephant in the room.

    In the link to Robin Hanson’s previous post given by Pablo Stafforini (above), commenter “John Maxwell” wrote:

    “Without a way to overcome this bias, our other efforts are largely wasted.”

    This understates the problem. Without a way to overcome this bias, overcomingbias.com is actually making people more biased than they were before.

    His comment drew no response, possibly because it is true. I agree with him, and with you. For the most part, each blog post is either defended by its writer, even when egregious bias is revealed by another, or no comment about the identified bias is made, the discussion skipping to more easily defended comments. Perhaps “identifying bias.com” would be more descriptive.

  • Tom P

    Richard – I try to break down projects into small chunks and guess a time commitment for each. I then sum the components and multiply by the “incompetence factor” (for me, 2). This factor takes into account the unforeseen.

    I believe that chunking the project makes it harder for you to secretly multiply by 2/3 before taking your bias into account.
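
    A minimal sketch of this procedure in Python (the chunk estimates are made up for illustration):

        def chunked_estimate(chunk_hours, incompetence_factor=2):
            # Sum the per-chunk guesses, then scale once for the unforeseen.
            # Many small guesses are harder to quietly shade down than one
            # big number you know you are about to multiply.
            return sum(chunk_hours) * incompetence_factor

        print(chunked_estimate([4, 6, 3, 8, 5]))  # 26 hours of guesses -> 52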

  • HH

    Close your stance a little. That is, if you’re a right-handed hitter, rotate around your midpoint such that your left foot is slightly closer to home plate, and your right foot slightly further away. It won’t mess with your mechanics per se, but you will hit the ball at a better angle, and probably pull it more than you did when you swung late.

    I’ll let someone else analogize this to actual biases.

  • Z. M. Davis

    I’ve been rather haunted by “The Bottom Line” lately. Once someone actually points it out, the principle is obvious: decisions can only be justified or not by their true causes. The real-life application, however, is nontrivial–to me, at least. For one thing, life does not come marked with instantaneous points where a “decision” happens. You could say that a decision happens every “moment”–whatever it is we mean by moment. When we specify a decision in words, we could be talking about the timespan between when one first has the idea and when one actually signs the paperwork (or whatever)–more than enough time to engage in rationalization. Also, people don’t come with flight recorders: human memory and human introspection and personal artifacts aren’t enough to pin down exactly what motives were moving oneself at what time. Furthermore, there’s the possibility of genuinely having different values before and after the decision, perhaps even as an effect of some of the decision’s consequences.

    In my case, I’m happy with my decision, but my virtue as a rationalist is computationally intractable.

  • http://rhollerith.com/blog Richard Hollerith

    “His comment drew no response, possibly because it is true.”

    But most of the false statements in comments on this blog draw no response, so lack of responses is at best very weak evidence of the truth of a comment.

  • http://www.iphonefreak.com frelkins

    @ Richard & Tom P.

    “I am currently trying to overcome an overconfidence bias in my programming job, specifically that my estimates for time to complete a project are too low.”

    I solved this problem by starting a prediction market. I suggest you do too. Or at the very least, take your current project plan and run Monte Carlo VaR on it. Some people like Palisades’ MC tool, but frankly I’ve found a market is more accurate.
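
    For the Monte Carlo option, a minimal sketch of the idea (the task list, hour ranges, and triangular distribution are illustrative assumptions, not anyone’s actual plan or tool):

        import random

        # Hypothetical tasks: (name, optimistic hours, pessimistic hours).
        tasks = [("design", 8, 24), ("build", 20, 80), ("test", 10, 40)]

        def simulate_once():
            # Draw each task from a triangular distribution with the mode
            # near the optimistic end, leaving a long right tail for the
            # unforeseen, and sum to get one possible project duration.
            return sum(random.triangular(lo, hi, lo + 0.25 * (hi - lo))
                       for _, lo, hi in tasks)

        runs = sorted(simulate_once() for _ in range(10000))
        print("median hours:", round(runs[5000]))
        print("90th-percentile hours (the VaR-flavored figure):", round(runs[9000]))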

  • Kevin Dick

    Back when I was taking a class from Tversky, he made a similar analogy. He noted that psychologists have a huge catalog of ways the human perceptual system is inaccurate. But simply knowing that things look farther away on hazy days doesn’t magically make you a better judge of distance. He said it’s exactly the same with cognitive biases.

    To overcome bias, you need something analogous to rulers and rangefinders.

  • sonic

    This post could lead to this site becoming a much more valuable resource.

  • http://profile.typekey.com/bayesian/ Peter McCluskey

    David, I keep intending to alter the focus of my writing in the way that you suggest, but I’ve found it hard enough that I haven’t been satisfied with the results.

    Richard, you might want to think about whether you’re responding to your boss’s preference for overconfidence.

  • http://users.ox.ac.uk/~ball2568/ Pablo Stafforini

    “Back when I was taking a class from Tversky, he made a similar analogy. He noted that psychologists have a huge catalog of ways the human perceptual system is inaccurate. But simply knowing that things look farther away on hazy days doesn’t magically make you a better judge of distance. He said it’s exactly the same with cognitive biases.”

    Daniel Kahneman, Tversky’s long-time collaborator, made a similar point in a recent interview:

    Does your research offer insights to individuals with regard to how they might change their own behavior?

    Yes, potentially it does, but they are not insights that are very easy to use. You can be aware of mistakes you make, [but] it’s very difficult to learn to avoid them, because System One operates automatically. Occasionally when a decision is very important, then you can stop to think, and you have to recognize that you can stop to think. And then there is still the extra stage, which is a very painful and difficult stage, [where] even if your analysis tells you to do something, you don’t want to do it. To impose the discipline of rationality on your desires is extremely difficult. I’ve just lived something like this in deciding whether or not to write a book: I just want to do it. When I analyze the pros and cons it’s absurd for me to do this, but I’m going to do it, I think. So, you know, it’s very difficult to impose discipline on yourself.

  • http://users.ox.ac.uk/~ball2568/ Pablo Stafforini

    And then there’s Jon Elster (Explaining Social Behavior (Cambridge, 2007), p. 128, n. 7):

    Knowing that one may be subject to bias is one thing; being able to correct it is another. Studies show that deliberate attempts to debias one’s judgment are of little value, since one easily falls into the traps of insufficient correction, unnecessary correction, or overcorrection. One may learn to distrust one’s judgment, but it is harder to improve it. If one were able to, there might be no need to.

  • John Maxwell

    >But most of the false statements in comments on this blog draw no response, so lack of responses is at best very weak evidence of the truth of a comment.

    That’s why we need a better commenting system like Disqus so that individual threads will be easier to keep track of, thus encouraging replies.

    By the way, I’m not the same John Maxwell that wrote the comment retired urologist quoted. I think it was my dad: he also reads this blog, and we both have the same name.

  • Daniel Griffin

    So, right here, as pointed out by the latter John Maxwell, is perhaps a bias that we might correct. We might employ technology to make ourselves better.

  • David חץ Balan

    The various quotes make it clear that I’m not the first one to have had something like this insight. But I do think my point is slightly different. It’s not just that already-identified biases are hard to overcome because of the direct difficulty of overcoming them, but additionally that they’re hard to overcome because the nature of the task you’re doing may be such that you can’t simultaneously worry about correcting bias and competently perform the other elements of the task.

    HH, wouldn’t closing my stance make me pull it less, not more (I’m a rightie)?

  • David J. Balan

    The above comment is me. I’m commenting from my brother’s computer in Israel and that’s screwing things up a bit.

  • http://www.fungible.com Tim Freeman

    frelkins said:

    “Or at the very least, take your current project plan and run Monte Carlo VaR on it. Some people like Palisades’ MC tool, but frankly I’ve found a market is more accurate.”

    I had to do a little work to unpack the acronyms and typos there. VaR is “Value at Risk”. See http://en.wikipedia.org/wiki/Value_at_risk. “Palisades'” should have been “Palisade’s”, since there are semi-relevant companies named both Palisade and Palisades. The URL is http://www.palisade.com/, not http://www.palisadesresearch.com/. “MC” is Monte Carlo.

  • http://www.alexisgallagher.com Alexis Gallagher

    David said:

    “But I do think my point is slightly different. It’s … that already-identified biases are hard to overcome because … the nature of the task you’re doing may be such that you can’t simultaneously worry about correcting bias and competently perform the other elements of the task.”

    My comment addressed the specific point you are making — that focusing on avoiding a particular bias disrupts the necessary balance with other “elements of the task”, such as other kinds of bias-avoidance and the primary reasoning task. I suggested that this problem is normal and maybe the solution is normal. In many learning situations, a new skill always disrupts old skills until it is internalized. After internalization, it can be integrated more correctly with old skills because it requires less conscious attention or, as you put it, you’re not “worrying about correcting bias” simultaneously with performing the task.

    So my hypothesis is that the problem is solved via conscious exercises focused on bias-awareness leading to unconscious, internalized awareness of that bias. Is this true? One way to test this, as I said, is to consider whether repeated efforts to sensitize yourself to a bias have generally resulted in oversensitivity or in a more balanced weighting of that bias. No one followed up on my comment, but this still strikes me as an empirical question that moves the issue forward.

    Personally, it *has* been true for me as regards certain kinds of hindsight bias. Has it been true for you in any cases you remember? Has the opposite been true?

  • HH

    “HH, wouldn’t closing my stance make me pull it less not more (I’m a rightie)?”

    It depends on your upper body position once you adjust the footwork. Most people who change their stance don’t adjust their upper body position: your upper body remains perpendicular to the front edge of home plate, even if your feet are at a more closed angle. This means that in your stance, you’ve already “started” your swing, and the bat has a shorter path to the ball. Your swing basically gets faster, and you pull the ball more. [I’ve seen people lose power doing this, but I doubt that’ll happen to you if you’re already a push hitter.] If you’re the type of person who would rotate their upper body as well [facing home plate at the same angle as your feet], then you’d probably pull the ball less.

    In fact, thinking about the stance has made me think about how to successfully correct bias. Presumably, when we identify a bias, we can also identify the optimal outcomes we would reach in the absence of said bias. In the batting case, that would mean pulling the ball more. Changing the stance is essentially finding a way to bring about the desired outcome whilst still giving in to the bias: you can swing late and still achieve the desired outcome. I think commitment devices and other early actions [“stance changes”] that produce the desired outcome regardless of whether the bias is corrected or not are probably a lower-cost and more realistic way for most people to reduce the effects of bias. In effect, tying yourself to the mast is probably more likely to succeed than learning to ignore the sirens’ song. …As long as you can be unbiased enough to tie yourself to the mast.

    That said, I have to agree with Alexis: one can train oneself to avoid biases, though at first it’s difficult and it takes time. It’s like driving a stickshift: it owns your whole brain at first, but after a while it’s second nature. [I now notice false dichotomies that people use almost instinctively.]

  • David J. Balan

    Alexis and HH, Good points. And HH, thanks for the advice!

  • http://bizop.ca michael webster

    Pretty interesting observation.

    There are basically two ways to deal with an observed bias. First, try to be aware of it and counter it. But I agree with the late Tversky that this is a lot easier said than done.

    The second method is simply to avoid making those decisions that you have reason to believe that your biases are too great to overcome.

    For example, suppose I tend to run red lights under certain circumstances causing a danger to myself and others. I could try to pay more attention to spotting red lights, or I could just stop driving in those circumstances.

    Our entire due diligence law, protecting consumers from fraud, assumes that only the first method is valid: viz., giving investors, purchasers, or consumers more information or knowledge to counter their biases. Many times, the second option is better – just don’t make the kinds of decisions that you know you are lousy at.
