Bias Is A Red Queen Game

It takes all the running you can do, to keep in the same place. The Red Queen.

In my last post I said that as “you must allocate a very limited budget of rationality”, we “must choose where to focus our efforts to attend carefully to avoiding possible biases.” Some objected, seeing the task of overcoming bias as like lifting weights to build muscles. Scott Alexander compared it to developing habits of good posture and lucid dreaming:

If I can train myself to use proper aikido styles of movement even when I’m doing something stupid like opening a door, my body will become so used to them that they will be the style I default to when my mind is otherwise occupied. .. Lucid dreamers offer some techniques for realizing you’re in a dream, and suggest you practice them even when you are awake, especially when you are awake. The goal is to make them so natural that you could (and literally will) do them in your sleep. (more)

One might also compare with habits like brushing your teeth regularly, or checking that your fly isn’t unzipped. There are indeed many possible good habits, and some related to rationality. And I encourage you all to develop good habits.

What I object to is letting yourself think that you have sufficiently overcome bias by collecting a few good mental habits. My reason: the task of overcoming bias is a Red Queen game, i.e., one against a smart, capable, and determined rival, not a simple dumb obstacle.

There are few smart determined enemies trying to dirty your teeth, pull your fly down, mess your posture, weaken your muscles, or keep you unaware that you are dreaming. Nature sometimes happens to block your way in such cases, but because it isn’t trying hard to do so, it takes only modest effort to overcome such obstacles. And as these problems are relatively simple and easy, an effective strategy to deal with them doesn’t have to take much context into account.

By contrast, consider the example of trying to invest to beat the stock market. In that case, it isn’t enough to just be reasonably smart and attentive, and to sidestep simple biases, such as by not trading when very emotional. When you speculate in stocks, you are betting against other speculators, and so can only expect to win if you are better than they are. If you can’t reasonably expect to have better info and analysis than the average person on the other side of your trades, you shouldn’t bet at all, but instead just take the average stock return by investing in index funds.

Trying to beat the stock market is a Red Queen game against a smart determined opponent who is quite plausibly more capable than you. Other examples of Red Queen games are poker, and most competitive contests, like trying to win at sports, music, etc. The more competitive a contest, the more energy and attention you have to put in to have a chance at winning, and the more you have to specialize to have a decent chance. You can’t just develop good general athletic habits to win at all sports; you have to pick the focus sport where you are going to try to win. And for all the non-focus sports, you might play them for fun sometimes, but you shouldn’t expect to win against the best.

Overcoming bias is also a Red Queen game. Your mind was built to be hypocritical, with more conscious parts of your mind sincerely believing that they are unbiased, and other less conscious parts systematically distorting those beliefs, in order to achieve the many functional benefits of hypocrisy. This capacity for hypocrisy evolved in the context of conscious minds being aware of bias in others, suspecting it in themselves, and often sincerely trying to overcome such bias. Unconscious minds evolved many effective strategies to thwart such attempts, and they usually handily win such conflicts.

Given this legacy, it is hard to see how your particular conscious mind has much of a chance at all. So if you are going to create a fighting chance, you will need to try very hard. And this trying hard should include focusing a lot, so you can realize gains from specialization. Just as you’d need to pay close attention and focus well to have much of a chance at beating the hedge funds and well-informed expert speculators who you compete with in stock markets.

In stock markets, the reference point for “good enough” is set by the option to just take the average via an index fund. If using your own judgement will do worse than an index fund, you might as well just take that fund. In overcoming bias, a reference point is set by the option to just accept the estimates of others who are also trying to overcome bias, but who focus on that particular topic.

Yes you might do better than you otherwise would have if you use a few good habits of rationality. But doing a bit better in a Red Queen game is like bringing a knife to a gunfight. If those good habits make you think “I’m a rationalist,” you might think too highly of yourself, and be reluctant to just take the simple option of relying on the estimates of others who try to overcome their biases and focus on those particular topics. After all, refusing to defer to others is one of our most common biases.

Remember that the processes inside you that bias your beliefs are many, varied, subtle, and complex. They express themselves in different ways on different topics. It is far from sufficient to learn a few simple generic tricks that avoid a few simple symptoms of bias. Your opponent is putting a lot more work into it than that, and you will need to do so as well if you are to have much of a chance. When you play a Red Queen game, go hard or go home.

  • Brent Dill

    The fear there, though, is that the HARDEST bias to overcome is deciding whom to trust. When multiple competing groups claim to be attempting to overcome bias, and are attempting to signal that they focus on a particular area and that THEIR interpretation of a particular phenomenon is accurate, it is very difficult to distinguish which of them are, in fact, unbiased without devoting the same level of focus and intelligence yourself (whether or not you even possess it to devote). What, then, is to be done?

    • Take an average of their opinions, or have no opinion.

      • aretae

        What do you do in cases where you strongly suspect strong bias? I’m not a religious scholar, but I’m smart, and haven’t found anything sane on God, compared to Hume or some such. Or Kling’s suspicion of macro- and climate modeling as a method, where everyone supporting the position supports the method? Still average in the opinions?

      • IMASBA

        Yes, but just average correctly. If you are standing on a barren piece of desert, and one expert thinks the enemy army will attack the town 100km to the west of you while the other thinks they will attack the town 100km to the east, you have to average that as a 50% chance of an attack on the town to the west and a 50% chance of an attack on the town to the east, not as a 100% chance of an attack on your position. I know it appears ridiculously easy to average the right way in this example, but in analogous problems people often average the wrong way; clear examples are those political compromises that take the worst of both worlds and make no one happy.
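        A minimal numeric sketch of the right and wrong averages (the coordinates are made up, just to illustrate the comment above):

```python
# Two experts forecast where the enemy will attack, as positions on a line:
# -100 km (town to the west) and +100 km (town to the east); you stand at 0.
expert_forecasts = [-100, +100]

# Wrong way: average the point estimates. This yields 0 km, i.e. a 100%
# implied attack on your own position, a place neither expert named.
wrong = sum(expert_forecasts) / len(expert_forecasts)

# Right way: average the probability distributions, yielding a mixture:
# 50% west town, 50% east town, 0% your position.
mixture = {pos: 1 / len(expert_forecasts) for pos in expert_forecasts}

print(wrong)    # 0.0
print(mixture)  # {-100: 0.5, 100: 0.5}
```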

  • Vladimir Nesov

    The problem with thinking too highly of yourself, of trusting your own judgement too much, is straightforward to overcome with a general habit: you recalibrate. This is possible at least on the conscious level, for estimates you bother to consider carefully (e.g. write down your estimate, then adjust it using a fixed procedure based on past performance, even if you don’t believe the result of the correction, etc.). At that point, you start trusting yourself too little more often. You won’t become any better at distinguishing good accuracy from poor accuracy in your own judgement, but you’ll fix the systematic miscalibration problem. This particular issue is not a competition, so it’s a bad example for the post.
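    One possible “fixed procedure” of the kind described above can be sketched as linear shrinkage toward 50%, scaled by past overconfidence; the function and all numbers here are illustrative assumptions, not from the comment:

```python
def recalibrate(stated, past_stated=0.9, past_actual=0.7):
    """Shrink a stated probability toward 0.5, scaled by historically
    observed overconfidence: e.g. if claims you rated 90% confident came
    true only 70% of the time, halve your distance from 50%."""
    if past_stated == 0.5:
        return stated  # no historical signal to calibrate against
    shrink = (past_actual - 0.5) / (past_stated - 0.5)
    return 0.5 + (stated - 0.5) * shrink

print(recalibrate(0.9))  # ~0.7: matches the historical hit rate
print(recalibrate(0.8))  # ~0.65
```

Applied mechanically, even when you don’t believe the corrected number, this fixes the systematic miscalibration while leaving your ranking of claims unchanged, which matches the comment’s point.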

    • Miscalibration isn’t the only problem resulting from arrogance.

      • Vladimir Nesov

        It’s the kind of thing that characterizes systematically insufficient reliance on others’ judgement. I agree that it’s futile to personally produce beliefs on many topics that are more accurate than what could be collected elsewhere (for topics that have identifiable epistemically healthy communities focusing on them). In this sense, fixing the issue of calibration in estimating accuracy of others’ info is an example of a straightforward generally applicable good habit, not an example of a futile effort against a varied, subtle and complex bias problem.

      • Peter McCluskey

        Fixing my miscalibration made me rational enough to beat the stock market.

        The Red Queen aspects of the stock market that I see involve getting more information and rapidly analyzing it, not becoming increasingly rational.

        I do put a small amount of continuing effort into avoiding endowment effects, but that isn’t a race against a smart opponent.

      • Sounds like you followed my advice: to pick a few focus areas in which to try seriously to overcome bias.

      • IMASBA

        “Fixing my miscalibration made me rational enough to beat the stock market.”

        Unlikely, it’s vastly more likely that you’re on a lucky streak (which can last for years or be over tomorrow).

        Unless perhaps you own a portfolio that consists exclusively of landmine manufacturers, chemical weapons plants, African arms dealers and miners/oil drillers operating in rainforests, that might stand a chance of beating the stock market…

    • The problem with thinking too highly of yourself, of trusting your own judgement too much, is straightforward to overcome with a general habit: you recalibrate.

      With a simple technique, you claim to have solved the problem of overvaluing one’s own opinion. Surely you no longer overvalue yours! (How broader biases will influence “recalibration” seems too obvious to pursue.)

      This is almost as amazing a “rationalist” claim as EY’s about his becoming a perfect utilitarian.

      • anon

        It seems to me the only problem with relying on calibration is that it doesn’t work without clear feedback. You say that broader biases will influence this process in presumably negative ways, but those ways aren’t obvious to me. Feedback might not solve everything, but it sure does seem to solve a whole lot.

      • One source of bias is the choice of reference class. Essentially, you’re trying to get less biased in situations where you don’t get feedback based on those where you do, which leaves you without feedback about your choice of reference class.

        On a more empirical note, why are LWers intellectually arrogant if recalibration is so handy? (Perhaps you deny the premise.)

      • anon

        I agree LWers are arrogant. I think they fail to recalibrate. But theoretically, recalibration has tremendous potential.

        Thanks for the example, that makes sense.

  • Noumenon72

    Tooth decay bugs are a Red Queen game for your immune system, but not for you — you can prevent decay your whole life by just pushing a toothbrush around. The motivated irrationality system in your brain, similarly, is in a Red Queen game with naturally occurring rationality, but if you can find one tactic it’s not adapted against, it can’t do anything to stop you in this lifetime. All you have to worry about is humans playing a Red Queen game to find new biases to exploit you with.

    • Are you sure unconscious minds can’t share innovations via culture just as conscious minds do? And are there in fact any substantial innovations in ways to avoid unconscious bias?

      • IMASBA

        I’m pretty sure training in, and heavy use of, (Bayesian) statistics qualifies as such an innovation. But of course you can’t always use it: when you do not have enough info, are not sure which variables matter, or it would just take too much time. As a response, perhaps in time unconscious minds will start selecting games where you have to be really fast with your decisions.

      • anon

        Can you provide any examples of unconscious innovations? Culturally specific mechanisms for biases to operate through are not a real thing, as far as I know. The biases themselves are the same; only the subject of the biases is altered cross-culturally.

        Possible counterexample: the way we evaluate time is different than some other cultures. However, I think that this actually shows an innovation in avoiding bias – our culture has overcome the default state where we’re sloppy about labelling time and I think that’s a useful development.

      • Mark

        >> Can you provide any examples of unconscious innovations?

        We pick up others’ mannerisms without conscious intent, and surely we unconsciously adapt or discard them depending on their utility.

      • anon

        This is not an example of a new mechanism. If you learn how to fight the automatic adoption of one set of mannerisms, you have the habits and knowledge needed to fight any other set.

      • If you learn how to fight the automatic adoption of one set of mannerisms, you have the habits and knowledge needed to fight any other set.

        Do you know of such a technique, or is this a stipulation?

      • Noumenon72

        Thank you for taking the time to succinctly show me what I was overlooking instead of just thinking “He doesn’t have a clue” and moving on.

  • Silent Cal

    You say “don’t be a rationalist on your non-focused areas: rely on the estimates of others.”

    I say “be a rationalist on your non-focused areas: rely on the estimates of others.”

    After all, refusing to defer to others is one of our most common biases.

    • You could have said that investing in an index fund is also “speculation”, but of the right sort. In the end it is boring to argue over word definitions.

    • “Rationalists” advise to rely on the estimates of others, but that doesn’t mean that being a “rationalist” inclines you, in the right circumstances and manner, to be heedful of others’ opinions.

      LW (viewed as a social experiment) tends to show that overestimation of, and preoccupation with, techniques and stratagems designed to enhance general rationality dominates over the advice to heed others.

  • almondguy

    To me “overcoming bias” seems very similar to “integrating your mental modules.”

  • anon

    It’s important to note that biases are evolutionarily designed to fight you from the inside, and cannot be overcome entirely. But describing it as a Red Queen game is overly pessimistic. All of the other examples of Red Queen games you provide involve agents competing with other agents, where there are opportunities for intelligent intervention or for copying the strategies of opponents. Biases cannot invent or copy new strategies; the biases you have now are the same biases you will have to fight in 20 years. So your description seems wrong.

    • It’s important to note that biases are evolutionarily designed to fight you from the inside, and cannot be overcome entirely.

      The primary self-deception is the belief that you want to overcome irrationality—denying its contemporary adaptiveness.

      Any human who cares deeply about status, that is any human, deeply wants to be irrational. The striving for status not only depends on exploiting one’s own irrationality, but could not exist were it not for irrational beliefs. (This is in part a belated reply to TheBrett on the usefulness of the fundamental attribution error.)

      Anyone who claims the mantle “rationalist” (that is, who claims to value rationality in general) will be (rationally) suspected of self-justification in some irrationality.

      The instrumental rationality of hypocrisy (and self-deception) makes “rationalism” self-undermining.

      • Carinthium

        Could you clarify please? Why must the striving for status depend upon irrational beliefs? And why must a desire for status depend upon irrational beliefs?

      • Our striving for status is furthered by our irrational beliefs because we can better promote ourselves when we have illusions.

        (I discuss this in “The unity of comprehension and belief and the common failure to grasp opposing arguments” — . I cite to Trivers, but I think it’s also part of homo hypocritus.)

        That our striving for status is constituted by illusions is based on the observation that all strivings seem invariably accompanied by necessary illusions, which I consider near-mode biases.

        (The broadest such illusion is the illusion of free will, which I discuss at .)

        As to status in particular, one necessary near-mode illusion is (I would hypothesize) the belief that one person is in some absolute sense superior to another.

      • anon

        I have Asperger’s, perhaps that’s distorting my view somewhat. I care about status in some ways, but in other ways it seems useless and I’m frustrated by its role in the behavior of others. But, here’s my perspective anyway.

        I think you’re right that most people care about status. But status is only one value that’s competing against many. Most people choose status over rationality, but it’s too much of an exaggeration to say that all of them do. There’s significant variation in how much priority different people give to status concerns; at the least, most people can become more rational than they already are.

        I don’t think that biases are something that evolved especially for status gains. I think they evolved as a rough, heuristic-driven approach to genuine reasoning about the world. We wouldn’t understand nearly as much about the world as we do if our reasoning were so driven by social status concerns. Your argument proves too much: if status exerted such a powerful influence on thought, it wouldn’t be possible for people to become rational even in limited domains, but they clearly can.

        Also, I think your post involves questionable, speculative evolutionary psychology. I disagree with the idea that biased behavior is a major boost to social status or a prerequisite for it. From what I know of psychology, biases have more to do with ego-protection than community deception. People whose views are biased toward their own self-interest are often seen as rude, stupid, or evil. We like to believe that nerds are persecuted, but I expect increased intelligence and rationality actually correlate with increased social status.

    • I don’t think it is at all obvious that unconscious minds can’t copy strategies just as conscious ones do.

      • anon

        I can’t prove a negative. What leads you to speculate that this is the case? Where’s the evidence that causes you to believe unconscious minds are innovating?

        Even if unconscious minds can copy strategies, some strategies are going to be weaker or stronger than others. Also, there shouldn’t be an unlimited number of strategies. I think that there are lists of cognitive biases online which are extremely comprehensive, and that you’d be hard pressed to invent or discover a new form of bias.

        I think we can observe significant differences in general rationality when we look at different people. Do you disagree? I think that preexisting intelligence plays a role in rationality differences but that its role must be limited, because there are people who are highly irrational yet also intelligent. No one is born with detailed knowledge of cognitive biases or how to combat them, yet some people can eventually counter them anyway. Learning must be involved.

        I don’t understand how you view biases as a Red Queen game and think biases adapt but also think that they can be overcome within specific limited domains. Biases aren’t specific to any particular subject areas, they’re general errors in logic. Beating biases in one area makes it easy to beat biases in others. If unconscious biases were adapting we’d expect people to become worse at general rationality as they struggled to even maintain skill in their area of specialization.

        This study describes resisting system 1 and focusing on system 2 as the core and generalizable skill of resisting bias.

        Here are a few other examples from an old LessWrong post.

        At the very least, there are general and reliable ways to fight biases indirectly. Getting an adequate amount of sleep, for example. Or practicing mindfulness meditation. Or structuring one’s environment with incentives to reduce bias and feedback mechanisms.

  • Aaron T

    I think that a lot of bias is just cognitive quirks, rather than reflecting an unconscious motivation – inference is hard, and it wouldn’t be surprising if evolution was unable to do it optimally.

    I agree that signalling considerations might make it hard to think things which are inflammatory to one alliance or another, but I’d be hard-pressed to come up with a way in which base rate neglect is fundamentally serving some social purpose that your unconscious is optimizing for. Unconsciously motivated biases might make you more susceptible to it in some circumstances, but I think that there are many that are just the quirk.

    Getting rid of all bias is really hard, but I don’t find the Red Queen argument convincing that there isn’t substantial progress to be made with less than expert-level mastery.

  • Philon

    “When you speculate in stocks, you are betting against other speculators, and so can only expect to win if you are better than others. If you can’t reasonably expect to have better info and analysis than the average person on the other side of your trades, you shouldn’t bet at all . . . .”

    This understates the difficulty. The market, the sum total of all the other investors, in effect aggregates their information. To expect to win you must think you have better information than the market *as a whole*, not just than the *average* other participant.

    • Ben Southwood

      Hugely important

  • Tom Hynes

    It is hubris to think that you have a strategy that would do worse than an index fund. If you did, just take the opposite of that strategy and, absent transaction costs, you would do better than an index fund.

    • Reclaim our southern children

      Short the market?

    • AspiringRationalist

      With a given average return per year, higher volatility will decrease your long-term returns. Index funds provide diversification by investing in a broad range of assets; concentrating your money in a few stocks doesn’t. So if you pick stocks no better or worse than chance, but don’t diversify well, you will on average underperform index funds.
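      The volatility effect can be illustrated with two hypothetical return streams that share the same average yearly return (the numbers are made up for illustration):

```python
# Two assets with the same arithmetic mean yearly return, +5%,
# but very different volatility.
steady   = [0.05, 0.05]    # +5%, +5%
volatile = [0.50, -0.40]   # +50%, -40%; same arithmetic mean of +5%

def arithmetic_mean(returns):
    return sum(returns) / len(returns)

def growth(returns):
    """Total growth factor from compounding the yearly returns."""
    total = 1.0
    for r in returns:
        total *= 1 + r
    return total

print(arithmetic_mean(steady), arithmetic_mean(volatile))  # 0.05 0.05
print(growth(steady))    # ~1.1025: up about 10% over two years
print(growth(volatile))  # ~0.9: down 10% despite the same average return
```

The compounded (geometric) return lags the arithmetic average by more as volatility rises, which is why concentrated, volatile portfolios with chance-level stock picks tend to trail a diversified index.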

      An even easier way to underperform index funds is to choose investments with high fees, such as actively managed mutual funds.

  • Eliezer Yudkowsky

    This seems tremendously exaggerated, at least relative to my own life experience. I’m having trouble thinking of a single occasion where getting rid of one bias spawned another. Sometimes it produced enough clarity to *notice* another. And it’s true that the rabbit hole goes very deep. But my mind behaves nothing like an intelligent adversary that adapts against me, and nothing like an efficient market in stupidity that has already adjusted for every heuristic that I can manage to invent. Perhaps people are different in this way as in others, and yet I still find this hard to imagine. Can we have some concrete examples of this from your life, please?

    • I don’t think I said that getting rid of biases spawned other biases. But your unconscious adapts all the time to the world around you, and to your conscious thoughts. After all the main way that you adapt to things is via your unconscious adapting.

    • For a concrete example: when trained in physics, I looked down on everyone but those in physics and math. When I learned computer science, I came to appreciate that computer scientists knew a lot. So my biases adapted: I next looked down on everyone who wasn’t technical, where “technical” now included physics and computer people.

    • I’m having trouble thinking of a single occasion where getting rid of one bias spawned another.

      In light of what’s evident, that’s actually a very funny claim.

  • Kenny

    If overcoming biases is truly a Red Queen game, then we *have* to play, if only to keep our biases from overwhelming us.

  • Pingback: The Motte and Bailey Doctrine | διά πέντε / dia pente