Don’t Be “Rationalist”

The first principle is that you must not fool yourself — and you are the easiest person to fool. (Richard Feynman)

This blog is called “Overcoming Bias,” and many of you readers consider yourselves “rationalists,” i.e., folks who try harder than usual to overcome your biases. But even if you want to devote yourself to being more honest and accurate, and to avoiding bias, there’s a good reason for you not to present yourself as a “rationalist” in general. The reason is this: you must allocate a very limited budget of rationality.

It seems obvious to me that almost no humans are able to force themselves to see honestly and without substantial bias on all topics. Even for the best of us, the biasing forces in and around us are often much stronger than our will to avoid bias. Because it takes effort to overcome these forces, we must choose our battles, i.e., we must choose where to focus our efforts to avoid possible biases. I see four key issues:

1. Priorities – You should spend your rationality budget where truth matters most to you. You can’t have it all, so you must decide what matters most. For example, if you care mainly about helping others, and if they mainly rely on you regarding a particular topic, then you should focus your honesty on that topic. In particular, if you help the world mainly via your plumbing, then you should try to be honest about plumbing. Present yourself to the world as someone who is honest on plumbing, but not necessarily on other things. In this scenario we work together by being honest on different topics. We aren’t “rationalists”; instead, we are each at best “rationalist on X.”

2. Costs – All else equal, it is harder to be honest on more and wider topics, on topics where people tend to have emotional attachments, and on topics close to the key bias issues of the value and morality of you and your associates and rivals. You can reasonably expect to be honest about a wide range of topics that few people care much about, but only on a few narrow topics where many people care lots. The closer you get to dangerous topics, the smaller your focus of honesty can be. You can’t be both a generalist and a rationalist; specialize in something.

3. Contamination – You should try to avoid dependencies between your beliefs on focus topics where you will try to protect your honesty, and the topics where you are prone to bias. Try not to have your opinions on focus topics depend on a belief that you or your associates are especially smart, perceptive, or moral. If you must think on risky topics about people, try to first study other people you don’t care much about. If you must have an opinion on yourself, assume you are like most other people.

4. Incentives – I’m not a big fan of the “study examples of bias and then will yourself to avoid them” approach; it has a place, but gains there seem small compared to changing your environment to improve your incentives. Instead of pulling yourself up by your bootstraps, step onto higher ground. For example, by creating and participating in a prediction market on a topic, you can induce yourself to become more honest on that topic (see the sketch just below). The more you can create personal direct costs of your dishonesty, the more honest you will become. And if you get paid to work on a certain topic, maybe you should give up on honesty about who, if anyone, should be paid to do that.
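
To illustrate the incentives point, here is a minimal sketch of why betting puts a price on dishonesty. It is my own toy example, not part of the original argument: under a logarithmic proper scoring rule (the payoff structure behind market scoring rules), your expected payoff is maximized by reporting the probability you actually believe.

```python
# Toy illustration (mine, not from the post): with a logarithmic proper scoring
# rule, your expected payoff is highest when your public report equals your
# private belief, so any misreport has a direct personal cost.
import numpy as np

def expected_log_score(belief, report):
    """Expected payoff of reporting `report` when you privately believe `belief`."""
    return belief * np.log(report) + (1 - belief) * np.log(1 - report)

belief = 0.7                             # what you actually think
reports = np.linspace(0.01, 0.99, 99)    # what you could publicly claim
best = reports[np.argmax(expected_log_score(belief, reports))]
print(f"belief = {belief:.2f}, payoff-maximizing report = {best:.2f}")  # 0.70
```

Roughly the same logic is what lets a subsidized prediction market reward traders for moving prices toward their honest beliefs.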

So my advice is to choose a focus for your honesty, a narrow enough focus to have a decent chance at achieving honesty. Make your focus narrower the more dangerous your focus area is. Try to insulate beliefs on your focus topics from beliefs on risky topics like your own value, and try to arrange things so you will be penalized for dishonesty. Don’t present yourself as a “rationalist” who is more honest on all topics, but instead as, at best, “rationalist on X.”

So, what is your X?

  • IMASBA

    Good points, somewhat open doors, but that does not make them less truthful.

  • anon

    You provide zero evidence, direct or indirect, that rationality is a limited resource. Do you expect us to believe you simply because of your credentials, or what?

    My perspective is the opposite of yours. I think that practicing rationality in any one domain makes it easier to practice in others. Striving for rationality builds good mental habits.

    I do agree that “rationalist” as a self-concept is a dangerous one, but not for the reason you describe here. I think it encourages people to overlook their mistakes when they should be jumping on them.

    • contrarian

      Who bears the burden of proof — the one who claims that a resource is constrained, or the one who claims that a resource is entirely unlimited?

      • Silent Cal

        There’s an empirical question here, and we’re not going to resolve it a priori.

        Here’s the limited-resource argument adapted to a case where the empirical facts oppose it:

        Don’t be a ‘weightlifter’. No one can lift all the weight in the world, so you have to pick and choose what weight you lift. Focus on weights that are really important to lift and avoid lifting weights that don’t really matter. Going to the gym and lifting weights that you’ll immediately put down again is the epitome of waste, but you should also avoid unnecessary weightlifting during daily life; for example, take the elevator instead of lifting your body weight up the stairs.

        Robin says rationality is like money-spending; anon says it’s like weightlifting. It’s an empirical question which analogy is accurate.

      • roystgnr

        It’s not a simple empirical question either, since there’s no reason to be certain that short-term and long-term effects of practice are similar.

        Imagine how easy it would be to measure how tired weightlifters are after an individual workout, and conclude “weightlifting makes you much weaker!”

      • IMASBA

        Regardless of whether rationality-attention span is finite, the time you have to think to establish rational ideas is definitely finite, so Robin is right that you cannot just develop rational ideas about everything in a small amount of time.

      • Silent Cal

        True, but I would contend that ‘develop rational ideas about literally everything’ is a straw version of rationalism. Robin seems to be contending that ‘I don’t always form opinions, but when I do, they’re rational’ is also a bad idea.

      • IMASBA

        What I said, and therefore Robin’s assertion as well, does not need the words “literally everything” to hold. You may be able to develop rational ideas on 20 subjects in one year, so then you have to choose which 20 subjects those will be if you find yourself interested in or being asked about more than 20. 20 years later you may have rational ideas about 400 subjects (though of course memory recall is limited and updating of rational ideas does take time), but not more. In practice 20 per year is probably on the high side for most people and 400 would be too many to recall properly for most people.

      • http://overcomingbias.com RobinHanson

        It is obvious that even if there are ways via practice to increase your capacity, you still have a limited capacity and must allocate its use. Just look around and see that there have never been people able to look clearly, honestly, and without bias on all topics. Expect that this description will apply to you as well.

      • Silent Cal

        It’s quite possible but certainly not obvious to me that being rational on one topic reduces one’s capacity to be rational on other topics. The lack of people who are rational on every topic could be due to a limited supply of rationality, or it could be because everyone has Achilles’ heel topics where they’ll never be able to be rational.

      • http://overcomingbias.com RobinHanson

        I’m saying that being rational takes topic-specific *work*. You can’t just be rational in general; you have to work at overcoming biases on particular topics. The effort required to accomplish that work comes in a limited supply.

      • Silent Cal

        I don’t think the ‘rationalist’ label was ever about being an expert in an unrealistically wide range of fields. If you cultivate a habit of (attempted) bias correction, such that a detailed judgment always includes a detailed bias correction and a snap judgment includes a snap bias correction, you can set yourself apart from most people in a meaningful way. It’s the difference between “The leftist side seems correct” and “The leftist side seems correct, but my exposure is mostly through left-leaning media so I’m not sure; I’ll research it further if my opinion on this becomes important.” I think most of us do have the extra seconds for this.

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        I think the issue is getting muddied. (Robin, nonconfrontational guy that he is, tends to moderate his position too much when criticized.)

        Here’s how I would state (what I think) is Robin’s view. Man is an inherently hypocritical being. He can’t be otherwise; and irrationality is hypocrisy’s handmaiden. Perhaps we can become a bit more rational overall, but in general our degree of irrationality is fixed by our human nature. We can choose a few things to be rational about, but, in general, this means being less rational about other matters. We must be careful to funnel our limited rationality into what’s to us most important.

        This far-mode thesis can’t be reduced to some trivial (near-mode) proposition about the amount of time available. (If it is to be explained in near-mode terms, the direction I propose is ego-depletion theory.)

        The difference between Robin and E.Y. isn’t a difference on AI. It’s a difference on human nature! E.Y. once famously proclaimed that he had made himself into a perfect utilitarian (giving his personal interest no more weight than he gives the interests of other humans). I don’t know if he still maintains this, but at the least it provides a useful caricature of his worldview.

      • Silent Cal

        I guess the practical question for me is, if I’m thinking about the latest political controversy during my morning shower and I make the effort to combat my biases, will I be less accurate that afternoon when I predict how long my project will take? My intuition says no, and the fact that no one is perfectly objective about everything is not sufficient evidence to convince me.

        It could be that I’m just not spending my entire rationality budget, so it doesn’t seem limited, whereas Robin is constantly up against the limit. If this is the case, I’d cash it out as ‘the rationalist white belt is generalist (learning to spend the whole budget), the rationalist black belt is specialist (learning to spend it wisely)’. I suspect that white belt training is more appropriate for the overwhelming majority of people.

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        Proceeding only from what I take Robin’s theory to be, I’d answer your practical question that it may not be clear that this one decision limits your rationality, but to the extent that you develop norms to be rational about politics you will sacrifice rationality about work. (Perhaps this helps: when you think about politics, overcoming bias consists in bracketing self-interest, whereas in considering personal investments, one must bracket societal interest.)

        At this level, I don’t think the point is obvious. You are correct to ask for reasons, but I think those lie partly in the whole homo hypocritus theory.

        But this counterintuitive conclusion dovetails with the (also counterintuitive) results of ego-depletion theory. (The classic study is the one where judges fatigued themselves by making decisions, making worse decisions as the day progressed.) (See, with links, “Decision fatigue: its implications for analyzing issues on appeal”: http://tinyurl.com/7lnoxne )

        Speaking confessionally, it seems personally clear that thinking hard about one thing makes it harder to think hard about another. (I am stymied in my efforts to write a book by my compulsion to expend energic resources on thinking about Robin’s posts.)

      • Silent Cal

        Homo hypocritus provides possible reasons rationality might be finite, but what I’m asking for is evidence. Your experiences are something in that direction, which I appreciate, but I can think of alternate interpretations of that data. The key experiment would be to substitute something less demanding of rationality but equally gratifying for thinking about Robin’s posts and see if it improves your writing progress. If not, that would mean the resource OB posts are consuming is not rationality but something else, like productive daylight hours.

        (N.B. I’ve previously encountered reports of the Israeli judge study that interpret it only as showing the effects of hunger on decisions rather than anything about decision fatigue. More ambiguous evidence…)

      • Eliezer Yudkowsky

        Well said, Silent Cal!

        Your rationality budget is like your early resources in a game that depends on harvesting more resources. You should allocate your early, small rationality budget to cases that seem difficult, doable, and where you later get to find out whether you were right or wrong. Picking easy things like “Anti-vaccination advocates are stoopid” won’t improve your muscles, and if you start by trying to solve the hard problem of conscious experience you might not notice when you turn into a crackpot. If you try to trade equities then your success is almost surely noise because you’re basically trying to compete at the Olympic level in a case where there are lots of false positives; this is a worry with highly liquid prediction markets too.

        But betting with your friends on all your short-term resolvable disagreements, indiscriminately, is probably a pretty good idea. That’s the sort of thing rationalists do! (Making this clearly good thing a part of your identity / virtue ethics will help you do it more and so increase your rationality budget.)

      • http://overcomingbias.com RobinHanson

        You might say the same about money. If I tell you to spend your money budget carefully, you might reply that one can invest money to get more money. True, but somewhat beside the point.

        By all means try to increase your rationality budget, but even after you’ve done your best at that, you will still have to face having a limited budget.

      • Silent Cal

        The devil is in the parameters. For an individual managing finances on a short timescale, your money is approximately fixed; how you budget your purchases will be much more important than the few cents on the dollar you might get by investing. In a resource-acquisition-focused game, your investments in more resources can be vastly more important than what you finally buy with them.

      • http://overcomingbias.com RobinHanson

        On reflection, our disagreement here is related to our disagreement on foom. You think AI mostly needs to learn a few general principles of thinking, while I think it mostly needs lots and lots of topic-specific insight. Similarly, you think rationality is mostly about topic-independent principles, while I think avoiding bias is mostly done via topic-specific insights.

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        But no one is saying don’t be rational at all. The claim is that trying to do too much will be injurious where it’s important. This equally applies to weightlifting. Not only is it possible to lift too much, but it is important to make workouts high quality and (somewhat) few in number.

        Whether there’s a resource limitation is, of course, an empirical question. That doesn’t mean it’s a nonobvious question–whether in instrumental rationality or weightlifting.

        “Rationalists” are remarkably resistant to the idea of willpower limitation. Instrumental rationality means attending to what’s worth being rational for–that there are tradeoffs, not mere necessary omissions as E.Y. suggests.

        We can’t be (instrumentally) rational about that much. (This is often forgotten in discussions of the poor, who tend to be credited with the capacity for perfect rationality, which they choose not to use. “Rationalists” are compatibilists. [ http://tinyurl.com/cdl69lk ])

      • Joshua Brulé

        I’d like to take a third option and suggest that rationality is like being, say, a mathematician or physicist. It would be absurd for a mathematician to claim expertise in every area of math, but reasonable to claim expertise in a few areas, plus a familiarity with a majority of the rest.

        Robin’s advice about “spend your rationality budget where truth matters most to you” holds in general, but I don’t see the harm in trying to be rational about other topics as long as one mentally attaches a much higher degree of uncertainty to any opinions formed outside one’s area of expertise.

      • http://overcomingbias.com RobinHanson

        Yes, you can always try, but don’t expect to achieve as much when you aren’t trying as hard.

      • Grant

        Strength training is not unlike money-spending. Your body has a finite capacity to recover from and adapt to lifting weights. You must spend this budget where it is most effective for your uses. If you want to get very strong, lifting very light weights is counter-productive, as is walking up 10 flights of stairs when you could take an elevator.

        Our brains are naturally irrational but we’re trying to make them adapt and become less so. It seems reasonable to suggest that our capacity for this adaptation is finite.

        A more accurate analogy would be ‘don’t be “athletic”‘. There’s just no way we can be good at all sports.

      • IMASBA

        You’re probably right, although when you say “our brains are naturally irrational but we’re trying to make them adapt and become less so”, that may be true for Robin, but I for one am fully aware that things like joy and pleasure are very much dependent on some measure of irrationality. I think most people are too, and would thus consciously create “irrationality conservation reserves” for topics where we do not find it very important to have a rational opinion.

    • HellotoRh

      I think we’re all aware that most of us have limited mental stamina for these things, so it’s important to prioritize. If this is disconfirmed empirically then we’ll all celebrate. However, until then, focus on the important things. As for the naive weightlifting example: I doubt RH is saying do not strengthen the rationality muscle; he is saying that if there are boulders in your way that might destroy you, or that are more important to displace, displace those first and not the useless ones, and if you can’t yet, train for those. After that’s all said and done, we’ll get to the other stuff.

    • http://juridicalcoherence.blogspot.com/ Stephen Diamond

      “Striving for rationality builds good mental habits.”

      The answer straight out of homo hypocritus (although Robin doesn’t say it, and maybe doesn’t even believe it) is that it’s humanly impossible to develop broad rationality habits.

      “I think it encourages people to overlook their mistakes when they should be jumping on them.”

      Largely from the belief that it’s possible come the illusions accompanying the label “rationalist.”

    • http://juridicalcoherence.blogspot.com/ Stephen Diamond

      [Added.] Most importantly: were it possible to develop broad rational habits, nobody would want them! Hypocrisy (hence irrationality) is too useful, in fact, absolutely necessary. (That’s a direct inference from homo hypocritus.)

  • http://wateringgoodseeds.tumblr.com/ Shira Coffee

    Thank you. You stated things that I have felt but have not managed to articulate. In particular: “If you must have an opinion about yourself, assume you are like most other people.” I ought to stitch that on a sampler and hang it over my computer screen!

  • oldoddjobs

    “Try not to have your opinions on focus topics depend on a belief that you or your associates are especially smart, perceptive, or moral.”

    Where’s the fun in that?!

  • http://juridicalcoherence.blogspot.com/ Stephen Diamond

    “So my advice is to choose a focus for your honesty, a narrow enough focus to have a decent chance at achieving honesty.”

    The flaw in this advice is that restriction of focus itself consumes enormous rational resources.

    Instrumental and epistemic rationality involve different principles; instrumental rationality is rather obviously a limited resource. (Those who challenge the contention are probably thinking of epistemic rationality–because it wasn’t clear which Robin is thinking of.) It seems most probable that the limitation on instrumental rationality is the same as the limitation on willpower–perhaps willpower depletion evolved to cause instrumental-rationality limitation on account of the evolutionary benefits of hypocrisy.

    But when someone (like Robin) says he can only be rational about so much, I immediately wonder what area he wants to continue to be irrational about. (Isn’t it fair to ask Robin about his X, since he raised the subject?) I very much wonder whether cryonics falls in his X or within the (vast) purview where he permits himself to be irrational.

    I would define an “intellectual” as one who (claims to) subscribe to a norm of epistemic rationality. Increasing one’s epistemic rationality does require “investing” finite instrumental rationality (read willpower). “Intellectuals” (purportedly) devote a lot of their instrumental rationality to increasing their epistemic rationality–generally. (As a result, they must suffer many practical irrationalities in their lives. A degree of martyrdom for the intellect is necessary.)

    The main tradeoff is between devoting instrumental rationality (willpower) to intellectual or to practical matters. Epistemic rationality is rather analogous to crystallized intelligence (instrumental, to fluid), in that it accumulates, in the form of intellectual habits. It isn’t a limited resource, although it is always finite. The question is how much willpower you devote to accumulating it. The main choice isn’t between fields of thought, which are variously interconnected (as commenters have pointed out), but between practical and theoretical reason–between the goals of being as instrumentally rational as possible and being as epistemically rational as possible.

  • Charlie

    This advice could be simplified, or generalized: “Don’t be an [anything]-ist; don’t stake your personal identity on an ‘-ism.'”

    • Brent Dill

      Bueller?

  • Brent Dill

    Assertion: If your theory is true, before deciding on any other ‘X’, it is important to decide to be “Rationalist on which X to be rationalist about.”

    Given the sheer number of topics to choose from, I doubt that most people have sufficient ‘rationalist’ budget to do this well.

  • Philon

    “You should spend your rationality budget where truth matters most to you.” But there are probably diminishing returns to rationality-effort in any one area. Maybe I, as a plumber, while devoting 80% of my rationality-effort to plumbing, should use the other 20% in miscellaneous other areas, getting more value for my effort even though each of these other areas is less important to me than plumbing. (Or do you think there are increasing returns to area-specific rationality-effort?)

    • http://overcomingbias.com RobinHanson

      Returns are initially increasing, then they decrease. So you must focus to get those initial increases, but then not focus too much.
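
      To see what “returns are initially increasing, then they decrease” implies for allocation, here is a tiny numerical illustration; the S-shaped curve and the numbers are my own toy choices, not anything Robin specified. Concentrating enough effort to clear the steep part of the curve beats both spreading effort thin and piling all of it onto one topic.

      ```python
      # Toy S-shaped returns curve: convex (increasing returns) at low effort,
      # concave (diminishing returns) at high effort.
      def returns(effort):
          return effort**2 / (1 + effort**2)

      # Allocate a total effort budget of 2.0 across equally valued topics.
      print("all on one topic:", returns(2.0))      # 0.80
      print("split over two:  ", 2 * returns(1.0))  # 1.00  <- focus, but not too much
      print("spread over four:", 4 * returns(0.5))  # 0.80
      ```

      Whether real rationality-effort curves look anything like this is, of course, the empirical question Philon is raising.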

  • Silent Cal

    Let me try to reframe the debate by characterizing model-space. That is, specifying what empirical facts are disputed and establishing a mapping from their possible values to behavior prescriptions.

    I think we can take as an axiom that a given person in a given state facing a given question has a limited capacity to be rational on that question (note that this axiom alone explains why no one can ever be rational about every question, so that fact won’t help us decide among models with this axiom). What’s disputed is how being rational on one question affects the capacity to be rational on future questions. I would characterize this with two ‘parameters’: the depletion effect and the growth effect. Also relevant is the supply of questions over time. I’m going to consider models where questions have two properties: how much rationality they require to answer correctly (the difficulty), and how much utility a correct answer yields (the value).

    One logical possibility is that the depletion effect is negligible, either because its magnitude is small or zero, or because its duration is shorter than the time before the next question becomes available. In these models, there’s no downside to using your rationality on a question, so you should always do it (unless you want to gain from hypocrisy), whether or not there’s any growth effect. I’ll call this the ‘swing away’ regime.

    Another possibility is that the growth effect is very large on questions of a certain difficulty, and high-value questions of that difficulty are infrequent relative to the duration of depletion, whereas low-value questions of that difficulty are common. In that case it makes sense to work on lots of low-value questions to maximize growth, in addition to some high-value questions to actually generate utility. This is basically what EY is contending, as I read him.

    Another possibility is that the growth effect is moderate to none, the depletion effect is large and long-lasting and, crucially, high-value questions are frequent. In this case value-difficulty ratio is the most important consideration and you get something like what RH is arguing.

    This isn’t an exhaustive classification of possible models, but I think it reasonably captures what’s been put forward (the formulation in terms of discrete questions is the biggest weakness).

    Now, what empirical evidence do we have that lets us discriminate among models? Or are different individuals in different regimes, in which case this post is true-ish but should be titled “When Not To Be Rationalist”?
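
    To make the model space concrete, here is a toy simulation of these regimes. The payoff rules and every parameter value are my own invention, offered only to show that which policy wins turns on the depletion and growth parameters, not to claim any of these numbers are realistic.

    ```python
    # Toy simulation of the disputed 'parameters' (all numbers hypothetical).
    # Each period one question arrives with a difficulty and a value. Engaging it
    # rationally costs stamina, may grow capacity, and earns the value; skipping earns 0.
    import random

    def simulate(periods, regen, growth, draw_question, engage, cap=10.0, seed=0):
        rng = random.Random(seed)
        stamina, total = cap, 0.0
        for _ in range(periods):
            stamina = min(cap, stamina + regen)   # recover from past depletion
            difficulty, value = draw_question(rng)
            if engage(difficulty, value) and stamina >= difficulty:
                stamina -= difficulty             # depletion effect
                cap += growth * difficulty        # growth effect
                total += value
        return round(total)

    engage_all = lambda d, v: True
    selective = lambda d, v: v >= 5                                    # only high stakes
    mix = lambda rng: (5, 10) if rng.random() < 0.3 else (5, 1)        # frequent high-value
    hard = lambda rng: (20, 100) if rng.random() < 0.02 else (2, 0.1)  # rare, hard, high-value

    # 'Swing away': depletion negligible -> engaging everything wins.
    print("swing-away", simulate(2000, 100, 0.0, mix, engage_all), "vs", simulate(2000, 100, 0.0, mix, selective))
    # 'RH' regime: slow recovery, frequent high-value questions -> selectivity wins.
    print("RH        ", simulate(2000, 1, 0.0, mix, engage_all), "vs", simulate(2000, 1, 0.0, mix, selective))
    # 'EY' regime: cheap practice grows capacity enough to crack rare hard questions,
    # so indiscriminate practice beats early selectivity.
    print("EY        ", simulate(2000, 100, 0.05, hard, engage_all), "vs", simulate(2000, 100, 0.05, hard, selective))
    ```

    None of this says which regime actual humans are in; it just restates the point that the prescription depends on the empirical parameters.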

    • http://overcomingbias.com RobinHanson

      It isn’t clear to me how your framework deals with familiar typical cases where there are gains to continued focus on particular tasks. If you want to be a good doctor, you need to put in a lot of time learning to be a good doctor. It isn’t so much that playing basketball has a depletion or growth effect on doctoring, it is more that spending lots of time and energy learning to play basketball well comes at the cost of time you could have spent learning to doctor well.

      • Silent Cal

        Hmm. Looks like I tried to specify in too much detail. Here’s another shot:

        The ‘swing away’ regime holds if rational thinking is no more costly than biased thinking, and the only constraint is your maximum (growable or not) on how rationally you can think. You still have to budget your thinking, but you should always think as rationally as you can.

        The EY regime holds if thinking rationally depletes a separate persistent resource, but investing in growth consistently returns enough to justify the expenditure including opportunity costs.

        The RH regime holds if thinking rationally depletes a separate persistent resource and justified growth investments are few. Though there’s a continuum from EY to RH.

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        The EY regime offers to make you rational in your practical life at the same time as it makes you a rational intellectual. I think intellectual rationality comes at the cost of practical rationality; so, I reject EY’s claims. (Just to be clear.) [EY himself has sometimes demonstrated a remarkable lack of practical rationality, which agrees with my view.]

        The main empirical evidence for severe willpower limitation has been with us since long before Baumeister did his studies. It is (it seems to me) the only way to explain akrasia. (Thus, the term (and concept) “ego depletion” originated with Freud rather than Baumeister.) (See “Akrasia explained”: http://tinyurl.com/arg4ttq )

        I think Robin is resisting the implications of homo hypocritus by making his claim that rationality is a limited resource a platitude about limited time. The platitude doesn’t support the essay’s title.

      • Silent Cal

        But what evidence is there that intellectual rationality depletes willpower?

        (Also, if it does, this leads to different advice than what Robin gives, since you can buy more rational beliefs at the cost of rational actions and vice versa).

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        The consensus (or at least plurality) view is that willpower is depleted by effort at self-control, which intellectual rationality requires.

        My advice differs from Robin’s, true. I think he neglects that the vast amount of our willpower is spent on practical matters. Even if we style ourselves intellectuals, we must get writing done.

        I also think Robin ignores the interconnectedness of ideas, hence the possibility of specializing one’s rationality (apart from one’s skills). I think Robin would want to say that intellectuals can specialize their rationality and be irrational about (in particular) religion.

  • Rationalist

    The push-back against this line of argument is (a) that it is actually not very difficult to be rational about a specific, narrow topic if you actually want to be; in fact, it is almost automatic if you are competent; and (b) that there are significant gains to be had from learning a rationality lesson in one area of your life and then applying it elsewhere rather than wasting the learning opportunity, for example calibration, overconfidence, noticing your own confusion, etc.

    Do we find that the majority of good plumbers are irrational within their plumbing work? Or that the majority of taxi drivers are irrational about how to drive a taxi? I think not. But if they go to the local betting shop after work and lose most of their earnings on sports betting machines, then perhaps they would benefit from not following Robin’s advice…

    • http://overcomingbias.com RobinHanson

      If the plumber thinks to himself “I’m a rationalist”, he may well be more likely to bet, because he thinks the usual advice to avoid betting is targeted at those irrational people.

      • Rationalist

        Realistically if a plumber has even heard of the word “rationalist” (s)he is probably an unusually smart plumber and will probably bother to keep track of whether (s)he loses or wins money on average, and might even realise that stocks have a better risk/reward profile than sports betting.

        I feel this discussion is very detached from reality. Go to your local gambling venue and you will not find people who have read up on probability theory and rationality and are overconfident. You will find people who don’t even know what the word rationality means and are close to innumerate. Learning a small amount of rationality is probably beyond their mental capability, but for those who could learn about concepts like probability and calibration it would probably be helpful. For example, they might learn to identify the gambler’s fallacy and the inverse gambler’s fallacy.

        Just to disclaim: this comment is not intended to insult people who spend a lot of time and money gambling; I’m just pointing out the reality of the situation. Often a lack of education is to blame for the problems these people go through.
