Overcoming Bias: Hobby, Virtue, or Moral Obligation?

I can think of three reasons why somebody should try to get better at overcoming bias:

1. OB is a hobby: It should be pursued by people who enjoy it and/or who think it will pay off somehow.

2. OB is a virtue: It is part of a well-lived life (for a bit of this, see the preface to Facing Up by the physicist Steven Weinberg).

3. OB is a moral obligation: You should do it because it will cause you to do more good and less evil.

A recent post by Robin (responding to earlier remarks by Tyler Cowen) caused me to leave the following comment:

It sounds like you’re arguing here that overcoming bias is something like learning a martial art; whether or not you should do it depends mostly on whether it suits you, but it might also come in handy someday. But isn’t the point of this blog that OB is a positive virtue that *everyone* should seek to acquire?

To which Robin replied:

David, we should distinguish a weak claim, that given our ends we are reasonable to join together here to overcome bias, from a strong claim, that everyone should do so. The weak claim can be true even if the strong claim is not.

I’m not sure if he means that #1 alone could in principle be sufficient reason for us to join together on this blog (which is surely true), or if he means that it is in fact his sole reason. I certainly buy into Reason #2 for myself, and I buy into it as being prescriptive for other people as well, but only to a limited extent (basically the extent to which I might try to convince someone of the obviously true fact that being a Yankee fan is a virtue and that being a Red Sox fan is a vice and that this is not a matter of taste; it’s a pity if they don’t get it, but it would be wrong for me to cram it down their throats). But the big one is #3. Most of the evil in the world is done directly in the service of irrationality, and much of the rest (i.e., evil that is done for rationally selfish reasons) uses some kind of irrationality as a cover story. So I’m not sure about Robin, but I think that OB is something that everyone should be encouraged to do a lot of, though not necessarily to the maximum possible extent (like everything else, OB is subject to diminishing marginal returns and, as Tyler pointed out in the original post, the marginal returns might eventually become negative, though I think this is true a lot less often than he seems to think it is).

  • David J. Balan

    I just realized that I linked to the paperback version of the Weinberg book, whereas I have the hardback. So the reference in the post is to the preface to the hardback, which may or may not be the same as the one to the paperback.

  • Aaron Haspel

    I don’t understand the distinction between #2 and #3. Aren’t virtues virtues precisely because they cause one to do more good and less evil?

  • Robin Hanson

    It turns out to be an open question whether learning ethics makes you do more good. If so, wouldn’t it also be unclear whether being less biased makes you do more good?

  • Constant

    Overcoming bias is a derived end. The chief end is overcoming falsehood. All the reasons mentioned above and elsewhere pro and con overcoming bias apply to overcoming falsehood, and their application to overcoming falsehood is the reason for their application to overcoming bias.

    Overcoming falsehood is the same as pursuing truth, so the question is, why, or why not, pursue truth.

    The question, however, is odd. Everyone pursues truth. Even those who are woefully wrong believe they are right, and evidently pursue truth (though, of course, they err – but they are not intentionally erring). Asking why pursue truth is like asking why breathe.

    Suppose we had a long discussion about the pros and cons of breathing. That would, I think, strike most observers as odd. “Are these people seriously considering the cessation of breathing?” – onlookers might ask.

    We can’t really believe what we don’t believe. If we don’t believe something, then we don’t believe it. The logical difficulty of believing what you don’t believe stands in the way of doing anything other than pursuing truth. It’s not really an option. Maybe it’s less of an option than not breathing, since there are ways to kill yourself by drowning, but it’s not clear how one would even begin to believe what one does not believe. Maybe one could get oneself to, in the future, believe what one does not believe in the present. That is at least logically possible. But how could one sustain a current belief in something in which one does not believe? Such a difficult feat would seem to be necessary in order to pursue untruth.

  • Doug S.

    It’s not that difficult to believe something that, on another level, you think is false. The human brain doesn’t have a particularly good internal consistency checker; it’s easy to end up believing a logical contradiction.

  • Nick Tarleton

    “The question, however, is odd. Everyone pursues truth.”

    I’m sure most people (not necessarily on this blog, though!) can think of some circumstance under which they’d rather have a false belief, or remain ignorant, in order to be happier. This suggests that maybe, for them, finding truth is a subgoal of being happy (or moral, or something else). Breathing, anyway, isn’t an end in itself, but a subgoal of staying alive (which may be an end, or a subgoal of happiness/morality/…).

  • Constant

    “It’s not that difficult to believe something that, on another level, you think is false. The human brain doesn’t have a particularly good internal consistency checker; it’s easy to end up believing a logical contradiction.”

    But that is not the same thing. You can simultaneously believe two contradictory things. What is rather harder to do is to believe something and at the same time not believe it. A and not-A is not that easy to do, since not-A is the absence of A.

    The case where you believe X and Y, where X and Y are mutually contradictory things, is only a special case of the pursuit of truth. Here your pursuit of truth is merely forked. You are of two minds about something, and each of your minds pulls you in its direction. You still, in each of your minds, believe what you believe.

    Let’s look at the matter more closely. Suppose that you pursue truth – suppose that you yearn deeply for truth. And yet, suppose that you have two contradictory beliefs. Well, one approach – but only one approach – is to compare the two beliefs and eliminate one belief. Another approach is to preserve each of the two beliefs. You preserve the first belief because – well, obviously, because you believe it’s true! And you pursue truth. So, because you pursue truth, you preserve the belief which you believe. And the same with the other belief. And you may find the possibility of eliminating one of your two beliefs closed off to you anyway, because despite the fact that it contradicts your other belief, you still believe it too. And since you believe it, you find yourself unable to discard it. Why unable? Because you pursue truth, you can’t help pursuing truth.

    Believing two contradictory things, then, does not constitute an exception to the rule that we pursue truth.

    “I’m sure most people (not necessarily on this blog, though!) can think of some circumstance under which they’d rather have a false belief, or remain ignorant, in order to be happier.”

    But it is one thing to want to have a false belief; it is another to manage it. We are compelled to pursue truth. We don’t have much in the way of a mechanism to intentionally bring about belief which we believe to be false. This is not to say that we do not ever manage to fall under the spell of pleasant delusions – only that for the most part this happens without our really being aware of what’s happening. If we become aware, it tends to break the spell, much as magic tricks suddenly seem mundane and even a cheat once we know the secret, their spell having been broken.

    If you walk in and discover that your wife is cheating on you with another man, the extreme unpleasantness of the thought is not usually sufficient to wipe it from your mind, nor would most of us find it possible to wipe it from our minds intentionally. So our capacity for self-delusion is strictly limited.

    “This suggests that maybe, for them, finding truth is a subgoal of being happy”

    Well, believing what we believe is simply logically inescapable. But moreover, biologically, we are built to have veridical perceptions, even when they are unpleasant. While we may manage to believe falsehoods sometimes and on some level, we have little choice but to be without illusion about a great many things. In many cases, learning about something is sufficient to trap us into being aware of it, whether we like it or not.

  • David J. Balan

    Aaron, I think there is a distinction between virtue and morality (at least as I understand those words; it may be that philosophers have precise definitions that I am unaware of). To illustrate with an example, imagine someone who has been diagnosed with a terminal illness. The fortitude with which that person faces the prospect of his/her own death doesn’t really have any moral implications (leave aside the effect on family members and the like). Falling apart wouldn’t make you an immoral person, but I still think being courageous is better than not being courageous; it is a goal that I would actively try to achieve for myself, and one that I admire in others. That’s what I mean by virtue as distinct from morality.

    Robin, This is of course the kind of thing that it is very hard to get unambiguous evidence on, but I stand by the statement in the post: almost everything bad is done in the service of irrationality. Do you disagree?

  • Doug S.

    Isn’t one theory that people develop that irrational “cover story” in the hope of presenting themselves favorably to others? It’s the metaphor of the conscious mind as the PR department for the primarily unconscious decision making mechanism. In other words, they’re trying to come up with justifications for socially unacceptable but rationally selfish behavior.

    Saying “I stole that from my neighbor because I wanted it and I had the power to take it” is less likely to be socially useful than saying “I stole that from my neighbor because he deserved to be stolen from.”

    I could easily offer a justification for why I illegally download copyrighted music files, but the primary reason I download them illegally is because I want them, it’s easy for me to do, and I don’t want to pay for them.

  • Michael Sullivan

    Doug S. writes: “Isn’t one theory that people develop that irrational ‘cover story’ in the hope of presenting themselves favorably to others? It’s the metaphor of the conscious mind as the PR department for the primarily unconscious decision making mechanism. In other words, they’re trying to come up with justifications for socially unacceptable but rationally selfish behavior.”

    I think the point of a general moral imperative to overcome bias is that if nobody buys your cover story, then it is not useful as a signal, which should decrease the amount of rational self-serving behavior that is evil[*] from those who would otherwise be able to come up with usable (but irrational) covers.

    [*] Obviously, rational self-serving behavior which is not evil is not a moral problem, and presumably would not incur significant social opprobrium in a sufficiently unbiased society.