Selling Overcoming Bias

I recently met a Brit who studies the psychology of environmentalism.  Quick summary:  To get people to act on environmental ideals, first make those beliefs salient, such as by making salient an in-group identity associated with the environment.  For example, you might remind a Brit that Americans (the out-group) are less environmental than Brits (the in-group).  Second, in such a moment of weakness, get people to declare their intention to take certain specific environmental actions, such as recycling each plastic bottle they use.  Third, remind them of this intention when that action is to be taken, such as by having recycle logos near trash cans. 

Can we apply this to overcoming bias?  While at the most abstract level many people will point out general advantages of bias, once you get to particular topics it is rare to hear someone say approvingly that their beliefs on that particular topic are biased.  So our ideal of overcoming bias has very wide support at this middle level.  But once we get down to the level of the particular thoughts people have at particular moments, their devotion to overcoming bias tends to disappear; they can’t be bothered to actually overcome their biases.

Thus, like environmentalism, overcoming bias is an ideal for which people are more likely to declare moderately abstract support than to display moment-to-moment adherence.  So could mechanisms similar to those used by environmentalists be useful in trying to get people to live up to their anti-bias ideals?  What would be the relevant in-groups, action declarations, and local reminders?

It might seem good that our ideal is almost universally given lip service, but unfortunately this makes it harder to find an out-group that people can identify as against overcoming bias.  Some people think of "science" or "scholars" as the in-group more committed to overcoming bias, but unfortunately this description can often be hard to square with actual academic practice.  What other candidate in or out groups do we have to work with?

What concrete visible action policies could we get people to endorse, where they could clearly see if they were failing to live up to their overcoming bias ideals?  One possible policy is to engage, by listening and responding "enough", to high "enough" status people who disagree with us on important "enough" topics.  Unfortunately, we can avoid the spirit of this policy via our judgments of what is "enough."   

Another possible policy is to always perform statistical tests on any concrete inference, and to draw no conclusions that are statistically "insignificant."  Unfortunately, we usually have enough chances for data mining to make this a rather weak constraint.   Finally, we might follow the policy of accepting the consensus of a betting market, unless we are betting to move that consensus toward our opinion; we should "put up or shut up."  Unfortunately, we would need a lot more betting markets for this to be more than a very weak constraint.
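
To see just how weak a constraint significance testing becomes once data mining is possible, consider the multiple-comparisons arithmetic; a minimal sketch, with an illustrative alpha level and illustrative test counts:

```python
# With k independent tests run at significance level alpha on pure noise,
# the chance of at least one spurious "significant" finding is
# 1 - (1 - alpha)**k, so enough tests all but guarantee a "result".
alpha = 0.05
for k in (1, 5, 20, 100):
    p_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:3d} tests -> P(at least one false positive) = {p_false_positive:.3f}")
```

At twenty tests the chance of a spurious "significant" result already exceeds sixty percent, which is why a policy of drawing only "significant" conclusions constrains little once one can choose what to test.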

  • The obvious in-group would be Bayesians. Local reminders don’t really exist yet. It would be helpful if a graphic designer made some visual depictions of overcoming bias. The banner image used by this website is a great start, but something simpler would be useful for wider dissemination.

  • Tom

    Who is the British researcher? I’d be interested to read their work.

  • Maybe something along the lines of my future warning signs, e.g. cognitive hazard:

    Actually, it would be an interesting challenge to make good symbols or signs for the different biases that would ideally make them simple to understand and apply. Then we could just show them when we think an argument is suffering from the bias. But what would the subadditivity effect or trait ascription bias look like?

  • Hopefully Anonymous

    Robin, I like the pragmatic approach of the Brit scientist you mentioned in the OP. Similarly, one of the advantages of anchoring hierarchy to merit in America is that it makes children of elites work very, very hard: probably significantly harder than children of middle and low-income classes (perhaps an interesting reversal from pre-progressive era child labor norms). However, a possible disadvantage is a sort of gentrification of unbiased positions. Rather than trying to make reduced bias popular (or at least make support of policies derived from unbiased appraisal of how to reduce existential risk popular) some may be more interested in making reduced bias positions a marker of elite status, even if it doesn’t facilitate popular support (maybe even especially if it doesn’t). I’m thinking here of how there’s a populist, anti-elitist element to being skeptical of global warming theories, pro-war in Iraq, etc. I think our goal regarding determining policies to reduce existential risk in an unbiased way should be to win, not to feel a sense of hierarchical fulfillment of being a minority of the larger population that is smarter about these things.

    So, perhaps when identifying the in-group that one wants to adopt a particular policy, we should make sure the ingroup (or various ingroups) are hegemonic, rather than merely elite.

  • Doug S.

    Hmmm… for an outgroup, members of any religion “we” don’t like could serve perfectly well, as religious faith is almost by definition not rational.

  • Hopefully, yes, a risk of associating overcoming bias with an in-group is to perhaps make the out-group less interested in overcoming bias. The bigger the out-group, the worse is this risk.

    Tom, sorry, all I know is the first name “Anne.”

  • Hopefully Anonymous

    Might be fruitful to run Google searches for “psychology of X”, with X = various isms.

  • This sounds like a very hard problem. The point of overcoming bias is to have less stereotyped mental behavior, so the methods that work for establishing simple habits like turning off lights and recycling trash may not be appropriate.

    I realize that some aspects of non-biased thought like having odds for your predictions and doing honest evaluation of how good your predictions were can be made somewhat habitual, but maintaining mental freshness isn’t quite the same thing as most habits.

  • eddie

    You are attempting to instill in people a nonrational bias against nonrational biases.

    How very zen.

  • eddie

    Identifying an out-group is not as important for persuasion as identifying an in-group. So you don’t need to point out the nonrational in order to sell rationality; you just need to show to person X that everyone else who’s like person X is rational. “All your peers are doing it!” You can sell rationality to everyone this way without having any nonrational outgroup. “All left-handed people are rational; you’re left-handed, so you should be rational too.” Then make the same pitch to the right-handers. Never mind that the arguments themselves aren’t rational – you’re selling something here.

  • eddie

    I like Anders’ idea of creating iconic images for the various biases we want to eliminate. I also very much like Anders’ warning signs for the future and would love to see what he could do with “bandwagon effect” or “confirmation bias” – or especially the meta-bias “bias blind spot”.

    An effective technique for getting people to do something (aka brainwashing) is to have them start small but progress forward making confirmatory statements along the way. So to brainwash people into becoming less biased – aka “selling overcoming bias” – we need to get people to become less biased in some small but overt way, and encourage them to become more and more unbiased over time.

    I suggest starting a club. Call it “The Independents”. Membership is free, and it gives you some minor but tangible benefits – maybe a discount on a variety of goods and services, like AAA or AARP. But not just anyone can join; you have to be referred by a member, and you have to recite the following pledge: “I am an independent; while others simply follow the crowd, I make my own decisions for my own reasons.”

    We should at all costs avoid pointing out the irony of having a club of independents all reciting the same pledge. It’ll break the magic spell.

    Creating a group with a group name and group identity gives you the in-group effect. Providing a minor benefit gives a small incentive to join the group; making you self-identify as a group member in order to receive the benefit (“Do you offer an Independents discount? Here’s my membership card.”) helps reinforce the individuals’ identity as a group member. Making membership nominally hard-to-get (you have to be referred by a member) makes membership seem more valuable and thus provides a greater incentive to join. Reciting a pledge when you join, and at group meetings, or even as a recognition ritual between any two members, both further cements the group identity and also reinforces the individuals’ belief in the pledge.

    By joining the group you affirm your belief in (and thus reinforce your belief in) opposing the bias of groupthink. That’s the first small step. The Independents should have a hierarchy of ranks or degrees of membership, with each rank requiring a demonstration of rejection of an additional bias, with a corresponding additional pledge to recite. For example, the second step could be the correlation-causation fallacy or the confirmation bias. It’s important that the earlier steps be biases that are easy for people to abandon, i.e. the ones that they don’t use to support their most deeply held or cherished beliefs.

    In essence, the goal is to build a church of rationality. Churches are among the most successful propagators of belief systems. Therefore, if you want people to adopt more rational practices, you should adopt the nonrational methods that churches use to spread practices.

  • The essay proposes a bias for anti-bias. Actually, isn’t this site already promoting just that? Whenever a person tries to answer some question, they can ask themselves, “what would the folks at Overcoming Bias think about how I am approaching this?”

    Incidentally, I love the idea in the banner, but it took me several looks to figure out what was going on. Maybe there is a better rendition of that scene that is easier to take in at a glance.

  • Stuart Armstrong

    Possible outgroup: “those people out there who will take advantage of you”. Harp on the idea that if you are biased, you are letting others grow rich or powerful at your expense. A very diffuse outgroup that can be demonised by everyone, but doesn’t make anyone feel demonised.

    What concrete visible action policies could we get people to endorse, where they could clearly see if they were failing to live up to their overcoming bias ideals?

    Maybe have a little diary, full of their predictions about specific events or verifiable opinions about the world? Then they can check these against reality, and keep a tally. Insisting that they write down every major verifiable opinion they have (or have argued about) would be a way around the “enough” problem.
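
    Such a diary's tally could be kept with a proper scoring rule; a minimal sketch, with hypothetical entries, using the Brier score (the mean squared error of probability forecasts, which rewards honest calibration; lower is better):

```python
# Hypothetical prediction-diary entries:
# (claim, stated probability it happens, outcome: 1 if it happened, else 0)
diary = [
    ("Candidate A wins the election", 0.7, 1),
    ("It rains here on Saturday", 0.9, 0),
    ("The project ships on schedule", 0.4, 1),
]

# Brier score: mean squared error of the probability forecasts.
# 0.0 is perfect; always answering 0.5 scores 0.25.
brier = sum((p - outcome) ** 2 for _, p, outcome in diary) / len(diary)
print(f"Brier score over {len(diary)} predictions: {brier:.2f}")
```

    Here the score works out to 0.42, worse than always saying fifty-fifty, mostly because of the single confident miss; that is the kind of concrete feedback a tally can give.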

  • Robin and Commenters (maybe except eddie),

    Doesn’t your proposed method itself depend on a cognitive bias for any efficacy it might have? And aren’t you and all the commenters so far seen, by your post and their comments (with the possible exception of eddie), to be guilty of trying to induce a cognitive bias, though you think you are doing it to eliminate biases?

    That is, doesn’t the first step in your proposed procedure depend on inducing the myside bias in its various manifestations (confirmation bias, disconfirmation bias, etc.)?

    Doesn’t your second step, taking advantage of a ‘moment of weakness’ and getting people to precommit to a course of action you have tried to trick them into, partake of another bias?

    Would encouraging biases in this way be likely to reduce them?

  • Stuart Armstrong

    We should at all costs avoid pointing out the irony of having a club of independents all reciting the same pledge. It’ll break the magic spell.

    From the “Life of Brian”:
    Brian: You are all individuals!
    Crowd: We are all individuals!
    Brian: You are all different!
    Crowd: We are all different!
    Crowd member: I’m not.

    Though it seems, eddie, you’re more suggesting creating the Church of Freethinkers rather than the Church of Bias Overcome. Despite the difference, I feel we should recruit from the freethinkers – did anyone on this blog not think of themselves as freethinkers at some point?

  • Stuart Armstrong

    Would encouraging biases in this way be likely to reduce them?

    Can’t you teach people to think for themselves? I’m not too sure about the practical aspects of many of the ideas here, but I’d argue that we can use some biases to hook the victims, then encourage them to get rid of many more biases than the ones we exploited. Then, once they’re debiased, we can remind them of the methods we used to hook them, confess the biases we exploited, and help them overcome those too, thus cleansing our initial sin.

    If that sounds pretty religious, it’s not a coincidence – religions too are fond of using worldly tricks to lure their congregations in. At the end, the ideal disciple is immune to the worldly bias that lured him in in the first place.

    So on the practical level, nothing to object to – as long as we are truly certain of the positive effects of debiasing and of the ways to get there. But the accusations of hypocrisy might tarnish the image of “overcoming bias”, so that has to be weighed too. Also, those doing the “converting” might lose sight of the goal as well (again, similarity with religions).

  • Hopefully Anonymous

    Some great contributions and ideas in this thread. Particularly from (but not limited to) Eddie and Stuart Armstrong.

  • Stuart, yes, Eddie seemed to be proposing a freethinker club more than an overcoming bias club, just as you seem to be more proposing a “think for yourself” club. These might be good first steps for some people, but they do embody substantial biases.

    Bruce, yes, it seems dangerous to try to discourage some biases by encouraging other biases.

  • To Robin, Michael, Anders, Hopefully, Nancy, Daublin, Stuart:

    I’m curious about whether you noticed that inducing a cognitive bias was the heart of Robin’s proposed method, and if you noticed it, when did you notice it? I myself took about 5 hours to notice it, of which only a few minutes, widely separated, were spent actually thinking about it, the rest of the time going about my regular business. Unfortunately, I don’t recall how I came to realize it. I have to admit that almost always I have to depend on others to point out to me my cognitive biases: reviewers of papers, people I’m talking with, encountering opposed views and failing to argue effectively against them, etc.

    I wonder if we don’t need more of a medical-type model for overcoming bias, in which each of the 67 or so biases is regarded as a distinct disorder, with its own etiology, predisposing factors, symptoms, treatment, prognosis. I’m not sure how far we will get considering Bias as a single entity, or unitary category.

  • eddie

    You’ll never get anyone to join the Bias Busters Club. But you’ll get people joining the I’m Not Like Everyone Else Club in droves, even if they’re really just opposing whatever they think everyone else thinks instead of actually thinking for themselves. But joining that club is only a small first step (at least it would be if I were running it), and it’s through a series of small steps that large changes in behavior are made.

    I wouldn’t say that my Church of Rationality discourages some biases by encouraging others. Rather, it discourages all biases (eventually) by *exploiting* some biases (initially).

  • One iconic image suggestion, though I’m sure there are more elegant ways of doing it:

    Might be better if you could see a stick figure reading an article with that content. Or something. Just a random image that occurred to me while reading the comments here.

  • I guess I am using a sort of Kantian Categorical Imperative idea: if you don’t think other people should induce cognitive biases in others, you should not yourself induce them in others. Deontological ethics. Only if I think everybody should do it, should I do it.

    But most of you, Stuart especially, also eddie, seem to be more of the consequentialist school. If it works, do it.

    When Robin says it is ‘dangerous’ to do it, I’m guessing he is consequentialist.

    I think a lot of confusion arises in people’s thinking (mine here) because they are not separating questions of value from questions of fact and truth. They don’t put the ‘ought’ part of arguments in a separate category from the ‘is’ part. In this thread there were certain things people said that I simply didn’t understand until I was able to make this separation.

    I have known for many years that I should do this, but I seem to have great difficulty remembering to do it.

  • Sorry to introduce again a contrarian view, but is it certain that overcoming all cognitive biases is really good in the long run for the very person who would achieve it?
    Is it even possible?

  • Hopefully Anonymous

    As a consequentialist/egoist (at least I think I am) I lean against working to overcome all biases as a long run goal. For example, I certainly don’t plan to donate my body to medical science and I don’t plan to be an organ donor. Instead I plan to persist in this body and failing that, to have it cryopreserved to maximize my future revival odds. However, I’m glad that many people do do both and I consider myself to be a happy freerider off of them. In fact, I’d like to convince many more people to donate their bodies and organs, and I’d happily prey off of whatever cognitive bias makes them susceptible to doing so. For example, a belief on faith that they have a soul that persists when their body dies coupled with a desire to emulate Jesus and sacrifice their bodies so that others may suffer less (such as a future me ill and needing an organ transplant). So, good point Kevembuangga!

  • TGGP

    Eliminating all biases sounds like a tough task. Maybe we should identify the worst biases, and use other biases to counter them.

    I share Hopefully Anonymous’ attitudes. Other people being irrational is a problem so far as it leads them to do things that harm me, whereas my own irrationality will more frequently harm me.

  • Hopefully Anonymous, TGGP : …against working to overcome all biases… / …Other people being irrational is a problem…

    I was only talking about eliminating cognitive biases (not ethical/unethical) and for oneself (not for others).

  • TGGP

    Kevembuangga, as an emotivist/Stirnerite I do not believe in the “ethical/unethical” distinction in any sense other than “I like/dislike when X is done”, so to talk about an ethical “bias” seems meaningless to me. The post began by talking about a person who encouraged environmental action in others, and I was talking about that general idea only with regard to bias rather than the environment.

  • TGGP : I do not believe in the “ethical/unethical” distinction in any sense other than “I like/dislike when X is done”

    Neither do I.

    so to talk about an ethical “bias” seems meaningless to me.

    Still it is not meaningless.
    Some (for whatever reason) tend to act “ethically”, that is, with concern about other people’s interests, while some others tend to act with the utmost disregard for others or even for animals, earth ecosystems, whatever.
    Thus this is indeed a bias with respect to their own behavior, regardless of (proper) cognitive biases they may have.
    The “ethical” biases only bring cognitive biases if they are unconscious, but some are fairly aware of their ethical/unethical position.

    I was talking about that general idea only with regard to bias rather than the environment.

    I don’t get what you mean here.

  • Stuart Armstrong

    I’m curious about whether you noticed that inducing a cognitive bias was the heart of Robin’s proposed method, and if you noticed it, when did you notice it?

    I have to admit I noticed it immediately. But that’s because I’ve got many environmentalist friends (and dabble in it myself) so when it was mentioned that environmentalists use similar methods – that screamed “bias! bias! Warning: bias!”.

    But most of you, Stuart especially, also eddie, seem to be more of the consequentialist school. If it works, do it.

    I’m not sure I would go along with my proposals – they make me uncomfortable. But, if we wanted to do this, that would be the way I would do it. After all, religions and environmental movements are very successful at motivating people.

    An example derived from looking at environmentalism in a Kantian imperative way (nicked viciously from the book “Underground economist”): should environmental conferences attempt to be carbon neutral? The author’s argument was that either the money available should be spent on more conferences, or the conference should be cancelled and all the money invested in carbon reduction – whichever is judged the most useful environmental return.

    I feel that “backing both horses” has many practical advantages (publicity-wise, especially – seeing how Al Gore has been criticised for having a big house offset by carbon reductions, imagine what it would be like if it wasn’t offset. Note that the house is irrelevant to the substance of Gore’s point). However it also has the advantage of working in ignorance – if you’re not sure of the environmental impact of a conference compared with carbon reductions, then backing both can be a useful hedge.

    However, even if I was pretty sure that using biases to unbias people would work (taking all the pros and cons into account), I would probably avoid doing it until I was nearly certain – there seems to be a trace of Kantian imperative in me (or a dislike of being a hypocrite, take your pick). A bias, in other words.

    I have known for many years that I should do this, but I seem to have great difficulty remembering to do it.

    Maybe think of “facts about the world” as decided by a hidden FACT-parliament, including many political parties you would disagree with. So if some probable fact offends you, before rejecting it, wonder “maybe this fact was decided by the fascist party in the FACT-parliament” (a lot of the facts about genetic determinism might fit into that category). If something fits very neatly with your opinions, then wonder “did my favourite party really get a majority and get this fact passed? Or was it overruled by the nasty parties, making this ‘fact’ a manifesto promise, nothing more? Or maybe this is a coalition fact – it has some nice aspects, but also other aspects I haven’t noticed.”
    Very silly, but sometimes helps.

    just as you seem to be more proposing a “think for yourself” club. These might be good first steps for some people, but they do embody substantial biases.

    I don’t see it as a “think for yourself” club (though that’s a natural first step). Probably more as a “think for yourself to figure out what you should use to come to the truth” club – including when to defer to the crowd, when to defer to the experts, and when to trust your own opinions. But I admit there is a huge hole – all those biases that we cannot overcome just by being aware of them. Are there any methods for overcoming these biases?

  • Bruce Britton, I didn’t frame Robin’s post as being about installing a bias, and I’m still not sure that’s a good way to think about it.

  • I associate bias with the extreme examples of bias, such as the work of Michael Moore or Ann Coulter. So when I catch my own bias, I think “Stop acting like Moore or Coulter!” (Two people I despise, by the way.) I try to be like individuals that I perceive make legitimate efforts to be balanced in their thought… like Jonathan Rauch or Tyler Cowen. So I suppose my out-group consists of propagandists, talking heads and many politicians, while my in-group is how I imagine an ivory tower academic or ideal journalist might behave.

    For me this approach isn’t just about bias, but more about the honest pursuit of truth more broadly (of which bias is an important part).

    While I have a fairly clear in-group and out-group, I could probably use some tools to remind me of my quest (the analogue of recycling bins). Perhaps I can train myself, when I make claims in certain contexts, to review my claims for evidence, bias, etc. Or I could tattoo “BIAS” on my right wrist and “EVIDENCE” on my left.

  • Stuart Armstrong

    Or I could tattoo “BIAS” on my right wrist and “EVIDENCE” on my left.

    Any political implications in your choice of wrists? 🙂

  • Tom

    Is there a hidden assumption in this discussion that everyone in the world is capable of overcoming bias? Perhaps it is the case that certain attributes (such as low IQ) prevent an individual from ever being able to overcome their biases, no matter how effective the rationalist evangelism.