"The first principle is that you must not fool yourself — and you are the easiest person to fool." (Richard Feynman)
This blog is called “Overcoming Bias,” and many of you readers consider yourselves “rationalists,” i.e., folks who try harder than usual to overcome your biases. But even if you want to devote yourself to being more honest and accurate, and to avoiding bias, there’s a good reason for you not to present yourself as a “rationalist” in general. The reason is this: you must allocate a very limited budget of rationality.
It seems obvious to me that almost no humans are able to force themselves to see honestly and without substantial bias on all topics. Even for the best of us, the biasing forces in and around us are often much stronger than our will to avoid bias. Because it takes effort to overcome these forces, we must choose our battles, i.e., we must choose where to focus our efforts to attend carefully to avoiding possible biases. I see four key issues:
1. Priorities – You should spend your rationality budget where truth matters most to you. You can’t have it all, so you must decide what matters most. For example, if you care mainly about helping others, and if they mainly rely on you via a particular topic, then you should focus your honesty on that topic. In particular, if you help the world mainly via your plumbing, then you should try to be honest about plumbing. Present yourself to the world as someone who is honest on plumbing, but not necessarily on other things. In this scenario we work together by being honest on different topics. We aren’t “rationalists”; instead, we are each at best “rationalist on X.”
2. Costs – All else equal, it is harder to be honest on more and wider topics, on topics where people tend to have emotional attachments, and on topics close to the key bias issues of the value and morality of you and your associates and rivals. You can reasonably expect to be honest about a wide range of topics that few people care much about, but only on a few narrow topics where many people care lots. The closer you get to dangerous topics, the smaller your focus of honesty can be. You can’t be both a generalist and a rationalist; specialize in something.
3. Contamination – You should try to avoid dependencies between your beliefs on focus topics where you will try to protect your honesty, and the topics where you are prone to bias. Try not to have your opinions on focus topics depend on a belief that you or your associates are especially smart, perceptive, or moral. If you must think on risky topics about people, try to first study other people you don’t care much about. If you must have an opinion on yourself, assume you are like most other people.
4. Incentives – I’m not a big fan of the “study examples of bias and then will yourself to avoid them” approach; it has a place, but gains there seem small compared to changing your environment to improve your incentives. Instead of pulling yourself up by your bootstraps, step onto higher ground. For example, by creating and participating in a prediction market on a topic, you can induce yourself to become more honest on that topic. The more you can create personal direct costs of your dishonesty, the more honest you will become. And if you get paid to work on a certain topic, maybe you should give up on honesty about who, if anyone, should be paid to do that.
So my advice is to choose a focus for your honesty, a narrow enough focus to have a decent chance at achieving honesty. Make your focus more narrow the more dangerous is your focus area. Try to insulate beliefs on your focus topics from beliefs on risky topics like your own value, and try to arrange things so you will be penalized for dishonesty. Don’t present yourself as a “rationalist” who is more honest on all topics, but instead as at best “rationalist on X.”
So, what is your X?