Discussion about this post

Jack:

I agree with your general premise that people get too rooted in old beliefs. However, devil's advocate: In most cases of practical interest, "evidence" isn't black and white but comes in shades of gray. From a Bayesian perspective it's rational to give more credence to an X that has stood the test of time in your thinking, versus a Y that you learned yesterday. We see this often in science: A new paper makes a compelling case against the big bang or dark matter or whatever, and the rational response is NOT to update one's beliefs immediately, because experience shows that such counterevidence often doesn't stand the test of time.
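To put rough numbers on that intuition (purely illustrative, nothing from the post): the size of the warranted update depends on how likely compelling-looking counterevidence is to appear even when the established theory is true. A minimal Python sketch with made-up probabilities:

```python
# Minimal sketch (hypothetical numbers): why a single surprising paper
# shouldn't move a well-tested belief much, in Bayesian terms.
#
# H = "the established theory (e.g. the big bang) is correct"
# E = "a paper appears presenting compelling-looking counterevidence"
#
# The key quantity is the likelihood ratio P(E | not H) / P(E | H).
# If experience shows that compelling-looking challenges appear fairly
# often even when the established theory is true (because analyses turn
# out to be flawed), that ratio is modest and the update is small.

prior_H = 0.99            # credence built up over decades of evidence
p_E_given_H = 0.10        # flawed-but-compelling challenges are common anyway
p_E_given_not_H = 0.80    # if the theory were wrong, such a paper is likely

posterior_H = (p_E_given_H * prior_H) / (
    p_E_given_H * prior_H + p_E_given_not_H * (1 - prior_H)
)

print(f"prior:     {prior_H:.3f}")
print(f"posterior: {posterior_H:.3f}")  # ~0.925: a real but small downward shift
```

With these numbers the posterior barely moves; the bulk of the shift should wait until the counterevidence itself survives scrutiny.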

Also, the more I observe people, the more I conclude that many people don't believe what they believe because it is correct in a rational sense. Often beliefs are social cues that allow you to align with a certain group, form allies, and so on. Case in point: Murray's statements about race and IQ may be factually correct, but very few academics would believe them, simply because doing so would have strong social consequences. Many of our beliefs have no direct consequences for our wellbeing, so it's perfectly rational to believe in false things if it brings us social benefit. Of course, all your work on prediction markets is addressing precisely this problem: Giving people skin in the game.

Berder:

You are conflating two different claims:

1. Your degree of belief in C, after carefully considering evidence A and B, should not depend on whether you consider A first and B second, or B first and A second. No question about this one; it is sound. (See the numeric sketch after this list.)

2. If you have carefully considered A first but not B, you should adjust your degree of belief in C to be closer to the not-C belief of people who considered B first but not A, without needing to carefully consider B yourself before adjusting your degree of belief.
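A small numeric sketch of (1), with made-up likelihoods. It assumes A and B are conditionally independent given C; order-independence also holds in general, as long as the second update conditions on the first piece of evidence correctly:

```python
# Minimal numeric sketch of claim (1): updating on A then B gives the same
# posterior for C as updating on B then A, provided both pieces of evidence
# are actually incorporated.

prior_C = 0.5
p_A = {True: 0.8, False: 0.3}   # P(A | C), P(A | not C)  -- hypothetical
p_B = {True: 0.2, False: 0.6}   # P(B | C), P(B | not C)  -- hypothetical

def update(prior, likelihoods):
    """One Bayes update of P(C) on a single piece of evidence."""
    num = likelihoods[True] * prior
    return num / (num + likelihoods[False] * (1 - prior))

a_then_b = update(update(prior_C, p_A), p_B)
b_then_a = update(update(prior_C, p_B), p_A)

print(a_then_b, b_then_a)  # identical: the order of consideration doesn't matter
```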

(1) is definitely not equivalent to (2). And (2) is dubious. Should you adjust your degree of belief in the Heaven's Gate cult beliefs to be closer to the beliefs of the Heaven's Gate members? How much should the mere fact that someone else thinks differently from you cause you to change your belief to be like theirs?

You have some direct evidence that your degree of belief in C, based on evidence A, is justified; it makes sense to you. You can understand, specifically, the argument for how A supports C, given all your background knowledge. This makes it strong evidence.

But you don't have the same level of evidence that the degree of belief the other person holds in not-C, based on evidence B, is justified. Perhaps the other person doesn't have rational support for not-C. For example, it's likely that the other person is just repeating doctrine that helps them be accepted by their peers and superiors, because people often do that. You can (if you try) check that you are really believing in C for rational reasons that follow from A. You can't do the same check for the other person, not until you've actually considered evidence B yourself, which would be time-consuming.
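To put hypothetical numbers on this: the evidential weight of the other person's assertion is the likelihood ratio between them asserting not-C when not-C is true and asserting it when C is true. If social pressure would produce the assertion either way, that ratio is near 1 and the warranted update is tiny:

```python
# Minimal sketch (hypothetical numbers) of why the *assertion* of not-C
# carries little weight on its own. What matters is
#   P(they assert not-C | C)  vs.  P(they assert not-C | not-C).
# If the person would mostly assert not-C for social reasons either way,
# the ratio is close to 1 and the rational update is small.

prior_C = 0.9                    # your credence in C after considering A

p_assert_given_not_C = 0.95      # they'd assert not-C if it were true...
p_assert_given_C = 0.85          # ...but nearly as often even if C were true

posterior_C = (p_assert_given_C * prior_C) / (
    p_assert_given_C * prior_C + p_assert_given_not_C * (1 - prior_C)
)

print(f"{prior_C:.3f} -> {posterior_C:.3f}")  # ~0.890: a barely perceptible shift
```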

The exception would be if you do have a strong reason to believe that the person who asserts not-C based on B is at least as rational as you are. For example, if they are a true expert in a demanding field that requires dispassionate rationality and not fashion-following, such as physics or mathematics, and C is a statement in their field.

But for most people, if you are *actively trying* to be rational about C (and think you are succeeding), you have no assurance the other person is applying the same standard. So it is rational to trust your own judgment over theirs. (Or, you can actually consider B yourself; but this may not be practical if B is a large amount of information.)
