When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things. (Bible)
“83% of 5-year-olds think that Santa Claus is real, … ‘Children’s belief in Santa starts when they’re between 3 and 4 years old. It’s very strong when they’re between about 4 and 8,’ she said. ‘Then, at 8 years old is when we start to see the drop-off in belief’” (more)
You should be forgiven for having had only limited critical thinking skills as a child. But that doesn’t excuse you now from correcting your childhood mistakes as best you can. For example, you probably once believed in Santa Claus, but then changed your mind as contrary evidence accumulated. Alas, some of our biggest correctable biases are of this form: childhood mistakes that we don’t correct when older and wiser. Let me explain.
Most of the literature on “rationality” is about finding specific belief patterns that represent reasoning mistakes, helping people to use such patterns to find and fix their mistakes, and thinking about the implications of beliefs that don’t make such mistakes (e.g., the rationality of disagreement). For example, it is a mistake to endorse both A and not A with high confidence. So if you notice this pattern in yourself, you can and should cut your confidence in one or both claims. And it is valid to point out such mistakes in others and to suggest that they may undermine offered arguments.
One very widely accepted rationality principle is that your beliefs should depend on the set of all your evidence, but not on the order in which you learned about it. For example, if you learn A and then B, you should end up with the same beliefs as if you instead learn B and then A. So to a rational legal judge it shouldn’t matter in which order the disputing parties present their evidence.
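To make this order-independence concrete, here is a minimal sketch, with made-up numbers purely for illustration, of Bayesian updating on two pieces of evidence A and B that are assumed conditionally independent given the hypothesis; the posterior comes out the same whichever piece is processed first. (In the general case the second update would use conditional likelihoods like P(B|A,H), and the order-independence still holds.)

```python
# A minimal sketch (hypothetical numbers): Bayesian updating on two pieces of
# evidence A and B, assumed conditionally independent given hypothesis H.
# The posterior P(H | A, B) is the same whichever piece is processed first.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: return P(H | e) from P(H) and the two likelihoods of e."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

prior = 0.5                  # initial belief in H
A = (0.8, 0.3)               # (P(A | H), P(A | not H)) -- made-up values
B = (0.2, 0.6)               # (P(B | H), P(B | not H)) -- made-up values

a_then_b = update(update(prior, *A), *B)
b_then_a = update(update(prior, *B), *A)

print(a_then_b, b_then_a)    # both ~0.47: the order makes no difference
```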
Yet this principle is often violated. For example, most people are taught as kids in school that their nation is unusually morally justified in its wars and international relations. They later learn that kids in other nations are taught similar things about their nations, and they often get a quick summary of what those other kids are taught. However, they have the (correct, I think) impression that even if they were to sit in on a foreign classroom and hear all of what those other kids heard, they would still mostly favor their own nation over that other nation re national conflicts. Thus their beliefs depend on the order in which they got their evidence.
And this isn’t just about the poor reasoning abilities of young children. Even though the average age of a grad student is between 29 and 33, the opinions of academics on key questions in their fields tend to depend on where they went to graduate school. They later hear summaries of what grad students learn at other schools, yet they have the (correct, I think) impression that going carefully through other schools’ lectures, readings, and problem sets would not on average move them to some neutral intermediate view, one that all students would converge on once they’d learned that students are taught different things at different schools.
This also isn’t an issue of facts vs. values. Yes, your early life experiences do seem to influence the personal values and priorities that explain your personal actions, but the way your experiences did this was via information, i.e., via the things you saw, heard, and experienced when young. Thus you were making inferences from what you saw, heard, etc., to form your best estimates of what you valued. But these inferences could have been mistaken. As an adult, you should try as best you can to identify and fix such mistakes, by looking at your total available evidence now, and trying to correct for biases due to the order in which you learned things.
I also don’t think the key here is that we usually don’t actually expose ourselves to all the detailed sensory experiences of others who were taught different things. We can usually predict quite reasonably how such details would influence us on average. Most of the relevant info re how it would change our overall position can in fact be carried by a few overall summaries.
I also don’t think it matters much that many small random effects are in play, so that one can’t ensure that one would have exactly the same beliefs independent of evidence order. For rationality it is enough to expect that on average your beliefs don’t depend on evidence order in substantial and predictable ways.
Some complain that these sorts of rationality-based corrections of belief errors are alienating and inauthentic. The person you became after your earlier life experiences is the real you, they say, and you can’t even consider error corrections without stepping outside yourself and thus to some degree rejecting yourself, an action they see as doing violence to your true values and nature. That just seems to me an excuse to say “I like my mistakes, and don’t want to fix them. In fact, I’m entitled to keep them.” But are you ever entitled to a mistaken opinion?
(This post is an attempt to re-express a point I tried to make in a recent post.)
Added 6p: There is of course a similar evidence-order-dependence error that appears on shorter time scales: confirmation bias.
I agree with your general premise that people get too rooted in old beliefs. However, devil's advocate: in most cases of practical interest, "evidence" isn't black and white but comes in shades of gray. From a Bayesian perspective it's rational to give more credence to an X that has stood the test of time in your thinking, versus a Y that you learned yesterday. We see this often in science: a new paper makes a compelling case against the big bang or dark matter or whatever, and the rational response is to NOT update one's beliefs immediately, because experience shows that such counterevidence often doesn't stand the test of time.
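To put rough numbers on this (a purely illustrative Bayesian sketch; the probabilities are invented): if a surprising counter-result appears fairly often even when the established view is true, it carries a weak likelihood ratio, and a strong prior should move only a little.

```python
# A toy sketch (made-up numbers): a counter-result that is often spurious
# carries a weak likelihood ratio, so a well-supported belief barely moves.

def update(prior, p_e_given_h, p_e_given_not_h):
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

prior = 0.95   # belief in H after decades of accumulated evidence
# Suppose a challenging paper appears with probability 0.3 even when H is true
# (false alarms are common), and 0.6 when H is false: a likelihood ratio of 2.
posterior = update(prior, p_e_given_h=0.3, p_e_given_not_h=0.6)
print(posterior)   # ~0.90: a modest update, not an immediate reversal
```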
Also, the more I observe people the more I conclude that many people don't believe what they believe because it is correct in a rational sense. Often beliefs are social cues that allow you to align with a certain group, form allies, and so on. Case in point: Murray's statements about race and IQ may be factually correct, but very few academics would believe them, simply because doing so would have strong social consequences. Many of our beliefs have no direct consequences on our wellbeing, so it's perfectly rational to believe in false things if it brings us social benefit. Of course, all your work on prediction markets is addressing precisely this problem: giving people skin in the game.
You are conflating two different claims:
1. Your degree of belief in C after carefully considering evidence A and B, should not depend on whether you consider A first and B second, or B first and A second. No question about this one; it is sound.
2. If you have carefully considered A first but not B, you should adjust your degree of belief in C to be closer to the not-C belief of people who considered B first but not A, without needing to carefully consider B yourself before adjusting your degree of belief.
(1) is definitely not equivalent to (2). And (2) is dubious. Should you adjust your degree of belief in the Heaven's Gate cult beliefs to be closer to the beliefs of the Heaven's Gate members? How much should the mere fact that someone else thinks differently from you cause you to change your belief to be like theirs?
You have some direct evidence that your degree of belief in C, based on evidence A, is justified; it makes sense to you. You can understand, specifically, the argument for how A supports C, given all your background knowledge. This makes it strong evidence.
But you don't have the same level of evidence that the degree of belief the other person holds in not-C, based on evidence B, is justified. Perhaps the other person doesn't have rational support for not-C. For example, it's likely that the other person is just repeating doctrine that helps them be accepted by their peers and superiors, because people often do that. You can (if you try) check that you are really believing in C for rational reasons that follow from A. You can't do the same check for the other person, not until you've actually considered evidence B yourself, which would be time consuming.
The exception would be if you do have a strong reason to believe that the person who asserts not-C based on B is at least as rational as you are. For example, if they are a true expert in a demanding field that requires dispassionate rationality and not fashion-following, such as physics or mathematics, and C is a statement in their field.
But for most people, if you are *actively trying* to be rational about C (and think you are succeeding), you have no assurance the other person is applying the same standard. So it is rational to trust your own judgment over theirs. (Or, you can actually consider B yourself; but this may not be practical if B is a large amount of information.)