Evidence Order Bias
When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things. (Bible)
“83% of 5-year-olds think that Santa Claus is real, … ‘Children’s belief in Santa starts when they’re between 3 and 4 years old. It’s very strong when they’re between about 4 and 8,’ she said. ‘Then, at 8 years old is when we start to see the drop-off in belief’” (more)
You should be forgiven for having had only limited critical thinking skills as a child. But that doesn’t excuse you now from correcting your childhood mistakes as best you can. For example, you probably once believed in Santa Claus, but then changed your mind as contrary evidence accumulated. Alas, some of our biggest correctable biases are of this form: childhood mistakes that we don’t correct when older and wiser. Let me explain.
Most of the literature on “rationality” is about finding specific belief patterns that represent reasoning mistakes, helping people use such patterns to find and fix their mistakes, and thinking about the implications of beliefs that don’t make such mistakes (e.g., the rationality of disagreement). For example, it is a mistake to endorse both A and not-A with high confidence. So if you notice this pattern in yourself, you can and should cut your confidence in one or both claims. And it is valid to point out such mistakes in others and to suggest that they may undermine offered arguments.
One very widely accepted rationality principle is that your beliefs should depend on the set of all your evidence, but not on the order in which you learned about it. For example, if you learn A and then B, you should end up with the same beliefs as if you instead learn B and then A. So to a rational legal judge it shouldn’t matter in which order the disputing parties present their evidence.
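This order-independence falls right out of Bayes’ rule: likelihoods multiply, and multiplication commutes. Here is a minimal sketch in Python with made-up illustrative numbers (and assuming, for simplicity, that the two pieces of evidence are conditionally independent given the hypothesis):

```python
# Sketch: Bayesian updating is order-independent.
# The numbers below are purely illustrative, and we assume evidence A and B
# are conditionally independent given hypothesis H.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return posterior P(H | E) from prior P(H) and the two likelihoods."""
    numer = prior * p_e_given_h
    denom = numer + (1 - prior) * p_e_given_not_h
    return numer / denom

prior = 0.5
a = (0.6, 0.2)   # (P(A|H), P(A|not H)): A favors H
b = (0.3, 0.6)   # (P(B|H), P(B|not H)): B disfavors H

p_ab = update(update(prior, *a), *b)  # learn A first, then B
p_ba = update(update(prior, *b), *a)  # learn B first, then A

# Both orders land on the same posterior.
assert abs(p_ab - p_ba) < 1e-12
```

With these particular numbers both orders give a posterior of 0.6; swapping which evidence arrives first changes nothing, which is exactly the property the legal-judge example demands.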
For example, most are taught as kids in school that their nation is unusually morally justified in its wars and international relations. They later learn that kids in other nations are taught similar things about their nations, and often get a quick summary of what those other kids are taught. However, they have the (correct I think) impression that even if they were to sit in on a foreign classroom and hear all of what those other kids heard, they would still mostly favor their first nation over the second nation re national conflicts. Thus their beliefs depend on the order in which they got evidence.
And this isn’t just about the poor reasoning abilities of young children. Even though the average age of a grad student is between 29 and 33, the opinions of academics on key questions in their fields tend to depend on where they went to graduate school. They later hear summaries of what grad students learn at other schools, and they have the (correct I think) impression that going carefully through other schools’ lectures, readings, and problem sets would not on average move them to some neutral intermediate view, one that all students would come to once they’d learned that students are taught different things at different schools.
This also isn’t an issue of facts vs. values. Yes, your early life experiences do seem to influence the personal values and priorities that explain your personal actions, but the way your experiences did this was via information, i.e., via the things you saw, heard, and experienced when young. Thus you were making inferences from what you saw, heard, etc. to form your best estimates of what you valued. But these inferences could have been mistaken. As an adult, you should try as best you can to identify and fix such mistakes, by looking at your total available evidence now, and trying to correct for biases due to the order in which you learned things.
I also don’t think the key here is that we usually don’t actually expose ourselves to all the detailed sensory experiences of others who were taught different things. We can usually predict quite reasonably how such details would influence us on average. Most of the relevant info re how it would change our overall position can in fact be carried by a few overall summaries.
I also don’t think it matters much that many small random effects are in play, so that one can’t ensure that one would have exactly the same beliefs independent of evidence order. For rationality it is enough to expect that on average your beliefs don’t depend on evidence order in substantial and predictable ways.
Some complain that such rationality-based belief corrections are alienating and inauthentic. The person you became after your earlier life experiences is the real you, they say, and you can’t even consider error corrections without stepping outside yourself and thus to some degree rejecting yourself, an action they see as doing violence to your true values and nature. That just seems to me an excuse to say “I like my mistakes, and don’t want to fix them. In fact, I’m entitled to keep them.” But are you ever entitled to a mistaken opinion?
(This post is an attempt to re-express a point I tried to make in a recent post.)
Added 6p: There is of course a similar evidence-order-dependence error that appears on shorter time scales: confirmation bias.