22 Comments
Dec 1, 2023

I agree with your general premise that people get too rooted in old beliefs. However, devil's advocate: in most cases of practical interest "evidence" isn't black and white but comes in shades of gray. From a Bayesian perspective, it's rational to give more credence to an X that has stood the test of time in your thinking, versus a Y that you learned yesterday. We see this often in science: a new paper makes a compelling case against the Big Bang or dark matter or whatever, and the rational response is to NOT update one's beliefs immediately, because experience shows that such counterevidence often doesn't stand the test of time.

Also, the more I observe people, the more I conclude that many people don't believe what they believe because it is correct in a rational sense. Often beliefs are social cues that allow you to align with a certain group, form alliances, and so on. Case in point: Murray's statements about race and IQ may be factually correct, but very few academics would endorse them, simply because doing so would have strong social consequences. Many of our beliefs have no direct consequences for our wellbeing, so it's perfectly rational to believe in false things if it brings us social benefit. Of course, all your work on prediction markets is addressing precisely this problem: giving people skin in the game.

author

Sure X "standing the test of time" is the further evidence you get about what happens when you assume X and act on that. Sure, much of the evidence people care about is what will people around them accept or praise.


You are conflating two different claims:

1. Your degree of belief in C after carefully considering evidence A and B should not depend on whether you consider A first and B second, or B first and A second. No question about this one; it is sound. (A numerical sketch of this order-independence appears below.)

2. If you have carefully considered A but not B, you should adjust your degree of belief in C to be closer to the not-C belief of people who have considered B but not A, without needing to carefully consider B yourself before adjusting.

(1) is definitely not equivalent to (2). And (2) is dubious. Should you adjust your degree of belief in the Heaven's Gate cult beliefs to be closer to the beliefs of the Heaven's Gate members? How much should the mere fact that someone else thinks differently from you cause you to change your belief to be like theirs?
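To make (1) concrete, here is a minimal numerical sketch (my own illustration, with made-up likelihoods, assuming A and B are conditionally independent given C): updating on A then B yields exactly the same posterior as updating on B then A.

```python
# Toy check of claim (1): Bayesian updating is order-independent.
# All likelihood values below are made up for illustration; the sketch
# assumes A and B are conditionally independent given C.

def update(prior, p_e_given_c, p_e_given_not_c):
    """Return P(C | evidence E) from P(C), P(E|C), and P(E|not-C)."""
    num = prior * p_e_given_c
    return num / (num + (1 - prior) * p_e_given_not_c)

prior_c = 0.5          # initial degree of belief in C
A = (0.8, 0.3)         # hypothetical P(A|C), P(A|not-C)
B = (0.2, 0.6)         # hypothetical P(B|C), P(B|not-C)

a_then_b = update(update(prior_c, *A), *B)
b_then_a = update(update(prior_c, *B), *A)

print(a_then_b, b_then_a)   # both ~0.4706: the order makes no difference
```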

You have some direct evidence that your degree of belief in C, based on evidence A, is justified; it makes sense to you. You can understand, specifically, the argument for how A supports C, given all your background knowledge. This makes it strong evidence.

But you don't have the same level of evidence that the degree of belief the other person holds in not-C, based on evidence B, is justified. Perhaps the other person doesn't have rational support for not-C. For example, it's likely that the other person is just repeating doctrine that helps them be accepted by their peers and superiors, because people often do that. You can (if you try) check that you are really believing in C for rational reasons that follow from A. You can't do the same check for the other person, not until you've actually considered evidence B yourself, which would be time-consuming.

The exception would be if you do have a strong reason to believe that the person who asserts not-C based on B is at least as rational as you are. For example, if they are a true expert in a demanding field that requires dispassionate rationality rather than fashion-following, such as physics or mathematics, and C is a statement in their field.

But for most people, if you are *actively trying* to be rational about C (and think you are succeeding), you have no assurance the other person is applying the same standard. So it is rational to trust your own judgment over theirs. (Or, you can actually consider B yourself; but this may not be practical if B is a large amount of information.)

Dec 1, 2023

"For example, you probably once believed in Santa Claus, but then changed your mind as contrary evidence accumulated. "

While true in the most technical sense, I think this is a misleading reading of what's going on. Most kids stop believing in Santa Claus because it becomes a low-status belief associated with younger kids. This is a big reason why the epistemic gradient survives despite older kids frequently trying to black-pill younger kids on Santa. It's a status gradient.

" Thus you were making inferences from what you saw, heard, etc. to form your best estimates of what you valued. But these inferences could have been mistaken."

A similar thing here. Most kids aren't judging evidence directly to improve their estimates about what they truly value. Indeed, I am skeptical that "true values" exist. Instead, they are judging tribal lines and trying to figure out -- in the US at least -- how to define their tribe so that they have a consistent enough belief structure but are also aligned with the people whom they like and admire.

This makes total sense, because values are a way of coordinating actions over space and time. Having good values means that you have a system that allows you to coordinate well.

Now, that said, people do have different dispositions. I suspect that dispositional diversity was even favored by evolution. It's not just drift diversity or even local adaptation, but a social adaptation that creates a Mixture of Experts model, if you will, for which disposition is appropriate.

Because I can't help myself: I suspect that there are four dispositional attractors, and these line up roughly with the four elements Air, Fire, Earth & Water. The genius of this arrangement is that across a wide variety of domains the dispositions remain two orthogonal spectrums. For any given value-uncertain situation, it seems likely that the first principal component of the problem will align with one axis more than the other. Those two positions will vehemently disagree. The other two will have much smaller, but also opposed, natural affinities along the principal-component axis. If, however, the evidence is strong enough to overcome one of the much smaller biases, then that person will switch to form a 3-1 majority.

So if you see this three-to-one majority forming, that's a pretty good sign that unbiased estimation points in that direction. Lastly, because I really can't help myself: polarization arises when issues are sorted such that they cut along one of the diagonals between the four dispositions. Then both pairs of corner neighbors align and there is no way to break the tie. (A toy sketch of this picture follows.)
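A hypothetical sketch of one way to formalize this picture (the axis placements, issue directions, and evidence strength are all my own made-up numbers, not the commenter's): each disposition leans with the projection of an issue onto its axis, and a shared evidence signal can flip only a weak lean.

```python
# Hypothetical formalization of the four-attractor picture above.
# Dispositions sit at the corners of two orthogonal axes; an issue is a
# 2-D direction; each disposition leans with its dot product against the
# issue, unless a shared evidence signal outweighs a weak lean.

dispositions = {            # made-up placements on two orthogonal axes
    "Air":   (0.0,  1.0),
    "Fire":  (1.0,  0.0),
    "Earth": (0.0, -1.0),
    "Water": (-1.0, 0.0),
}

def stances(issue, evidence):
    """issue: 2-D direction; evidence: strength of a shared pro-signal."""
    return {
        name: "pro" if x * issue[0] + y * issue[1] + evidence > 0 else "anti"
        for name, (x, y) in dispositions.items()
    }

# Issue nearly aligned with one axis: Fire and Water vehemently disagree,
# Air and Earth have small opposed leans; the evidence flips the weak
# dissenter, producing the 3-1 majority described above.
print(stances((1.0, 0.1), evidence=0.3))   # Water alone stays "anti"

# Issue cut along a diagonal: corner neighbors pair up into a 2-2 tie
# that the same evidence cannot break.
print(stances((1.0, 1.0), evidence=0.3))   # Air+Fire vs Earth+Water
```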

author
Dec 1, 2023

Sure, the social proof and status associations are the main relevant evidence that accumulates for kids re Santa.


Yeah, but I think putting it like that conflates the issue, right? The 4-year-olds are not wrong: it would upset both their parents and their classmates for them to take the anti-Santa position, so they do not. Conversely, the 8-year-olds do not discover much new information about Santa; they face a new relationship between Santa belief and status.

This, I suggest, is why most 8-year-olds DO believe that it's OK for the concurrent crop of 4-year-olds to believe. It reinforces the idea that believing in Santa is for little kids, but *they* are different: big kids.


I distinctly remember at that age not daring to voice my doubts about Santa, out of a fear that the gravy train of Christmas gifts would stop if I became a professed nonbeliever. Humans make a lot more sense when you look at how beliefs affect them in tangible ways.


“As an adult, you should try as best you can to identify and fix such mistakes, by looking at your total available evidence now, and trying to correct for biases due to the order in which you learned things.” Two thoughts: (1) This looks like an arduous process. Correct beliefs are not infinitely valuable; one must count the cost (ex ante) of trying to free his beliefs from order bias, comparing this with the benefit (ex ante) of greater accuracy. (2) Probably one learned first the approved narrative of his own society. While correcting that narrative has the value of making one’s beliefs more likely to be true, it also probably estranges or alienates one from his own society, where the approved narrative probably still prevails. Fitting in with the local people has its value. {(3) Your point about “authenticity” is an odd one, which would not have occurred to me.}

author

Sure, don't bother to correct mistakes that hardly matter. And sure, collect evidence about the social consequences of your beliefs.


By the title, I was expecting something about anchoring or short-term sequencing of data in adults. I think there are many other sources of bias when comparing youth and adult information. Your examples are less about sequence and more about quantity and content. Much of youth (especially early-youth) learning mixes value and data in ways that later adult learning separates more clearly.

author

Early youth learning is ALL data. You infer some values from that data, but it is itself directly data.


Hi Robin, I agree with the intent of your post and appreciate the added insights of other commenters. It's a topic that isn't discussed sufficiently. Even though I agree... still... just reflecting here a bit. I wonder if it may be important to have some time in childhood when you do believe that your country acts morally and is virtuous. How we feel about ourselves may depend on viewing the groups to which we belong in a positive, aspirational light. That is, having a positive view of the self as a moral actor who aspires to do good may be harmed if we are taught from birth that our country has actively harmed other countries, waged unnecessary wars, oppressed a subset of the population, etc.


This is a good point. "We usually don’t actually expose ourselves to all the detailed sensory experiences of others who were taught different things."

The sensory experiences of a learning experience create a heavily weighted salience. So even if I know that students in a different graduate school are taught differently, I don't get the same social-emotional feedback and communal belief feelings that they do.


If your belief is an identitarian ritual of 'community' building, then, well, there are whole countries based on such mistakes, which entitled them to kill in the name of those beliefs. There, rituals and routines of not recognizing mistakes allow and create that entitlement, based on an existential fear of the same 'beliefs' disappearing, thus removing the identity and so the entitlement. It's a vicious cultural circle of bad world-building moralities -- and not personal ethics, where such mistakes may endanger the soul or conscience, and where broader moral questions can be safely ignored.


What actually happens is a "death by a thousand cuts" -- the starting position changes:

- which evidence you notice (even marginally, it accumulates);

- how you assess each newly added piece of evidence (a 0.1 difference multiplies fast);

- the shape of the summation model (we can summarize evidence in multiple ways/models, and the starting bias influences this too).

Said differently: the equality of orders depends on multiple stringent assumptions, most of which don't hold in real life.
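A toy sketch of the "multiplies fast" point (the evidence count and bias factor below are my own made-up numbers, not the commenter's): if every piece of evidence is actually neutral but each one is misjudged by a small factor, the error compounds.

```python
# Toy illustration: a small per-item bias in assessing likelihood ratios
# compounds multiplicatively across many updates. All numbers are made up.

n, bias = 20, 1.1        # 20 pieces of evidence, each misjudged by 10%

odds = 1.0               # prior odds of C vs not-C are even
for _ in range(n):
    true_ratio = 1.0     # each piece of evidence is actually neutral
    odds *= true_ratio * bias

p = odds / (1 + odds)
print(round(p, 2))       # ~0.87: belief drifts from 50% to 87% on zero net evidence
```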


I think the failure of order independence is very strong evidence that something is going wrong, but I don't think it necessarily tells us very much about what is going wrong. Still very useful, but I think you may need the more usual search for fallacies to ferret out where and how the problem occurred.


"This also isn’t an issue of facts vs. values. Yes your early life experiences do seem to influence the personal values and priorities that explain your personal actions, but the way that your experiences did this was via information, i.e., via the things you saw, heard, and experienced when young. Thus you were making inferences from what you saw, heard, etc. to form your best estimates of what you valued."

This may be a bad example, because as far as I can glimpse "what you value" really is defined by your early childhood experiences. (And that's all I'll say before being dragged into discussions of moral realism.)

author

Surely "defined" isn't the right word here. Maybe "well predicted".


Sure. Maybe "at least partially caused". The point is that in this case _what you value_ is not independent of the order in which you are exposed to different experiences.


This was one of my first encounters with rationality, at age 15. I read a bit of a controversy and realised that both sides agreed about the contours of the evidence -- except that each side started from the other edge of the evidence base.

"We know that X. But about Y, we can wiggle out this way and that."

Both sides almost literally said this.


Can you give an example of somebody making the excuse discussed in the second to last paragraph?


I think what's missing is the consideration of bounded rationality.

A possible alternative model: consider rationality a skill that is learned with practice over time. Suppose you learned it early in life and later learned about a fact X -- then, from the meta perspective, you would consider your current perspective superior to the one you would have reached by exchanging the order: first learning about X, and only later learning about rationality.

Another argument, in a similar flavour (maybe less clear; I'm less convinced about it): in general, people cannot update properly. A world model is learned from facts, over time. But "world models are convex wrt utility". So there is a difference between "being an expert in Y and then learning about X" and "being an expert in X and then learning about Y", and both of them are preferable to "knowing X and Y somewhat well".
