15 Comments

This is a form of the anchoring bias.


> why we seem too eager to believe the first opinion we hear on a subject, and then seem too skeptical about further contrary opinions we hear

Maybe we just find it unpleasant to be wrong, or rather to admit being wrong. We might also see changing an opinion as a sign of weakness (i.e. we are less knowledgeable, impressionable, haven't got our thoughts straight, etc.).


why we seem too eager to believe the first opinion we hear on a subject, and then seem too skeptical about further contrary opinions we hear.

I am skeptical of this claim (which is the first opinion I've heard on the subject). Reference?


We tend to view non-utterances as agreement, or at least non-disagreement. Someone utters an opinion, nobody says anything against that view, and the listener assumes that there is more weight to the opinion than just one person holding it.

The second view, uttered by a different person, has no such extra weight.

Prediction: If a number of people utter an opinion, and an equal number of people utter the opposite opinion immediately afterwards, the effect does not occur.


Another classic example of this phenomenon (or at least of the cultural belief that it exists) is the common emphasis on the importance of making good first impressions upon meeting someone.


Thanks Eliezer. That was what I suspected and hoped.


This was a very clever posting. I happen to believe that people tend to be more open to the second opinion they hear about something. See where I'm going with this? Now you all believe me, not Ute Shaw. Of course, had Robin posted that Ute believed we were partial to the second thing we heard, and I chimed in saying we preferred the first, then we'd have a real paradox.

Anyone who's ever been a kid or been around kids knows these theories are suspect. Kids in trouble try to take advantage of this phenomenon (as they imagine it) by telling their side first. Adults quickly learn to wait until they have all sides before they decide the facts. Which, BTW, totally frustrates kids who try to get their story out first. Adults who act like kids in this respect tend to really piss other adults off.


Bayesian inference is not influenced by the order in which updates are applied. p(c|b,a) = p(c|a,b).

This assumes that there is no confirmation bias effect in how we evaluate additional evidence. Of course, in real life, confirmation bias, group sociological bias, endowment effects and all sorts of other biases come into play and make the order of our exposure to ideas very important.


Bayesian inference is not influenced by the order in which updates are applied. p(c|b,a) = p(c|a,b).

In theory, the order of observations can itself count as an observation. But once you collect your observations, probability theory takes no notice of the order in which you apply the updates.
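A quick numeric check of this order-independence (a sketch of my own, not part of the comment: the joint table, the cond helper, and the predicates H, A, B below are all made up for illustration):

```python
# Toy check that sequential Bayesian updates commute: updating on A and then B
# gives the same posterior for H as updating on B and then A, and both match
# conditioning on {A, B} directly. The joint table is arbitrary but consistent.

joint = {  # P(H, A, B) for each combination of truth values; sums to 1
    (True,  True,  True):  0.20, (True,  True,  False): 0.10,
    (True,  False, True):  0.05, (True,  False, False): 0.05,
    (False, True,  True):  0.05, (False, True,  False): 0.15,
    (False, False, True):  0.10, (False, False, False): 0.30,
}

def cond(event, given):
    """P(event | given), with event/given as predicates on (h, a, b) triples."""
    den = sum(p for k, p in joint.items() if given(k))
    num = sum(p for k, p in joint.items() if event(k) and given(k))
    return num / den

H = lambda k: k[0]
A = lambda k: k[1]
B = lambda k: k[2]
both = lambda f, g: (lambda k: f(k) and g(k))

# Update on A first, then on B:  P(H|A,B) = P(H|A) * P(B|H,A) / P(B|A)
a_then_b = cond(H, A) * cond(B, both(H, A)) / cond(B, A)

# Update on B first, then on A:  P(H|B,A) = P(H|B) * P(A|H,B) / P(A|B)
b_then_a = cond(H, B) * cond(A, both(H, B)) / cond(A, B)

print(a_then_b, b_then_a, cond(H, both(A, B)))  # all three ≈ 0.8
```

Only the factoring changes with the order; the product, and hence the posterior, does not.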


"When we first hear something, we move our prior that way. This decreases the prior for the opposite being true, and we discount that information more easily."

Yes, and this isn't a bias unless we move our prior too much. That is, even while we receive the second evidence more skeptically, the end result may still be the same regardless of which evidence we received first. So, if there is a bias here, it is not that we receive the second evidence more skeptically, but that we do this to too great a degree. So diagnosing a bias would involve more than merely noticing that the second evidence is met with greater skepticism.

(I have not done the math, so I do not know whether repeated Bayesian inference is influenced by the order of observations, or whether it is biased towards or against the first observation.)
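For what it's worth, here is that math (my addition, in standard notation, with $H$ the hypothesis and $E_1$, $E_2$ the two observations):

\[
P(H \mid E_1, E_2) = \frac{P(E_1, E_2 \mid H)\,P(H)}{P(E_1, E_2)} = \frac{P(E_2, E_1 \mid H)\,P(H)}{P(E_2, E_1)} = P(H \mid E_2, E_1),
\]

since both the joint likelihood and the normalizer are symmetric in $E_1$ and $E_2$. Sequential updating merely factors this into one update per observation, so repeated Bayesian inference is not influenced by the order of observations and is biased neither towards nor against the first one.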


When we first hear something, we move our prior that way. This decreases the prior for the opposite being true, and we discount that information more easily.

Yep.

On my blog, I discuss a very interesting study that shows just how biased we all tend to be when evaluating data that opposes our priors.


To apply my guess to a popular, current phenomenon:

A. Anthropogenic Global Warming

1. The presentation of the theory that anthropogenic global warming is occurring due to the emission of greenhouse gases.

2. The occurrence of people challenging the accuracy of this theory (who have no obvious personal financial stake in its acceptance or rejection), and who claim in the course of challenging anthropogenic global warming that it is presented by "elites".

3. Social observers may, on balance, discount the second popularized opinion on anthropogenic global warming, believing (perhaps correctly in this particular instance) that its proponents are primarily motivated by a desire to challenge the status hierarchy of the first presenters.


Yet another guess: the first piece of information is accepted as fact, since we have no idea there is debate or conflict over the issue.

The second piece of info then informs us that there is a conflict, so we do not take it as true; instead we start analysing the debate. However, habit and salience bias have already firmly ingrained the first idea in our minds, so our analysis is skewed.


I think the possibilities you and Anders present both make sense. Another possibility: the first opinion presented (new knowledge) is more likely to come from more objective study, while contrary second opinions may be more likely to come from someone seeking to challenge the status hierarchy of the first presenter.


Assume we discount information that our priors say is unlikely (which makes sense in a noisy environment with deception). When we first hear something, we move our prior that way. This decreases the prior for the opposite being true, and we discount that information more easily.

However, I think your points 1-3 are also true. We definitely defend initial opinions much more strongly than we rationally ought to. Another mechanism is that we have had more time to mull over them, creating scenarios and links to other known facts. That makes accommodating contrary possibilities harder.
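A toy sketch of that discounting mechanism (my own model and numbers, not from the comment above): with proper Bayesian updating, two equally strong, opposite pieces of evidence cancel regardless of order, but if evidence that currently looks unlikely is down-weighted, the side heard first wins.

```python
# Toy model of "discount information our priors say is unlikely".
# bayes_update applies a likelihood ratio lr = P(E|H) / P(E|~H) exactly;
# discounted_update shrinks lr toward 1 by raising it to a power equal to how
# expected the evidence seems under the current belief, so evidence against
# whichever side is currently favored gets shrunk the most.

def bayes_update(p_h, lr):
    odds = p_h / (1 - p_h) * lr
    return odds / (1 + odds)

def discounted_update(p_h, lr):
    # Modeling assumption: evidence for H seems as plausible as H itself,
    # evidence against H as plausible as not-H.
    weight = p_h if lr > 1 else (1 - p_h)
    odds = p_h / (1 - p_h) * lr ** weight
    return odds / (1 + odds)

prior = 0.5
pro, con = 4.0, 0.25   # equally strong evidence for and against H

# Proper Bayes: both orders return to the prior.
print(bayes_update(bayes_update(prior, pro), con))   # ~0.50
print(bayes_update(bayes_update(prior, con), pro))   # ~0.50

# Discounted updating: whichever report arrives first carries the day.
print(discounted_update(discounted_update(prior, pro), con))  # ~0.56
print(discounted_update(discounted_update(prior, con), pro))  # ~0.44
```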
