Social Proof, But of What?
People tend to (say they) believe what they expect others around them will soon (say they) believe. Why? Two obvious theories:
A) What others say they believe embodies info about reality.
B) Key audiences respect us more when we agree with them.
Can data distinguish these theories? Consider a few examples.
First, in most organizations, lower-level folks eagerly seek “advice” from upper management. Yet when such managers announce plans to retire soon, lower folks immediately become less interested in their advice. The managers’ wisdom stays the same, but the consensus that others will defer to what they say collapses immediately.
Second, consider that academics are reluctant to cite papers that seem correct, and that influenced their own research, if those papers were not published in prestigious journals and seem unlikely to be so published in the future. They’d rather cite a less relevant or influential paper in a more prestigious journal. This holds not only for strangers to the author, but also for close associates who have long known the author and cited that author’s other papers in prestigious journals. And it holds not just for citations, but also for awarding grants and jobs. As others will mainly rely on journal prestige to evaluate paper quality, that’s what academics want to use in public as well, regardless of what they privately know about quality.
Third, consider that most people will not accept a claim on topic area X that conflicts with what MSM (mainstream media) says about X. But that could be because they consider the media more informed than other random sources, right? However, they will also not accept this claim on X when made by an expert in X. But couldn’t that be because they are not sure how to judge who is an expert on X? Well, let’s consider experts in Y, a related but distinct topic area from X. Experts in Y should know pretty well how to tell who is an expert in X, and know roughly how much experts can be trusted in general in areas X and Y.
Yet even experts in Y are also reluctant to endorse a claim made by an expert in X that differs from what MSM says about X. As the other experts in Y whose respect they seek also tend to rely on MSM for their views on X, our experts in Y want to stick with those MSM views, even if they have private info to the contrary.
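The expert-in-Y logic above can be sketched as a toy model. This is my own illustration, not from the post: assume each person endorses whichever belief scores highest on a weighted mix of (a) their private evidence of accuracy and (b) expected agreement with their key audience, with a hypothetical weight `w_audience`.

```python
def endorsed_belief(private_accuracy, audience_share, w_audience=0.7):
    """Return the belief with the highest combined score.

    private_accuracy: dict belief -> perceived probability it is accurate
    audience_share:   dict belief -> expected fraction of one's key audience
                      that will soon endorse it
    w_audience:       hypothetical weight on audience agreement (assumption)
    """
    def score(b):
        return ((1 - w_audience) * private_accuracy[b]
                + w_audience * audience_share[b])
    return max(private_accuracy, key=score)

# An expert in Y privately finds the X-expert's claim more likely accurate,
# but expects their key audience to stick with the MSM view on X.
private = {"msm_view": 0.3, "x_expert_view": 0.7}
audience = {"msm_view": 0.9, "x_expert_view": 0.1}

print(endorsed_belief(private, audience))                    # msm_view
print(endorsed_belief(private, audience, w_audience=0.2))    # x_expert_view
```

With a high audience weight, the expert publicly endorses the MSM view despite contrary private info; only when audience pressure is weak does private accuracy dominate. The weight and probabilities are made-up parameters for illustration.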
These examples suggest that, for most people, the beliefs they are willing to endorse depend more on what they expect their key audiences to endorse than on their private info about belief accuracy. I see two noteworthy implications.
First, it is not enough to learn something, and tell the world about it, to get the world to believe it. Not even if you can offer clear and solid evidence, and explain it so well that a child could understand it. You instead need to convince each person in your audience that the people they see as their key audiences will soon be willing to endorse what you have learned. So you have to find a way to gain the endorsement of some existing body of experts that your key audiences expect each other to accept as relevant experts. Or you have to create a new body of experts with this feature (such as, say, a prediction market). Not at all easy.
Second, you can use these patterns to see which of your associates think for themselves, versus aping what they think their audiences will endorse. Just tell them about one of the many areas where experts in X disagree with MSM stories on X (assuming their main audience is not experts in X). Or see if they will cite a quality never-to-be-prestigiously-published paper. Or see if they will seek out the advice of a soon-to-be-retired manager. See not only whether they will admit in private which source is more accurate, but whether they will say so when their key audience is listening.
And I’m sure there must be more examples that can be turned into tests (what are they?).