What Belief Conformity?

I wrote:

We feel a deep pleasure from realizing that we believe something in common with our friends, and different from most people. … This feeling is EVIL.

Patri Friedman responded:

I see this bias as counteracting the bias of groupthink. The opposite bias is for people to enjoy believing what everyone else believes. This leads to homogeneity of viewpoints, less generation and testing of new hypothesis, and stasis. The people who enjoy believing they have a secret truth are those who nurture non-mainstream but plausible hypotheses, and accumulate new evidence to possibly challenge the mainstream. I think this is very valuable.

Yes, we want to explore a diversity of hypotheses, but this doesn’t require a diversity of beliefs; we can believe similar things while exploring different things.  Yes, groupthink seems to exist, but not as a general bias to conform to average beliefs; groupthink is a bias to conform to in-group beliefs against out-groups.  Thus by their nature groupthink biases of in-groups come already countered by out-groups. 

When a particular group (such as academia) rewards in-group conformity, you may at times be right to resist that.  But by doing so you would not be resisting some general pressure to conform with a global average; you would instead be favoring one group less than others.  I see no general conformity pressure in need of resisting; I instead see particular groupthinks, some of which may be preferred to others. 

For example, in herding experiments, subjects must choose between a few acts (e.g., which movie to watch), where some acts pay better than others.  One at a time subjects choose an act, after seeing both a private clue about act quality and others’ previous choices.  I’ve just reviewed 16 papers on this (including this this this this this this this this this this this and this).

Herding experiment subjects will respond to payoffs that explicitly reward making the same choice as others, but without such extra payoffs subject behavior is well explained by these factors:

  • Rationality:  Subjects attend to the info contained in others’ acts.
  • Error: Subjects make random mistakes.
  • Contrarian:  Subjects prefer acts that fewer others pick.
  • Arrogance: Subjects weigh their own clues more strongly than others’.

We need no additional tendency to conform with others’ acts to explain this data!   
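The four factors above can be sketched as a small simulation. This is an illustrative toy, not the model from any of the cited papers; the parameter names (`own_weight` for arrogance, `crowd_discount` for contrarianism, `error_rate` for random mistakes) are my own labels, and the "rationality" component uses the simple naive-counting heuristic of treating each observed choice as one more clue.

```python
import math
import random

def run_cascade(n_subjects=20, signal_acc=0.7, own_weight=1.5,
                crowd_discount=0.3, error_rate=0.05, seed=0):
    """Toy binary-choice herding experiment.

    The true state is 'A' or 'B'; each subject sees a private clue
    matching the state with probability signal_acc, plus all earlier
    choices, then picks the act with the higher weighted log-odds.
    """
    rng = random.Random(seed)
    state = rng.choice(['A', 'B'])
    clue_llr = math.log(signal_acc / (1 - signal_acc))  # evidence per clue
    public_llr = 0.0  # log-odds for 'A' implied by observed choices
    choices = []
    for _ in range(n_subjects):
        # Private clue: matches the true state with probability signal_acc.
        clue_is_A = (state == 'A') == (rng.random() < signal_acc)
        own_llr = clue_llr if clue_is_A else -clue_llr
        # Arrogance: overweight one's own clue; Contrarian: discount the crowd.
        total = (1 - crowd_discount) * public_llr + own_weight * own_llr
        choice = 'A' if total > 0 else 'B'
        if rng.random() < error_rate:  # Error: occasional random mistake.
            choice = 'B' if choice == 'A' else 'A'
        choices.append(choice)
        # Naive public update: count each observed choice as one more clue.
        public_llr += clue_llr if choice == 'A' else -clue_llr
    return state, choices
```

With `own_weight = 1`, `crowd_discount = 0`, and `error_rate = 0` this reduces to the naive-counting cascade model, where an early majority can lock in regardless of the true state; turning the other parameters up shows how arrogance and contrarianism work against such herds.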

Yes, conformity-related personality factors predict who herds more, and for the inexperienced, losses can loom larger than gains.  And we do see info collection puzzles: subjects buy too much info and can prefer poor social info over good private info.  But given their info, we see no tendency for herding experiment subjects to conform to others’ acts unless payoffs directly reward such conformity.  We in fact see the opposite – unrewarded contrarian tendencies. 

Yes, psychologists have a huge literature on conformity, but I can’t find psych data clearly showing belief conformity effects, beyond these other effects.  Most psych experiments don’t carefully control for info effects, and so can’t tell if belief herding is due to info or conformity effects.  Asch’s "conformity" experiments are now most famous, but actually:

[Considering] 99 accounts in social psychology textbooks … we found that authors have often distorted Asch’s findings, and that this trend has increased substantially with time.  … When it first appeared, the Asch study was understood … as evidence of the impressive powers of independence in social life.  With the passing years, however, it has been increasingly represented as a traditional demonstration of conformity and the central point of Asch’s work has become not only drastically weakened, but reversed. 

The place where belief conformity seems clearest to me is in ideological groupthink – religion, politics, etc., i.e., where a group’s identity is tied up with a belief on some hard to verify topic.  I’m willing to believe this often extends to more practical beliefs as well, but as far as I can tell there just is no net "bias … to enjoy believing what everyone else believes."  At best we see pressures to embrace particular groupthinks against others. 

Some extended quotes from four papers.  One:

If there is an additional preference to go along with the crowd, then this status-quo bias should show up most clearly when the Bayes’ distribution for A is close to 1/2. … We think that it is reasonable to identify the previous decision as being the status quo, even when there is only one previous decision.  … If there is a systematic bias in favor of following the previous decision(s), it is too weak to show up in these data.


Two:

We study the role of social preferences and conformity in explaining herding behavior in anonymous risky environments. In an experiment similar to information cascade settings, but with no private information, we find no evidence for conformity. On the contrary, we observe a significant amount of non-conforming behavior, which cannot be attributed to errors.


Three:

6000 subjects, including a subsample of 267 … from an international consulting firm, participated in the experiment. … We find no evidence of herding or imitative behavior in the presence of a flexible market price…. While we do not observe herding, we find considerable support for the existence of "contrarian" behavior. … There seems to be no significant difference between male and female subjects, or between subjects with and without college education. Ph.D.’s and Ph.D. students, however, performed slightly better in terms of rationality. … physicists and mathematicians perform best in terms of "rationality" (i.e. performance according to theory) and psychologists worst. However, since "rational" behavior is only profitable when other subjects also behave rationally … the ranking in terms of profits is just the opposite: psychologists are best and physicists are worst.


Four:

An experiment found significant correlations between herding and age … In outlining the dimensions of social awareness, we use the DSM-IV-TR (2000) / ICD-10 (1994) classifications of an anti-social / dissocial personality along the dimensions of non-conformity, recklessness, disregard for others, impulsivity and risk-seeking. … In the behavioural model, gender and impulsivity have no significant impact on the tendency to herd. The grade of herding still exerts a strong, positive and statistically significant impact on the likelihood of herding. The association with age confirms other research suggesting that older people are less susceptible to social pressure.

Conformity and extraversion are associated with a significantly higher tendency to herd and venturesomeness is associated with a significantly lower tendency to herd. With the exception of the anomalous finding that the propensity to herd is relatively low in the empathetic subjects, overall the results from the psychometric analysis support the idea that the tendency to herd is a function of sociability, giving preliminary evidence that the interactions between conformity and sociability extends beyond the purely social contexts and into economic and financial decision-making.

  • Robin writes: “The place where belief conformity seems clearest to me is in ideological groupthink – religion, politics, etc., i.e., where a group’s identity is tied up with a belief on some hard to verify topic. I’m willing to believe this often extends to more practical beliefs as well, but as far as I can tell there just is no net “bias … to enjoy believing what everyone else believes.” At best we see pressures to embrace particular groupthinks against others. ”

    I think I am having a hard time figuring this out. Help me out with this example.

    Suppose I believe abortion is wrong, plain wrong all of the time. You have a more nuanced view, and believe that I am wrong.

    My group also believes that you non-believers are evil; your group just believes of my group that we are nuts, and possibly faking our beliefs.

    Our group takes perverse pride in being small; your group is much larger, but nobody cares too much about the size of the group.

    Our group would like your group to vanish, though oddly it is the small size of our group that makes it cohesive enough to last.

    Your group just wants to bar our group from having any policy influence.

    So that is the example. What are you recommending for each participant in the two groups to believe in? I just cannot follow your practical recommendation.

  • billswift

    Two comments for Michael Webster:
    First your group **is** practicing ideological groupthink as per Robin’s essay.
    Second your group is even more evil than Robin’s discussion of groupthink, because you want to **impose** your beliefs on others.

  • Constant

    your group is even more evil than Robin’s discussion of groupthink, because you want to **impose** your beliefs on others.

    Aside from being beside the point, desire to impose one’s belief on others is not by itself evil. For example, I believe that rape is wrong and I desire to impose this belief on would-be rapists. By force. Is this evil? If you believe that my desire to impose this belief on others is evil, then there is quite an abyss between us. But your belief may be self-undermining: if you believe that it is evil (with asterisks of emphasis) for me to impose my beliefs, then presumably you are, yourself, willing to fight against my imposing my beliefs – and thus are willing to impose your own beliefs on me.

  • Michael, I said “take no pleasure when you and your associates disagree with others.” If your belief cannot survive without that pleasure, it should not survive.

  • frelkins

    @M Webster

    just cannot follow your practical recommendation

    2 Steps, 20 minutes:

    1. please review the OSCON07 talk
    2. please review slides 29-38 of the standard disagreement talk, with special focus on slides 32-37

    “my impression” is working magic for me here.

  • billswift

    Yes it is still evil. It is less evil, maybe, depending on how you go about imposing it, than rape is. The better way is for the victim to successfully defend herself, or better yet to prevent attacks. Retaliation is still coercion, even when justified.

  • What others believe is not acceptable evidence that can be used to determine your own belief.

    If this principle is followed, then looking at others’ beliefs can be a useful heuristic. If it is not, then it’s not a useful heuristic.

    The point is not to confuse a useful heuristic with rigorous reasoning. Heuristics are what you use when rigor can’t be obtained. Using them when it’s not appropriate to do so violates reason.

  • billswift

    “then presumably you are, yourself, willing to fight against my imposing my beliefs – and thus are willing to impose your own beliefs on me.”

    This is the classic evil bastards’ claim that equates defense with attack. Defending from someone’s prior attempt to coerce me is NOT the same as imposing my beliefs on others.

  • Constant

    Billswift – I was critiquing your argument, not attacking the conclusion. You have (at 3:53) added the new assumption that pro-lifers are not defending anyone – an additional assumption which was not part of the argument I was critiquing. Pro-lifers will, it need hardly be pointed out, disagree with your new assumption, since they consider the unborn child to be someone.

    If you say, “you want to impose your beliefs on others”, you are making a statement about the other person’s point of view – about what they want. But if you make an assertion about the other person’s point of view, then you need to consider the other person’s point of view. And from their point of view, the unborn child is someone.

  • One reason why a group is likely to acquire uniform irrational beliefs is that the dissenters switch to other groups. If dissenters were to insist on sticking around instead, it will be more difficult for a group to acquire an attitude of “All of us are right and all of them are wrong!”

  • If billswift’s advice were sent by chronophone to various past societies, what would it sound like?

  • billswift

    What advice? Don’t do any more evil than you need to? Defense is better than retaliation? Avoiding an attack is even better than defending against one? Those are all obvious to anyone who has actually lived in the real world, rather than the fantasyland of academia and the modern middle-class.

    My earlier post doesn’t give any advice, it just points out that imposing beliefs on others is evil (because it requires coercion, which can be dangerous if the victim resists).

  • @Frelkins. Thanks for the references, but the first link doesn’t work. I presume that the powerpoint accompanied the talk that the first link was supposed to point to.

    @Robin. Thanks. Does it follow from your view that I cannot rationally expect to be part of an elite group, which prides itself on being small and exclusive? The Groucho Marx hypothesis, if you will.

  • frelkins

    @M Webster

    The OSCON07 talk is just key – sorry for the typo. The full link (damn you, Typepad!) is http://blip.tv/file/318231/

    No, the disagreement paper doesn’t really accompany Robin’s speech, but is its background.

    “I desire only to know the truth. . . .And, to the utmost of my power, I exhort all other men to do the same. . . .I exhort you also to take part in the great combat, which is the combat of life, and greater than every other earthly conflict.”

    Socrates, The Gorgias

  • I note that in pursuit of rationality, you throw morality overboard. I would point out that most people “with experience of the world” value morality more highly than rationality, and that your advice would strike them as “one of those Western things.”

    To elaborate, check out Jonathan Haidt in http://faculty.virginia.edu/haidtlab/articles/haidt.graham.planet-of-the-durkheimians.doc (copy link, I don’t know this interface’s version of html), or check out the research generally at http://people.virginia.edu/~jdh6n/moraljudgment.html