Conformity Questions

Follow-up to: Conformity Myths

Robin posted earlier about a NYT Magazine article on conformity. I was able to find an online copy of the scientific paper here:

The NYT synopsis is incomplete. Across the 12 trials on which subjects were challenged to disagree with the social consensus, the most popular choice was to agree with the group 0 times; 25% of the subjects did this. The second most common was to agree 3 times, done by 14%, and the third most common was to agree 8 times, by 11%. Only 5% went along with the crowd all 12 times.

I think it’s quite significant that 25% of subjects never went along with the crowd and stuck to their own perceptions. In total, only 32% of the answers were wrong.

I’m not sure I follow Robin’s comments on this. It seems to me that this re-interpretation of the classic experiment suggests that people are not as conformist as generally thought. That would mean that we do more than merely give lip service to celebrating independence, that culturally we are quite effective at following the ideal of independent thinking.

The key question is, what is the right thing to do here? Should one conform when presented with 8 people denying the evidence of one’s own senses? I argue that it is the right thing to do.

Now of course, if you know you’re in a psychological experiment, maybe you can’t help but be suspicious that something fishy is going on. But in general, in real life, if 8 people come in and tell you that your perceptions are completely wrong, you should take it very seriously. I imagine that in the history of the world, in the great majority of such situations, the 8 were right and the one was wrong. As an example that some may be familiar with, if a bunch of friends come in and tell you you’re drinking too much, while your perception is that you can easily handle the alcohol, you should probably listen to them.
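This intuition can be made concrete with a toy Bayesian sketch (the probabilities here are illustrative assumptions, not data from the study): suppose each observer, yourself included, independently reports correctly with probability p, with a uniform prior over the two answers.

```python
# Toy Bayes sketch: you report A, eight others independently report B.
# Assumes each observer (including you) is correct with probability p
# and a uniform prior over which answer is true.
def posterior_you_are_right(p, n_others=8):
    # Truth = A (your answer): you are right, all eight are wrong.
    like_a = p * (1 - p) ** n_others
    # Truth = B (their answer): you are wrong, all eight are right.
    like_b = (1 - p) * p ** n_others
    return like_a / (like_a + like_b)

# Even with quite reliable perception (p = 0.9), eight unanimous
# dissenters leave you almost certainly mistaken.
print(posterior_you_are_right(0.9))  # on the order of 1e-7
```

Of course this sketch assumes independent, honest reports; correlated errors or a shared motive to deceive would weaken the conclusion considerably.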

I would suggest that conformity is the right thing to do in these situations, and to that extent I am rather dismayed that the subjects were as non-conformist as this data shows.

  • gwern

    I’m not (much) disturbed. Reading through the paper, there’s no mention of any reward or reason for people to agree with the group, aside from perhaps a vague peer pressure (the paper notes Asch’s observation of one fellow signaling his awareness of the contradiction to the group by adding ‘sir’ to his statements).

    In such a case, why not fail to conform? There’s nothing at stake, no penalty. Surely you’ve heard the quip about disputes in academia – they are so various and so vociferous because they are so pointless.

    (The obvious followup question would be, how much incentive does it take to make people conform to the group? I’d say not terribly much, and certainly much less than is at stake in real life.)

  • Psychohistorian

    It’s also worth noting that line-length perception is not like drunkenness. You don’t have a good reason to trust other people’s opinions over your own. If it were something more abstract, like determining the main point of the author of an opinion article, I’d expect much more of a conformity effect.

  • celeriac

    To echo my previous comment, there are decades-old and well understood psychophysics techniques that will answer the question of whether and how much conforming improves accuracy in the usual case (when nothing ‘fishy’ is going on). Why can’t I find any paper using them?

  • If you somehow know that:

    1. nobody is lying

    2. you all understand the question

    3. you have no other reason to believe that your judgment is better or worse than anyone else’s

    4. you have no way of gathering further data

    5. you have no time bound on how long you can spend thinking about the problem (so calculating the majority opinion is costless)

    then you’re in a position of perfect symmetry, and therefore each person should take the same “generic” advice, which would be to trust the majority opinion* in such a case.

    If some of them act on this advice and others do not (either because they refuse the advice, or they did not receive the advice), the symmetry is broken. However, the amount of symmetry-breaking caused by this is tiny, and would not invalidate the advice in practice. (This is a very key point that I’m happy to debate.)

    *I’m a bit stymied by the lack of generally-accepted terminology. I obviously mean here “the opinion of each person, before they factored in the opinions of the others”, but there’s no one-word way to say that.

  • Thanks Hal for digging for more detail here. Like celeriac, I’d like to see real data, not where they are trying to trick subjects, but where subjects are honestly disagreeing.

  • michael vassar

    Is this really hard? In general, people should believe that the majority perception is probably correct, but it is socially optimal for them to report their non-consensus perception instead, to prevent information cascades and to allow accurate aggregation of evidence.

  • Caledonian

    Alcohol is well-known for its ability to affect perception, and most especially self-perception. If you’re confronted with a group of people saying that you’ve had too much to drink, you should probably listen.

    With line drawings? That you can continue to look at even as they tell you you’re wrong? No.

  • “The key question is, what is the right thing to do here? Should one conform when presented with 8 people denying the evidence of one’s own senses? I argue that it is the right thing to do.”

    You seem to be writing that there are no costs involved in deciding situationally when to conform or not. In the real world there are decision costs, and those should probably be factored in. Also, if the default human instinct is often to non-conform, there may be greater costs to overcome (for example, in terms of willpower) in deciding to conform. Before deciding to conform, it may make sense to weigh those costs, or, more efficiently, to have some simple rules about when to conform without unique deliberation, when to nonconform without unique deliberation, and when to engage in such deliberation (and for how long).

    I think you hint at that when you write that if a group of friends tell you they think you have a substance abuse problem, you (at the least) shouldn’t nonconform without unique deliberation.

  • Bob

    Caledonian, while I tend to agree with your sentiment, I’ve come around to the idea that I am likely overconfident in my perceptions. Sure, you aren’t necessarily going to convince me that black is white even if ten people back you up. But I will reexamine my position. Perhaps look at the black object from different angles, under different conditions. Maybe I’ll decide I need to measure the lines. I think it’s a mistake to look at conformity as all or nothing. The key is whether other opinions, and not just opinions from those you *know* you should respect, cause you to at least briefly question your own. That is, are other opinions data? I find it absurd to answer no. And I increasingly suspect that we (or at least I) tend to underweight this data when forming our own opinions.

    BTW, from a reader’s perspective, I have found your comments here much more thoughtful and constructive to the discussion.

  • Overcoming Laziness

    “The key question is, what is the right thing to do here? Should one conform when presented with 8 people denying the evidence of one’s own senses? I argue that it is the right thing to do.”

    You don’t seem to differentiate between reexamining one’s beliefs and conforming despite one’s beliefs; I find this troubling. I initially found this site quite compelling, but over the past few months I have been perplexed by many of the views of the authors. I have a simple question, which, if answered, may help me: are the authors purposely posting what they do not believe, or deliberately withholding information that would be relevant to the posts, in order to promote discussion?

  • Adirian

    Overcoming Laziness – I’ve come to something of a similar conclusion (that many of the posts seem deliberately goading), and have gone from daily reading to a couple of times a month.

    There is occasional real discussion here, but the vast majority of the time it has the feel of that particular form of debate which I can only describe as mental masturbation.

    On this particular topic, the author seems to have skipped over that the evidence acquired from another person has, as an absolute minimum, a squared degree of uncertainty. (You have to perceive their explanations of their perception.) In reality, the uncertainty is even greater, because the number of variables increases by orders of magnitude when you introduce secondhand information. You have to correctly perceive; they have to correctly explain (with subconsiderations of potential motives); they have to correctly perceive. Eight untested strangers don’t even begin to compare to the certainties of one’s own perceptions, which, after all, have been very heavily tested.

    For the correctness of the answer it makes absolutely no sense to go with the consensus view. Eight strangers telling me something of unknown cost to them is a considerably different scenario, after all, than a group of friends telling me something which will have a high cost on the friendship if I don’t take them seriously.
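    The compounding claim can be put in toy numbers (the reliabilities are purely illustrative assumptions): a secondhand report has to survive several links in a chain, each of which can fail.

```python
# Illustrative chained reliabilities for a secondhand report:
# the report is only as trustworthy as the product of every link.
p_their_perception = 0.9  # they perceived the lines correctly
p_their_report = 0.9      # they stated what they perceived accurately
p_your_hearing = 0.9      # you understood their statement correctly

firsthand = 0.9
secondhand = p_their_perception * p_their_report * p_your_hearing

print(firsthand)             # 0.9
print(round(secondhand, 3))  # 0.729
```

    Note that this discounts each secondhand report but does not eliminate its evidential value; several independent, discounted reports can still aggregate into substantial evidence.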

  • brent

    oh please.

    If I could name one thing that I want my kids to achieve out of all their schooling, it would be the ability to think independently.

    My entire income is based on that ability.

    As a young mechanical engineer in a manufacturing world I’m surrounded by people who are more experienced, more involved, older, and more authoritative than me… so why do they hire me?? Because when 12 people have looked at a problem and declared it unsolvable the expectation is that I will choose to not accept the majority opinion (even when they’re standing RIGHT THERE) and decide for myself the solvability of the problem.

    Think for yourself.

    Of COURSE you should think for yourself.

    Your example of the person too drunk to know they’re too drunk is not a good example – that person is simply ignoring good advice (good DATA) which is a different thing to refusing to allow the will of others to be the decider of your mind.

    It seems to me that the act of accepting the majority opinion, REGARDLESS of whether it’s right or wrong – or worse, when your head is telling you that it IS wrong – is an act of self-betrayal. It can be one of the deepest acts of evilness – after all, that’s how Nazism became so popular, that’s how religions get started, that’s how Republicans get elected.

  • Adirian, those eight strangers have likely been tested as much as (a random) you; it is just that you have not tested them.

    Brent, if the choice is between loyalty to my head or to the truth, I choose truth.

  • Caledonian

    Adirian, those eight strangers have likely been tested as much as (a random) you; it is just that you have not tested them.

    Example of bias: presumption that testing and selection have operated upon entities without evidence that this has been the case.

    What good is it to presume that those people have been tested, if you don’t know what the results are?

  • celeriac

    Adirian, I’d like to see mathematical justification for “squared.” In any case, the setup is that the experimenter has ostensibly provided the same instructions to the other observers as they have to you.

    I predict an Ernst and Banks style result, where you calibrate the accuracy of your fellow participants and weight their opinion accordingly. Even more so if you put money on the line for being correct. Thirteen trials is not much time to do this though, so the question is whether you start from a prior of distrust or of trust.
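    An Ernst-and-Banks-style combination would amount to inverse-variance weighting of the two estimates; a minimal sketch, with made-up variances standing in for the calibrated reliabilities:

```python
# Reliability-weighted cue combination (Ernst & Banks style):
# weight each estimate by the inverse of its variance, the
# minimum-variance way to fuse independent Gaussian estimates.
def combine(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_var = 1.0 / total  # fused estimate beats either input alone
    return fused, fused_var

# Your own reading of the line length vs. the group's consensus,
# with made-up variances: the less noisy source gets more weight.
own = (10.0, 1.0)    # your estimate, your (assumed) variance
group = (12.0, 4.0)  # group consensus, noisier if you start from distrust
fused, var = combine([own[0], group[0]], [own[1], group[1]])
print(fused, var)  # approximately 10.4, 0.8
```

    Starting from a prior of distrust just means assigning the group a large initial variance and shrinking it as their answers check out.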

  • Mark Spottswood

    Caledonian, it is an example of bias to assume that those other individuals have not experienced similar pressures to form accurate beliefs as you have. Overconfidence in our own analytic capacity is one of the most well documented forms of cognitive bias.

    An unbiased participant would presume, until receiving evidence to the contrary, that the other participants are at the median in terms of their reporting accuracy. Furthermore, an unbiased participant would be very skeptical of his own tendency to think he is a better witness than the others, because he would expect that conclusion to be produced by the operation of the overconfidence bias.

  • Unnamed

    This summary of Asch’s data isn’t anything new, it’s how those data are generally summarized in intro to psychology classes. If you have an intro psych textbook nearby, check it out. Here’s how the experiment is described in James W. Kalat’s Introduction to Psychology textbook, which was the first intro psych textbook description of Asch’s study that I could find on Google Books:

    To Asch’s surprise, 37 of the 50 participants conformed to the majority at least once, and 14 conformed on most of the trials. When faced with a unanimous wrong answer by the other group members, the mean participant conformed on 4 of the 12 trials.

    In other words, pretty much the same thing as what Hodges & Geyer said. What’s different is the interpretation of the meaning of the results. This textbook, following the standard approach, goes on to ask “Why did people conform so readily?” Hodges & Geyer instead say, apparently, that this really isn’t all that much conformity. And Hal seems to think that this wasn’t nearly enough conformity, since we should trust several sets of eyes more than the single pair residing in our head.

    Hal should be even more dismayed than he lets on, since many of the subjects who conformed in Asch’s studies indicated in follow-up interviews that they still believed their own eyes, but just went along with the group for social reasons (although a fraction of the conformers did claim to actually believe the group). This was pinned down more rigorously in further research by Asch (described on this very blog, among other places) that tested out a few variations of his original study. For instance, about 2/3 of the conformity disappeared when the subject wrote down his answer instead of saying it aloud. Conformity dropped by even more when one of the confederates gave the right answer, and it also dropped a great deal when a confederate diverged from the group by giving the other wrong answer. Additionally, the amount of conformity with the original design did not increase when the number of confederates rose from 4 to 15.

    The distinction that psychologists make is between “informational social influence” (seeking truth) and “normative social influence” (seeking approval/avoiding disapproval), and these additional results imply that most of the conformity in the Asch experiment comes from the latter. Not a pleasing result for someone hoping that the participants would use the information from the crowd to act like good Bayesians.

  • Thanks for the thoughtful comments here. Unnamed’s points are very good, that things are even worse than I suggested – and thanks for the link to Eliezer’s analysis of the Asch experiments, which sheds additional light. I thought it was significant that even Eliezer, who is not exactly a fan of majoritarian reasoning, said he would conform in the experiments. Contrast that to his Initiation Ceremony where he seems to suggest the opposite, although the cues to the majority opinion are more indirect in that case.

    Overcoming Laziness asks whether these posts are sincere. The more relevant question is whether the reasoning offered is sound. I hope those who are giving up on the blog are not doing so because the points seem nonsensical. Please consider that at least some commenters do agree with the reasoning, and also keep in mind that we are all subject to a great deal of propaganda about how to think, some of which may not be well grounded. In addition, majoritarian-type reasoning must overcome strong human biases. Brent even suggests that it is outright evil to think this way. If it is any consolation to him, I will point out that Democrats outnumber Republicans in the U.S. by 10% or more, and Independents are twice as likely to lean Democratic. Since he compared Republicans to Nazis I speculate that these facts might make him more likely to recommend majoritarianism to voters!

    But Overcoming Laziness’ question brings us back to Robin’s provocative suggestion for a way that society could aim to retain the advantages of our present diversity of views, even while individuals accept the social consensus on factual matters. He argues that we could present viewpoints that we don’t necessarily agree with, vigorously defending them despite our personal reservations. In fact he points out that data shows that we already do this, although the conflict is unconscious. Basically he is calling for us to become more aware of our internal inconsistency, and to bring it under conscious control. This seems to me to be a central element of the Overcoming Bias program.

  • Unnamed

    One other important thing to realize about the Asch studies is that they do provide evidence that people are subject to normative social influence. They show that people have motivations to conform to those around them for non-informational reasons, that most people do so at least on some occasions, and that this can happen even without explicit pressures from the group or close relationships between group members. The (not directly demonstrated) implication is that this kind of conformity is fairly common. This is true, regardless of what you think of the Bayesian argument about what people should be doing in these experiments. In other words, people’s tendency to alter their behavior due to social pressures and the possibility that people don’t make enough use of information from others are two separate issues, even though they can both be discussed in the context of the Asch studies. The NYT Magazine article dealt with the first of those two issues, and Hal’s argument here is about the second.

    Both kinds of social influence, informational and normative, can have downsides. See, for instance, the studies described here on helping and the bystander effect. In one study by Latane & Darley (1969), participants were filling out a questionnaire in a room, either alone or in a group of 3 (either with 2 confederates or with 2 other real participants). Smoke started seeping into the room from under a (locked) door to a neighboring room, and by the time they finished there was a lot of smoke in the room. 75% of lone participants got up to go tell someone about the smoke. Only 10% of participants who were with 2 inactive confederates did so. And when a group of 3 real participants was in the room together, only 38% of the groups sought help (even though there were three potential helpers).

    This was informational social influence. Each person in the group tried to keep their cool. When they glanced at each other for cues about what was going on, they saw how calm the other people were and decided that there wasn’t an emergency (as confirmed by interviews after the fact). A group of confused people, wondering whether they were in an emergency and looking towards each other for information, ended up convincing each other that the situation was not an emergency.

  • That’s a good point, Unnamed, there are situations where conformity causes problems. I see these as cases where the problem is failing to distinguish between people’s private knowledge and beliefs, and their beliefs once they have incorporated the social consensus, producing what is sometimes called an information cascade. Improved terminology might help: you could say, I pre-consensually notice that the building we are in seems to be on fire (using Peter Turney’s word), etc. This demonstrates an advantage of moving these decisions from the subconscious to the conscious level, if we can manage to do so.

  • Caledonian

    Basically he is calling for us to become more aware of our internal inconsistency, and to bring it under conscious control. This seems to me to be a central element of the Overcoming Bias program.

    No, he’s calling for us to adopt a new method that is less accurate but satisfies his own biases.

    That seems to me to be a central element of the Overcoming Bias program.

  • Overcoming Laziness

    “for a way that society could aim to retain the advantages of our present diversity of views, even while individuals accept the social consensus on factual matters.”

    What does it mean to have a consensus on a fact?

    In other words, you want people to conform despite what they believe.

  • Laurence Topliffe

    Future of the world: do search for “yogic flying.”