23 Comments

Future of the world: do a search for "yogic flying."


"for a way that society could aim to retain the advantages of our present diversity of views, even while individuals accept the social consensus on factual matters."

What does it mean to have a consensus on a fact?

In other words, you want people to conform despite what they believe.


Basically he is calling for us to become more aware of our internal inconsistency, and to bring it under conscious control. This seems to me to be a central element of the Overcoming Bias program.

No, he's calling for us to adopt a new method that is less accurate but satisfies his own biases.

That seems to me to be a central element of the Overcoming Bias program.


That's a good point, Unnamed, there are situations where conformity causes problems. I see these as cases where the problem is failing to distinguish between people's private knowledge and beliefs, and their beliefs once they have incorporated the social consensus, producing what is sometimes called an information cascade. Improved terminology might help: you could say, I pre-consensually notice that the building we are in seems to be on fire (using Peter Turney's word), etc. This demonstrates an advantage of moving these decisions from the subconscious to the conscious level, if we can manage to do so.


One other important thing to realize about the Asch studies is that they do provide evidence that people are subject to normative social influence. They show that people have motivations to conform to those around them for non-informational reasons, that most people do so at least on some occasions, and that this can happen even without explicit pressures from the group or close relationships between group members. The (not directly demonstrated) implication is that this kind of conformity is fairly common. This is true, regardless of what you think of the Bayesian argument about what people should be doing in these experiments. In other words, people's tendency to alter their behavior due to social pressures and the possibility that people don't make enough use of information from others are two separate issues, even though they can both be discussed in the context of the Asch studies. The NYT Magazine article dealt with the first of those two issues, and Hal's argument here is about the second.

Both kinds of social influence, information and normative, can have downsides. See, for instance, the studies described here on helping and the bystander effect. In one study by Latane & Darley (1969), participants were filling out a questionnaire in a room, either alone or in a group of 3 (either with 2 confederates or with 2 other real participants). Smoke started seeping into the room from under a (locked) door to a neighboring room, and by the time they finished there was a lot of smoke in the room. 75% of lone participants got up to go tell someone about the smoke. Only 10% of participants who were with 2 inactive confederates did so. And when a group of 3 real participants was in the room together, only 38% of the groups sought help (even though there were three potential helpers).

This was informational social influence. Each person in the group tried to keep their cool. When they glanced at each other for cues about what was going on, they saw how calm the other people were and decided that there wasn't an emergency (as confirmed by interviews after the fact). A group of confused people, wondering whether they were in an emergency and looking towards each other for information, ended up convincing each other that the situation was not an emergency.


Thanks for the thoughtful comments here. Unnamed's points are very good, that things are even worse than I suggested - and thanks for the link to Eliezer's analysis of the Asch experiments, which sheds additional light. I thought it was significant that even Eliezer, who is not exactly a fan of majoritarian reasoning, said he would conform in the experiments. Contrast that to his Initiation Ceremony where he seems to suggest the opposite, although the cues to the majority opinion are more indirect in that case.

Overcoming Laziness asks whether these posts are sincere. The more relevant question is whether the reasoning offered is sound. I hope those who are giving up on the blog are not doing so because the points seem nonsensical. Please consider that at least some commenters do agree with the reasoning, and also keep in mind that we are all subject to a great deal of propaganda about how to think, some of which may not be well grounded. In addition, majoritarian-type reasoning must overcome strong human biases. Brent even suggests that it is outright evil to think this way. If it is any consolation to him, I will point out that Democrats outnumber Republicans in the U.S. by 10% or more, and Independents are twice as likely to lean Democratic. Since he compared Republicans to Nazis I speculate that these facts might make him more likely to recommend majoritarianism to voters!

But Overcoming Laziness' question brings us back to Robin's provocative suggestion for a way that society could aim to retain the advantages of our present diversity of views, even while individuals accept the social consensus on factual matters. He argues that we could present viewpoints that we don't necessarily agree with, vigorously defending them despite our personal reservations. In fact he points out that data shows that we already do this, although the conflict is unconscious. Basically he is calling for us to become more aware of our internal inconsistency, and to bring it under conscious control. This seems to me to be a central element of the Overcoming Bias program.


This summary of Asch's data isn't anything new; it's how those data are generally summarized in intro to psychology classes. If you have an intro psych textbook nearby, check it out. Here's how the experiment is described in James W. Kalat's Introduction to Psychology textbook, which was the first intro psych textbook description of Asch's study that I could find on Google Books:

To Asch's surprise, 37 of the 50 participants conformed to the majority at least once, and 14 conformed on most of the trials. When faced with a unanimous wrong answer by the other group members, the mean participant conformed on 4 of the 12 trials.

In other words, pretty much the same thing as what Hodges & Geyer said. What's different is the interpretation of the meaning of the results. This textbook, following the standard approach, goes on to ask "Why did people conform so readily?" Hodges & Geyer instead say, apparently, that this really isn't all that much conformity. And Hal seems to think that this wasn't nearly enough conformity, since we should trust several sets of eyes more than the single pair residing in our head.

Hal should be even more dismayed than he lets on, since many of the subjects who conformed in Asch's studies indicated in follow-up interviews that they still believed their own eyes, but just went along with the group for social reasons (although a fraction of the conformers did claim to actually believe the group). This was pinned down more rigorously in further research by Asch (described on this very blog, among other places) that tested out a few variations of his original study. For instance, about 2/3 of the conformity disappeared when the subject wrote down his answer instead of saying it aloud. Conformity dropped by even more when one of the confederates gave the right answer, and it also dropped a great deal when a confederate diverged from the group by giving the other wrong answer. Additionally, the amount of conformity with the original design did not increase when the number of confederates rose from 4 to 15.

The distinction that psychologists make is between "informational social influence" (seeking truth) and "normative social influence" (seeking approval/avoiding disapproval), and these additional results imply that most of the conformity in the Asch experiment comes from the latter. Not a pleasing result for someone hoping that the participants would use the information from the crowd to act like good Bayesians.


Caledonian, it is an example of bias to assume that those other individuals have not experienced the same pressures to form accurate beliefs that you have. Overconfidence in our own analytic capacity is one of the best-documented forms of cognitive bias.

An unbiased participant would presume, until receiving evidence to the contrary, that the other participants are at the median in terms of their reporting accuracy. Furthermore, an unbiased participant would be very skeptical of his own tendency to think he is a better witness than the others, because he would expect that conclusion to be produced by the operation of the overconfidence bias.
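For what it's worth, the Bayesian calculation behind this point can be made concrete. The sketch below (in Python; the `posterior_you_right` helper is hypothetical, not from any comment above) assumes a flat prior over the two possible answers, equal accuracy for every witness, and conditional independence of their reports - idealizations the Asch setup arguably violates, since the confederates' answers are anything but independent:

```python
def posterior_you_right(p_self, p_others, n_dissenters):
    """Posterior probability that your report is correct, given
    n_dissenters witnesses who all report the opposite answer.
    Assumes a flat 50/50 prior over the two answers and that every
    witness is conditionally independent with the stated accuracy."""
    # Likelihood of the data (you say A, everyone else says B)
    # under each hypothesis about which answer is true:
    like_you_right = p_self * (1 - p_others) ** n_dissenters
    like_you_wrong = (1 - p_self) * p_others ** n_dissenters
    return like_you_right / (like_you_right + like_you_wrong)

print(posterior_you_right(0.9, 0.9, 8))  # roughly 2e-07
```

With everyone at 90% accuracy, eight unanimous dissenters drive the posterior that you are right down to about two in ten million. The live question is whether conditional independence is a reasonable assumption about the group, not whether the arithmetic favors it.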


Adrian, I'd like to see mathematical justification for "squared." In any case, the setup is that the experimenter has ostensibly provided the same instructions to the other observers as they have to you.

I predict an Ernst and Banks style result, where you calibrate the accuracy of your fellow participants and weight their opinion accordingly. Even more so if you put money on the line for being correct. Thirteen trials is not much time to do this though, so the question is whether you start from a prior of distrust or of trust.
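To make that prediction concrete: the Ernst and Banks (2002) result is maximum-likelihood cue combination, where each independent Gaussian cue is weighted in proportion to its inverse variance. Here is a minimal sketch; the `fuse` helper and the numbers are illustrative, not taken from the paper:

```python
def fuse(estimates, variances):
    """Reliability-weighted cue combination in the Ernst & Banks style:
    each cue gets weight proportional to 1/variance, which is the
    maximum-likelihood fusion rule for independent Gaussian cues."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    combined_variance = 1.0 / total  # fused estimate beats either cue alone
    return mean, combined_variance

# Your own reading of the line length vs. the group's average report,
# with made-up variances (you trust your eyes 4x more than the group):
mean, var = fuse([10.0, 12.0], [1.0, 4.0])
print(mean, var)  # 10.4 0.8
```

The point of the prediction is that the weights are learned: a few trials of feedback (or money on the line) would let you calibrate the group's variance and shift weight accordingly.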


Adrian, those eight strangers have likely been tested as much as (a random) you; it is just that you have not tested them.

Example of bias: presumption that testing and selection have operated upon entities without evidence that this has been the case.

What good is it to presume that those people have been tested, if you don't know what the results are?


Adrian, those eight strangers have likely been tested as much as (a random) you; it is just that you have not tested them.

Brent, if the choice is between loyalty to my head or to the truth, I choose truth.


oh please.

If I could name one thing that I want my kids to achieve out of all their schooling, it would be the ability to think independently.

My entire income is based on that ability.

As a young mechanical engineer in a manufacturing world I'm surrounded by people who are more experienced, more involved, older, and more authoritative than me... so why do they hire me? Because when 12 people have looked at a problem and declared it unsolvable, the expectation is that I will choose to not accept the majority opinion (even when they're standing RIGHT THERE) and decide for myself the solvability of the problem.

Think for yourself.

Of COURSE you should think for yourself.

Your example of the person too drunk to know they're too drunk is not a good example - that person is simply ignoring good advice (good DATA) which is a different thing to refusing to allow the will of others to be the decider of your mind.

It seems to me that the act of accepting the majority opinion, REGARDLESS of whether it's right or wrong - or worse, when your head is telling you that it IS wrong - is an act of self-betrayal. It can be one of the deepest acts of evilness - after all, that's how Nazism became so popular, that's how religions get started, that's how Republicans get elected.


Overcoming Laziness - I've come to something of a similar conclusion (that many of the posts seem deliberately goading), and have gone from daily reading to a couple of times a month.

There is occasional real discussion here, but the vast majority of the time it has the feel of that particular form of debate which I can only describe as mental masturbation.

On this particular topic, the author seems to have skipped over that the evidence acquired from another person has, as an absolute minimum, a squared degree of uncertainty. (You have to perceive their explanations of their perception.) In reality, the uncertainty is even greater, because the number of variables increases by orders of magnitude when you introduce secondhand information. You have to correctly perceive; they have to correctly explain (with subconsiderations of potential motives); they have to correctly perceive. Eight untested strangers don't even begin to compare to the certainties of one's own perceptions, which, after all, have been very heavily tested.

For the correctness of the answer it makes absolutely no sense to go with the consensus view. Eight strangers telling me something of unknown cost to them is a considerably different scenario, after all, than a group of friends telling me something which will have a high cost on the friendship if I don't take them seriously.
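The "squared" claim above can be modeled, though the exact exponent depends on the model. One way to formalize the intuition: treat each hop in the chain (their perception, their report, your perception of the report) as a binary channel that independently flips the answer with probability 1 - p. A report then arrives correct iff an even number of hops flip it, and end-to-end reliability falls below p for any p > 0.5, though not literally to p squared. A sketch, with a hypothetical `chained_accuracy` helper:

```python
def chained_accuracy(p, links):
    """Probability a binary answer survives a chain of `links` noisy
    stages, each of which independently flips it with probability 1 - p.
    The answer arrives correct iff an even number of stages flip it."""
    correct = 1.0
    for _ in range(links):
        correct = correct * p + (1 - correct) * (1 - p)
    return correct

# One link: your own perception. Three links: their perception,
# their report, and your perception of their report.
print(chained_accuracy(0.9, 1), chained_accuracy(0.9, 3))  # 0.9 0.756
```

So secondhand evidence is indeed degraded relative to firsthand evidence, but it is far from worthless, and eight degraded-but-independent channels can still outweigh one clean channel.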


"The key question is, what is the right thing to do here? Should one conform when presented with 8 people denying the evidence of one's own senses? I argue that it is the right thing to do."

You don't seem to differentiate between reexamining one's beliefs and conforming despite one's beliefs; I find this troubling. I initially found this site quite compelling, but over the past few months I have been perplexed by many of the views of the authors. I have a simple question, which, if answered, may help me: are the authors purposely posting what they do not believe, or deliberately withholding information that would be relevant to the posts, in order to promote discussion?


Caledonian, while I tend to agree with your sentiment, I've come around to the idea that I am likely overconfident in my perceptions. Sure, you aren't necessarily going to convince me that black is white even if ten people back you up. But I will reexamine my position. Perhaps look at the black object from different angles, under different conditions. Maybe I'll decide I need to measure the lines. I think it's a mistake to look at conformity as all or nothing. The key is whether other opinions, and not just opinions from those you *know* you should respect, cause you to at least briefly question your own. That is, are other opinions data? I find it absurd to answer no. And I increasingly suspect that we (or at least I) tend to underweight this data when forming our own opinions.

BTW, from a reader's perspective, I have found your comments much more thoughtful and constructive to the discussion.


"The key question is, what is the right thing to do here? Should one conform when presented with 8 people denying the evidence of one's own senses? I argue that it is the right thing to do."

You seem to be writing that there are no costs involved in deciding situationally when to conform or not. In the real world, there are decision costs, and that should probably be factored in. Also, if the default human instinct is often to nonconform, there may be greater costs to overcome (for example, in terms of willpower) in deciding to conform. Before deciding to conform, it may make sense to weigh those costs, or more efficiently, to have some simple rules about when to conform without unique deliberation, when to nonconform without unique deliberation, and when to engage in such deliberation (and for how long).

I think you hint at that when you write that if a group of friends tell you they think you have a substance abuse problem, you (at the least) shouldn't nonconform without unique deliberation.
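The three-way policy suggested above can be toy-modeled as an expected-value comparison: pay the cost of deliberating only when the stakes, multiplied by the error probability of just defaulting to the more likely side, exceed that cost. Everything here (the `choose_policy` helper, the thresholds, the assumption that deliberation resolves the question perfectly) is made up for illustration:

```python
def choose_policy(p_group_right, stakes, deliberation_cost):
    """Toy three-way rule: conform, nonconform, or stop and deliberate.
    Defaulting to the more probable side errs with probability
    min(p, 1-p); deliberation (assumed perfect) recovers that loss."""
    expected_gain_from_thinking = stakes * min(p_group_right,
                                               1 - p_group_right)
    if expected_gain_from_thinking > deliberation_cost:
        return "deliberate"
    # Cheap defaults: side with whichever party is more likely right.
    return "conform" if p_group_right > 0.5 else "nonconform"

print(choose_policy(0.9, 1.0, 0.5))   # low stakes, group probably right
print(choose_policy(0.5, 10.0, 1.0))  # high stakes, genuine uncertainty
```

On this toy model, the substance-abuse intervention lands in the "deliberate" branch because the stakes are high and the friends' joint testimony makes the outcome genuinely uncertain, which matches the intuition in the post.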
