Max Planck: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” We usually accuse religious or political groups of this behavior, but there may well be a universal tendency toward this kind of perseverance. Maybe Dennett especially.

I know I'm late, but Daniel Dennett uses almost the exact same wording as this:

"Someone willing to embrace unreasonable arguments for his group shows a willingness to continue supporting them no matter which way the argument winds blow."

... in his book "Breaking the Spell: Religion as a Natural Phenomenon," which I think anyone reading this site would enjoy if they haven't read it already.

The discussion attempts to answer "Does the essential irrationality of the arguments have a function in building group solidarity?", but I must have missed the link where apologetics (Christian, Bayesian, Mathematician) were shown to be essentially irrational.

How about the simple fact that offering a reason makes a statement or request more likely to be accepted, regardless of what the reason is? See the photocopier experiment, where people complied 93% of the time when asked:

"Can I use the photocopier because I need to make copies?"

Compared to 94% for:

"Can I use the photocopier because I'm in a rush?"

or 60% for:

"Can I use the photocopier?"

Doesn't experimental economics handle this idea with information cascades, which lead individuals to ignore their own private information and follow the herd? If the herd is wrong, loyalties will be weak and the cascade dies out. In the case of trying to prove a negative, "wrongness" is more difficult to establish (as you outline above), and loyalties might be stronger or more resistant to challenge. Reiteration of the argument by respected agents might reinforce the herd behavior, and again private doubts are ignored, say in the case of religious loyalty or loyalty to a cause like "save the earth." Isn't this also a network problem, as in six degrees of separation? The relative strength of the connections, or closeness to the center, might explain degrees of loyalty.
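
For anyone who hasn't seen the model, here is a minimal sketch of the standard sequential information-cascade setup (in the spirit of Bikhchandani, Hirshleifer and Welch); the function name and parameters are my own illustration, not anything from the post. It shows the mechanism described above: once the publicly revealed signals lean two or more to one side, every later agent rationally ignores their own private signal and follows the herd, even when the herd is wrong.

```python
# Toy simulation of a sequential information cascade (illustrative only).
import random

def run_cascade(n_agents=50, p=0.7, true_state=+1, seed=None):
    """Agents act in order; each gets a private signal that matches the true
    state with probability p and observes all earlier actions."""
    rng = random.Random(seed)
    public_count = 0   # net up-minus-down signals revealed by pre-cascade actions
    actions = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < p else -true_state
        if public_count >= 2:        # "up" cascade: rational to ignore own signal
            action = +1
        elif public_count <= -2:     # "down" cascade: rational to ignore own signal
            action = -1
        else:
            # Outside a cascade (breaking ties toward one's own signal), the
            # Bayesian-optimal action is simply to follow the private signal,
            # so the action reveals that signal to everyone who comes later.
            action = signal
            public_count += signal
        actions.append(action)
    return actions

# Even with fairly accurate private signals (p = 0.7), a noticeable fraction of
# runs lock into the wrong action, because only the first few signals are ever
# revealed before everyone starts herding.
wrong_runs = sum(run_cascade(seed=s)[-1] != +1 for s in range(1000))
print(f"runs ending on the wrong action: {wrong_runs} / 1000")
```

The relevant feature is the one noted above: once the cascade forms, each new conformist action reveals no new information, so reiteration by respected agents reinforces the herd without ever surfacing private doubts.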

In a real-world case of apologetics, theologians shore up Christian doctrine in their attacks on Rick Warren, the popular evangelical minister. Since Warren deviates from accepted Christian doctrine in his church, he is fertile ground for Christian apologetics. But Warren's ideas are an information cascade that the theologians can't effectively counter. His ideas seem to be accepted by many Christians of all denominations, even though many of Warren's ideas and practices run counter to their own church's liturgical message. So we have an apologetic within an information cascade that engulfs the target of the apologetic but has little real effect. So who is the target of the apologetic? Perhaps it is just other theologians.

There are many reasons one might want to leave a group. If "because rational arguments convinced me" is a better-looking reason than the real reason, you may want to claim that was your reason.

Then why do rational arguments sometimes convince people to leave a group? I suspect your answer will be that these people are now signalling to the 'rational' group instead of the original group by accepting 'rational' arguments. Would it be going too far to say that no one is persuaded by rational arguments, or that no one is rational? Some of us just try to signal to those who think of themselves as rational.

No, no, but in some cases acting crazy can make people more cautious around you, less inclined to harass you, and possibly more likely to give in to you.

It must have been a slow day at the John P. Meier household when he came to this conclusion, "The a priori conviction of such polemics is simple and unshakeable:"

Groups provide for continuity of purpose and as long as people like the ideas of the group, they will remain.

This rhymes with signalling among criminals, where you need to show that you won't betray your companions even when it would be in your interest to do so.

An excellent essay, but I think a better one-line summary would be:

"If you want to be wrong less often, keep the number of beliefs you identify with small."

>So let’s take them seriously when they say they’re going to nuke everyone, and give in to them more readily.

So you want to encourage people like that? The better response to someone who you think will attack you is to do it unto them first. I believe in non-aggression, but I also consider capability and explicit threat to be more than enough justification for a preemptive strike.

Marc Stiegler, in "David's Sling", wrote, "To determine how emotionally involved you are in your position, argue the opposing position as though your life depended on it." (I don't have the novel in front of me, so the wording is probably off).

One thing to be very wary of is the tendency to dismiss particular facts that could undermine your position. It is easy to prove anything if you can cherry-pick the facts. Your position should take into account ALL of the facts available to you; any facts that you set aside or dismiss should be explicitly accounted for.

Ideology is rarely cohesive, so we do have to signal to [the speaker/writer/head/ingroup] that we want to believe, regardless of external contradiction. This is like a vote of confidence in the person/group/idea: a bet that there is some fundamentally compelling insight worth pursuing further (and, most likely, that the topic is interwoven with the speaker's personal history somehow). But it doesn't stop there. Once you join a group, you're more likely to believe its insights and supporting arguments (exposure works as a justificatory mechanism), and to personally identify with the pathos of what's going on. That said, I don't think it is willingness to stay in the face of unreasonable or bad arguments that forms the necessary link to a group; rather, it is the social cohesiveness and the blend between self-identity and group identity, though that doesn't foreclose the possibility of moving from group to group or identifying with more than one group at a time (even if the group you outwardly conform to requires exclusivity).

Pinker articulates that argument on video here:

http://www.youtube.com/watc...

This resonates strongly with Paul Graham's essay on Identity: http://www.paulgraham.com/i.... If you want to be right more often, keep your identity small.
