Beware Inward Apologetics

Beware oft-rehearsed but rarely-performed arguments on why-our-group-is-right:

Recently I came across a quotation that expressed, with wonderful clarity, something that I kind of half-knew but had not articulated so well to myself.  The historian John P. Meier …:

Despite the theoretical purpose of addressing and confuting one’s adversaries outside, most religious apologetics and polemics are directed inward.  Their real function is to give a sense of assurance and reinforcement to the group producing the polemics.  Most apologetics and polemics are thus an attempt to shore up group solidarity and conviction within a community that feels insecure and under attack.  The a priori conviction of such polemics is simple and unshakeable: “We are right and they are wrong, and now we will think up some reasons to prove that they are wrong.”

… What I am curious about is why this would be an effective means of building group solidarity.  Does the essential irrationality of the arguments have a function in building group solidarity?  Or do group alliances matter only when the subject—politics, religion, scholarship—is so difficult that clear conclusions are impossible to establish?

That is Brian Malley; hat tip to Stan Tsirulnikov.

We can signal loyalty to a group by showing our confidence in its beliefs.  And our ability to offer many reasonable arguments for its beliefs suggests such confidence.  But sometimes we can show even stronger loyalty by showing a willingness to embrace unreasonable arguments for our group’s beliefs.  Someone who supports a group because he thinks it has reasonable supporting arguments might well desert that group should he find better arguments against it.  Someone willing to embrace unreasonable arguments for his group shows a willingness to continue supporting them no matter which way the argument winds blow.

  • Mat

    If it’s inward apologetics, how do you know about it? Can you read my mind? If by inward you mean it’s only shared among the followers of a specific group, don’t you think they know the information is eventually going to leave the group? Perhaps they just want a lag time so that they can move on to another apologetic by the time their old one leaves the confines of the group so that they have counters to outsider responses.

  • http://akinokure.blogspot.com agnostic

    In *How the Mind Works*, Steven Pinker reviews similar ideas for why we fall irrationally, uncontrollably in love. If you were only with someone for rational reasons — youthful skin, warm smile, or good sense of humor (you always have to throw that one in the list) — then you’d dump them in a heartbeat once those traits began to fade.

    But if you’re with someone because you just can’t help it, you’ll stick it out and their investment in the relationship won’t go mostly to waste.

    We love our ideologies for richer or poorer, in sickness and in health.

  • Pingback: Recomendaciones « intelib

  • http://asymptosis.com Steve Roth

    This effect, signaling group loyalty by deploying irrational arguments, would be particularly powerful in groups that put a high value on group loyalty as a moral good.

    Conservatives, for instance, have been shown to have that attribute, liberals much less so.

    http://psycnet.apa.org/index.cfm?fa=buy.optionToBuy&id=2009-05192-002&CFID=3553993&CFTOKEN=21910898

  • Dan

    Sorry this is not very original… Actually it is quite common, we even have an old saying for it when it becomes tedious: “Preaching to the choir”.

    We have a boatload of constant arguing but not much convincing, so why do we do it if the ROI is supposedly so low? Because it isn’t: it has enormous defensive utility…

    Actually, good arguments (and some bad ones) have high persuasive utility; that is why so much is invested in apologetics. You will soon find yourself poached into extinction if you aren’t constantly engaged in defensive arguments. Also, obviously, the ones offering the best (or most loony) apologetics will rise to the top…

  • http://www.thefaithheuristic.com Justin Martyr

    I’m skeptical that this is a real bias or something to “beware.” As Thomas Sowell writes in A Conflict of Visions, “It would be good to be able to say that we should dispense with visions entirely, and deal only with reality. But that may be the most utopian vision of all. Reality is far too complex to be comprehended by any given mind. Visions are like maps that guide us through a tangle of bewildering complexities.”

    Yes, we introduce a bias when we organize our thoughts and arrange them into a coherent picture. But the costs of this bias are lower than the costs of the alternative. The correct response is to engage with intelligent and informed people with whom you disagree.

    • http://williambswift.blogspot.com/ billswift

      Marc Stiegler, in “David’s Sling”, wrote, “To determine how emotionally involved you are in your position, argue the opposing position as though your life depended on it.” (I don’t have the novel in front of me, so the wording is probably off).

      One thing to be very wary of is the tendency to dismiss particular facts that could undermine your position. It is easy to prove anything if you can cherry-pick the facts. Your position should take into account ALL of the facts available to you; any that you don’t use or dismiss should be explicitly accounted for.

  • http://michaelkenny.blogspot.com Mike Kenny

    It seems like showing irrational commitment to a group might send a bad signal to others. “This guy is nuts. I can’t cooperate with him, because he doesn’t see simple facts and follow logic.” Maybe this is good for the group with followers with irrational beliefs, because it throws off enemies who might try to strategize by considering rational choice theory. “Well, if they were rational they’d do this, but they aren’t, so who knows what they are going to do. So let’s take them seriously when they say they’re going to nuke everyone, and give in to them more readily.”

    • http://williambswift.blogspot.com/ billswift

      >So let’s take them seriously when they say they’re going to nuke everyone, and give in to them more readily.

      So you want to encourage people like that? The better response to someone who you think will attack you is to do it unto them first. I believe in non-aggression, but I also consider capability and explicit threat to be more than enough justification for a preemptive strike.

      • http://michaelkenny.blogspot.com Mike Kenny

        No no, but in some cases acting crazy can make people more cautious around you, less inclined to harass you, possibly more likely to give in to you.

  • http://manwhoisthursday.blogspot.com Thursday

    It’s all a delicate balancing act. If you can’t come up with good reasons for your ideology, you will start to bleed your most intelligent and intellectually honest people. Example: Traditional Catholicism has a much easier time retaining its intellectuals (and even attracting some new ones) than Mormonism, despite Mormonism being much more socially cohesive than Catholicism.

  • fburnaby

    This resonates strongly with Paul Graham’s essay on Identity: http://www.paulgraham.com/identity.html. If you want to be right more often, keep your identity small.

    • http://williambswift.blogspot.com/ billswift

      An excellent essay, but I think a better one line summary would be:

      “If you want to be wrong less often, keep the number of beliefs you identify with small.”

  • justin

    Ideology is rarely cohesive, so we do have to signal to [the speaker/writer/head/ingroup] that we want to believe, regardless of external contradiction. This is like a vote of confidence in the person/group/idea that there is some fundamentally compelling insight worth pursuing further (and most likely, that the topic is interwoven with the speaker’s personal history somehow), but it doesn’t stop there. Once you join a group, you’re more likely to believe its insights and supporting arguments (exposure works as a justificatory mechanism), as well as personally identify with the pathos of what’s going on. That being said, I don’t think it is the willingness to stay in the face of unreasonable or bad arguments that forms the necessary link to a group; rather, it is the social cohesiveness and the blend between self- and group-identity, though that doesn’t foreclose all possibility of moving from group to group or identifying with more than one group at a time (even if the group you outwardly conform with requires exclusivity).

  • A dude

    This rhymes with signaling among criminals, where you need to signal that you won’t betray your companions when/if doing so would be in your interest.

  • http://www.rationalmechanism.com richard silliker

    It must have been a slow day at the John P. Meier household when he came to this conclusion: “The a priori conviction of such polemics is simple and unshakeable.”

    Groups provide for continuity of purpose and as long as people like the ideas of the group, they will remain.

  • Charlie

    Then why do rational arguments sometimes convince people to leave a group? I suspect your answer will be that these people are now signalling to the ‘rational’ group instead of the original group by accepting ‘rational’ arguments. Would it be going too far to say that no one is persuaded by rational arguments, or that no one is rational? Some of us just try to signal to those who think of themselves as rational.

    • http://hanson.gmu.edu Robin Hanson

      There are many reasons one might want to leave a group. If “because rational arguments convinced me” is a better-looking reason than the real reason, you may want to claim that was your reason.

  • Steven Hales

    Doesn’t experimental economics handle this idea with information cascades, which lead individuals to ignore their own private information and follow the herd? If the herd is wrong, then loyalties will be weak and the cascade dies out. In the case of trying to prove a negative, “wrongness” is more difficult to establish (as you outline above) and loyalties might be stronger or more resistant to challenges. Reiteration of the argument by respected agents might reinforce the herd behavior so that, again, private doubts are ignored, say in the case of religious loyalty or loyalty to a cause like “save the earth.” Isn’t this also a network problem, as in six degrees of separation? The relative strength of the connections, or closeness to the center, might explain degrees of loyalty.

    In a real-world case of apologetics, theologians shore up Christian doctrine in their attacks on Rick Warren, the popular evangelical minister. Since Warren deviates from accepted Christian doctrine in his church, he is fertile ground for Christian apologetics. But Warren’s ideas are an information cascade that the theologians can’t effectively counter. His ideas seem to be accepted by many Christians of all denominations, even though many of Warren’s ideas and practices run counter to their own church’s liturgical message. So we have an apologetic within an information cascade engulfing the target of the apologetic but having little real effect. So who is the target of the apologetic? Perhaps it is just other theologians.

  • Buck Farmer

    How about the simple fact that offering a reason makes a statement or request more likely to be accepted, regardless of what the reason is? See the photocopier experiment, where people complied 93% of the time when asked:

    “Can I use the photocopier because I need to make copies?”

    Compared to 94% for:

    “Can I use the photocopier because I’m in a rush?”

    or 60% for:

    “Can I use the photocopier?”

  • Charles Twardy

    The discussion attempts to answer “Does the essential irrationality of the arguments have a function in building group solidarity?”, but I must have missed the link where apologetics (Christian, Bayesian, Mathematician) were shown to be essentially irrational.

  • http://www.angryblog.org Brian Moore

    I know I’m late, but Daniel Dennett uses almost the exact same wording as this:

    “Someone willing to embrace unreasonable arguments for his group shows a willingness to continue supporting them no matter which way the argument winds blow.”

    … in his book “Breaking the Spell: Religion as a Natural Phenomenon.” Which I think anyone reading this site would enjoy, if they haven’t read it already.

  • http://assistantvillageidiot.blogspot.com Assistant Village Idiot

    Max Planck: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” We usually accuse religious or political groups of engaging in this behavior, but there may indeed be a universal tendency toward this perseverance. Maybe Dennett especially.

  • Pingback: Robin Hanson on our tendency to argue for our group and against enemy groups « Mike Kenny