Remorseless Power

Making people feel powerful makes them good at lying:

Dana Carney divided research subjects into two groups: bosses and employees. Bosses got larger offices and more power; they were asked, for instance, to assign employees’ salaries. Half of all subjects were instructed by a computer to steal a $100 bill. If they could convince an interviewer they hadn’t taken it, they could keep it. The other subjects were questioned as well. In the interviews, lying bosses displayed fewer involuntary signs of dishonesty and stress. … We measured subjects on five variables that indicate lying—involuntary shoulder shrugs, accelerated speech, the level of the stress hormone cortisol in their saliva, cognitive impairment, and emotional distress. Only the low-power liars could be “seen” as lying; the readings for the liars with power were essentially the same as those for truth tellers on all five variables. (more)

I’d always heard that the reason humans can’t lie well is that our minds are leaky, sending signals about our anxiety every which way. But this result suggests not; it suggests we are quite capable of lying well, but are designed to not always use that full capacity. So now I’ll guess the same holds for blushing: we often reveal feelings we think we want to hide via blushing, but are quite capable of not doing so if we feel powerful enough.

I’ve been posting a lot lately on ways we seem to give the powerful a pass, not holding them to the same high standards we hold others, perhaps out of deference or fear of retribution. So now I’ll guess that we blush and leak lies out of fear of a larger punishment if we are caught: for the non-powerful, the punishment for a norm violation is far less when we give off clues that we feel guilty about the violation than when we don’t. The powerful apparently needn’t fear such extra punishment for remorseless lies, though they do still fear being caught lying. Why?

Perhaps for our homo hypocritus ancestors, the implicit elites in a band were better able to read such clues, either via better raw abilities or because power frees one to use such abilities (perhaps by reducing fear of retribution).  So by lying but giving off subtle clues about your lies you might have been saying to the elites, “I’m only lying to these other fools, not to you.”  When elites caught non-elites in well-hidden remorseless lies, they made sure to punish them much more severely.

FYI, one can also make folks feel powerful just by making their body take up more space:

You know how peacocks spread their feathers? What they’re doing is taking up more space, an assertion of power that’s common in animals. Cobras rear; birds spread their wings. Humans do it, too. Think of the CEO with his feet up on the desk, leaning back in his chair, hands clasped behind his head with elbows out … We found that people in power poses show higher testosterone levels and lower cortisol levels. They feel more powerful and less stressed out, just because they take up more space. When prompted, they take more risks than people in subordinate poses. (more)

Added 3p: The details that give away lies are much less reliably communicated to distant others.  You could get folks to clearly testify that someone had said certain words, but this would be much harder to do regarding how much speech was sped-up, or how unusual were any shoulder shrugs.  So to enforce an added punishment based on the presence of these added clues, one needs enough discretion to be able to act on one’s own judgement, rather than on what one can prove to outsiders.  Perhaps powerful folks can better prevent those they hurt with such lies from acting with such local discretion.

Added 7p: Perhaps this is like my suggestion that we “Choke to Submit”; perhaps lying with relaxed confidence is seen as a bid for high status, which if discovered will be squashed vigorously on those who can’t support such status.

  • Buck Farmer

    Awesome find, Robin.

    For your explanation to work (where elites can recognize each other lying), the underclass would have to be willingly or unconsciously ignoring the signals that the overclass are lying.

    Is this just an example of the underclass signalling to the overclass that they are harmless and not worth harming? i.e. “I’ll let you lie about small things, because I know you could hurt me if I held you to higher standards, and I am no threat to you.”

    If so then why lie at all? I think there are good social reasons for it (the underclass collectively is stronger than the overclass at times), but I suspect teasing out what drives the equilibrium point will be hard. Do societies with more inequality exhibit more blatant bad behavior and less lying by elites?

    Could these famous “tells” i.e. involuntary shoulder shrugs, rapid speech, etc. just be a way for the underclass to signal to the overclass that though they are lying they are harmless and not worth harming?

    Is there a difference between big and small lies?

  • http://www.angryblog.org Brian Moore

    “Is this just an example of the underclass signalling to the overclass that they are harmless and not worth harming? i.e. “I’ll let you lie about small things, because I know you could hurt me if I held you to higher standards, and am no threat to you,””

    I think that’s a very powerful motivation. But I think it probably extends beyond “small lies” — because think about it: letting your leader get away with a small lie certainly shows some loyalty. But think if you let him get away with a big lie? Doesn’t that show your loyalty even more?

    I vaguely remember something in the Daniel Dennett book about religion (Breaking the Spell?) where he said the reason so many religions had such hard-to-follow arcane rules was that they were better signals of group loyalty. No one would be impressed if you adhered to a rule that said “you shall eat food each day,” but they are impressed if you adhere to one with specific rules about the presentation, cooking, and timing of such meals.

    The more outrageous the requirement or lie that you accept, the ever more loyal you appear to be. You are essentially saying “if I’m crazy enough to do this, what won’t I do to support our side?”

    • Buck Farmer

      Presumably, the countervailing force is the harm or inconvenience that is caused to the underclass by permitting the overclass to lie.

      To build on your arcane rules example, it demonstrates loyalty to not eat meat on Fridays for group solidarity, and it demonstrates even greater loyalty to, say, eat only the flesh of animals that die of natural causes. The former is an inconvenience, but the latter is a serious nutritional risk.

      Is it any wonder that some of the most extreme signals of group solidarity have only limited adoption?

      In the lying case, we might show deference to lies about say personal sexual conduct, but would we show the same deference to lies about a chemical weapon that had been secretly tested on the populace?

      The latter demonstrates much greater loyalty than the former, but places the underclass at much more real risk, making that loyalty less valuable.

      • http://www.angryblog.org Brian Moore

        You’re definitely right, and I think that’s what people weigh against subservience — is this “too big a lie”?

        In your “chemical weapon” case I think perhaps what truly determines if someone will “call out” the leader is which side you’re on. If that leader is part of an opposing group, you will certainly do so. But if the leader you should call out is on your side, I think very few people will — case in point: for every whistleblower in government or industry there are hundreds who knew the same information but did nothing. People have indeed kept silent about weapons testing on ignorant civilians in our own society.

        The ideal sort of “loyalty test” is one that has no larger-group costs (or even benefits, like not eating potentially disease-causing pork) but has high (though not permanent or crippling) costs to the individual. Also important is that it be visible and frequently verifiable. I think this describes many such rules in religions, such as the kosher/halal rules.

  • Lo Statuz

    Another argument for insider trading.

  • Douglas Knight

    Isn’t it simpler if you drop the remorse hypothesis and merely posit that the powerful don’t fear getting caught?

    • Buck Farmer

      In the experimental setup, though, there’s no rational fear of getting caught beyond fear of the loss of $100. Since being the leader or subordinate in this case doesn’t affect your ability to retain the $100, there should be no difference if the only mechanism were rational fear of getting caught (and losing $100).

      Are you positing that fear of getting caught behaves differently? Can you elaborate?

    • http://hanson.gmu.edu Robin Hanson

      But why would fear trigger us to make our lies visible? If you think it is fear itself that is visible, then why do we leak fear more than other emotions?

      • http://thelandquestion.blogspot.com/ Quiet Griot

        Maybe because our brains tailor our emotional reactions to things depending on where we happen to be in the hierarchy? If you are lower status, your brain tells you to be afraid of conflict, ambition, lying, etc., to reduce your chance of getting killed by challenging higher-status people. But if the high-status people in our group get killed off, our brains tell us to stop being fearful and start being more decisive, confident in our ability to lie, etc. I guess I wonder if the increased comfort with lying in the artificially status-enhanced group is just part of a broader set of alpha-type traits that our brains signal us to start displaying when circumstances put us in a high-status position.

  • http://hanson.gmu.edu Robin Hanson

    I just added to the post.

  • Galina Davidoff

    Could this be not about lying and fear, but about doing everything better, once one is given a bit of a boost? The “bosses” in the experiment may have become better at a number of things, not just lying, once they were put into this category. Being named a boss is, after all, an assurance of support from the majority of people around you, of respect, and even of a less critical attitude toward you — nobody picks on a boss — while being named a subordinate is an assurance that one will be judged and perhaps picked on. Would it be surprising to learn that ‘bosses’ became better at organizational and complex tasks and perhaps worse at some manual tasks?
