Making people feel powerful makes them good at lying: Dana Carney divided research subjects into two groups: bosses and employees. Bosses got larger offices and more power; they were asked, for instance, to assign employees’ salaries. Half of all subjects were instructed by a computer to steal a $100 bill. If they could convince an interviewer they hadn’t taken it, they could keep it. The other subjects were questioned as well. In the interviews, lying bosses displayed fewer involuntary signs of dishonesty and stress. … We measured subjects on five variables that indicate lying—involuntary shoulder shrugs, accelerated speech, the level of the stress hormone cortisol in their saliva, cognitive impairment, and emotional distress. Only the low-power liars could be “seen” as lying; the readings for the liars with power were essentially the same as those for truth tellers on all five variables.
Could this be not about lying and fear, but about doing everything better once one is given a bit of a boost? The "bosses" in the experiment may have become better at a number of things, not just lying, once they were put into this category. Being named a boss is, after all, an assurance of the support of most of the people around you, of respect, and even of a less critical attitude toward you: nobody picks on a boss. Being named a subordinate, by contrast, is an assurance that one will be judged and perhaps picked on. Would it be surprising to learn that the 'bosses' became better at organizational and complex tasks, and perhaps worse at some manual tasks?
You're definitely right, and I think that's what people weigh against subservience -- is this "too big a lie"?
In your "chemical weapon" case I think perhaps what truly determines if someone will "call out" the leader is which side you're on. If that leader is part of an opposing group, you will certainly do so. But if the leader you should call out is on your side, I think very few people will -- case in point: for every whistleblower in government or industry there are hundreds who knew the same information but did nothing. People have indeed kept silent about weapons testing on ignorant civilians in our own society.
The ideal sort of "loyalty test" is one that has no larger-group costs (or that even benefits the group, like not eating potentially disease-causing pork) but has high (though not permanent or crippling) costs to the individual. Also important is that it be visible and frequently verifiable. I think this describes many such rules in many religions, such as the kosher/halal rules.
Maybe because our brains tailor our emotional reactions to things depending on where we happen to be in the hierarchy? If you are lower status, your brain tells you to be afraid of conflict, ambition, lying, etc., to reduce your chances of getting killed by challenging higher-status people. But if the high-status people in our group get killed off, our brains tell us to stop being fearful and start being more decisive, confident in our ability to lie, etc. I guess I wonder if the increased comfort with lying in the artificially status-enhanced group is just part of a broader set of alpha-type traits that our brains signal us to start displaying when circumstances put us in a high-status position.
I just added to the post.
But why would fear trigger us to make our lies visible? If you think it is fear itself that is visible, then why do we leak fear more than other emotions?
In the experimental setup, though, there's no rational fear of getting caught beyond fear of the loss of $100. Since being the leader or the subordinate in this case doesn't affect your ability to retain the $100, there should be no difference if the only mechanism were rational fear of getting caught (and losing the $100).
Are you positing that fear of getting caught behaves differently? Can you elaborate?
Presumably, the countervailing force is the harm or inconvenience that is caused to the underclass by permitting the overclass to lie.
To build on your arcane rules example: it demonstrates loyalty to not eat meat on Fridays for group solidarity, and it demonstrates even greater loyalty to, say, eat only the flesh of animals that die of natural causes. The former is an inconvenience, but the latter is a serious nutritional risk.
Is it any wonder that some of the most extreme signals of group solidarity have only limited adoption?
In the lying case, we might show deference to lies about say personal sexual conduct, but would we show the same deference to lies about a chemical weapon that had been secretly tested on the populace?
The latter demonstrates much greater loyalty than the former, but places the underclass at much more real risk, making that loyalty less valuable.
Isn't it simpler if you drop the remorse hypothesis and merely posit that the powerful don't fear getting caught?
Another argument for insider trading.
"Is this just an example of the underclass signalling to the overclass that they are harmless and not worth harming? i.e. “I’ll let you lie about small things, because I know you could hurt me if I held you to higher standards, and am no threat to you,”"
I think that's a very powerful motivation. But I think it probably extends beyond "small lies" -- because think about it: letting your leader get away with a small lie certainly shows some loyalty. But what if you let him get away with a big lie? Doesn't that show your loyalty even more?
I vaguely remember something in the Daniel Dennett book about religion (Breaking the Spell?) where he said the reason so many religions had such hard-to-follow arcane rules was that they were better signals of group loyalty. No one would be impressed if you adhered to a rule that said "you shall eat food each day," but they are impressed if you adhere to one with specific rules about the presentation, cooking, and timing of such meals.
The more outrageous the requirement or lie that you accept, the more loyal you appear to be. You are essentially saying, "if I'm crazy enough to do this, what won't I do to support our side?"
Awesome find, Robin.
For your explanation to work (where elites can recognize each other lying), the underclass would have to be willingly or unconsciously ignoring the signals that the overclass are lying.
Is this just an example of the underclass signalling to the overclass that they are harmless and not worth harming? i.e. "I'll let you lie about small things, because I know you could hurt me if I held you to higher standards, and I am no threat to you."
If so then why lie at all? I think there are good social reasons for it (the underclass collectively is stronger than the overclass at times), but I suspect teasing out what drives the equilibrium point will be hard. Do societies with more inequality exhibit more blatant bad behavior and less lying by elites?
Could these famous "tells" i.e. involuntary shoulder shrugs, rapid speech, etc. just be a way for the underclass to signal to the overclass that though they are lying they are harmless and not worth harming?
Is there a difference between big and small lies?