11 Comments

Cognitive load also makes people prone to all sorts of attribution errors and anchoring (see Gilbert 1989, 90, etc.). It is probably a joke, but that basically means "bring it on, System 1!"


I immediately think of "stress positions" and other features of interrogation. The purpose in interrogation is to accurately extract information that the possessor doesn't want to give up. Putting this person in a situation where his mind is highly occupied with other things (dealing with fear of drowning, falling, and so on) presumably makes him a LOT less clever at telling lies that will mask the truth.

If the mind is busy doing other stuff, it doesn't have bandwidth to lie. Who knew.


I think your inference is correct, and its result veridical. If the problem in making a judgment is (mainly) one of overcoming self-serving bias, then you should make it quickly, under high cognitive load. Many problems, of course, require overcoming obstacles besides self-interest. If the problem itself imposes a high cognitive load, you won't benefit from making the judgment under conditions of high cognitive load. This seems to be the point that many commenters are making, but it doesn't vitiate the point about overcoming self-serving bias as such.


This is odd, as it seems to either contradict this study or suggest the relationship is more complex. The link suggests that cognitive load does not change the proportion of utilitarian vs. non-utilitarian solutions to trolley-type problems, but it does make utilitarian responses come out more slowly.

So it looks like, if both studies are accurate, cognitive load reduces your self-interest but doesn't actually help you solve trolley problems. The proposed solution may not be as general as thought.


Kudos on digging that stuff up.


People under cognitive load make more affect-based decisions (like choosing cake over fruit) and are more vulnerable to suggestion from advertising even after the cognitive load is removed.

The salience bias seems to be amplified by cognitive load.


Absence of "the usual bias towards self" is not the same thing as "honesty". It could simply be another bias.


Coincidentally enough, just now reading Pinker's new book _The Better Angels of Our Nature_, I ran into the experiment (I think):

> A pair of social psychologists, Piercarlo Valdesolo and David DeSteno, have devised an ingenious experiment that catches people in the act of true, dual-book self-deception. They asked the participants to cooperate with them in planning and evaluating a study in which half of them would get a pleasant and easy task, namely looking through photographs for ten minutes, and half would get a tedious and difficult one, namely solving math problems for forty-five minutes. They told the participants that they were being run in pairs, but that the experimenters had not yet settled on the best way to decide who got which task. So they allowed each participant to choose one of two methods to decide who would get the pleasant task and who would get the unpleasant one. The participants could just choose the easy task for themselves, or they could use a random number generator to decide who got which. Human selfishness being what it is, almost everyone kept the pleasant task for themselves. Later they were given an anonymous questionnaire to evaluate the experiment, which unobtrusively slipped in a question about whether the participants thought that their decision had been fair. Human hypocrisy being what it is, most of them said it was. Then the experimenters described the selfish choice to another group of participants and asked them how fairly the selfish subject acted. Not surprisingly, they didn't think it was fair at all. The difference between the way people judge other people's behavior and the way they judge their own behavior is a classic instance of a self-serving bias.
>
> But now comes the key question. Did the self-servers really, deep down, believe that they were acting fairly? Or did the conscious spin doctor in their brains just say that, while the unconscious reality-checker registered the truth? To find out, the psychologists tied up the conscious mind by forcing a group of participants to keep seven digits in memory while they evaluated the experiment, including the judgment about whether they (or others) had acted fairly. With the conscious mind distracted, the terrible truth came out: the participants judged themselves as harshly as they judged other people. This vindicates Trivers's theory that the truth was in there all along.
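For readers who like the design laid out explicitly, here is a minimal toy simulation of the two-condition logic described above. All magnitudes and the rating scale are hypothetical placeholders of my own, not the study's actual data; the only thing the sketch encodes is the claimed pattern (self-ratings are inflated only when the conscious mind is free to spin):

```python
import random

def fairness_rating(judging_self, under_load, rng):
    """Toy model of the Valdesolo & DeSteno design.

    Baseline harshness is the same for everyone; a self-serving
    bonus inflates self-ratings only when no digit-rehearsal
    load ties up the conscious mind. Numbers are hypothetical.
    """
    baseline = 3.0            # harsh "that wasn't fair" rating on an assumed 1-7 scale
    self_serving_bonus = 2.0  # hypothetical inflation when unloaded
    noise = rng.gauss(0, 0.3)
    if judging_self and not under_load:
        return baseline + self_serving_bonus + noise
    return baseline + noise

def mean(xs):
    return sum(xs) / len(xs)

rng = random.Random(0)
n = 200

self_no_load = mean([fairness_rating(True, False, rng) for _ in range(n)])
self_load    = mean([fairness_rating(True, True, rng) for _ in range(n)])
other        = mean([fairness_rating(False, False, rng) for _ in range(n)])

# Without load, self-ratings are inflated; under load they
# converge on the harsh ratings given to others.
print(self_no_load, self_load, other)
```

Run as-is, the unloaded self-ratings come out roughly two points above the other two means, which sit together near baseline, mirroring the pattern the quote describes.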


Interesting, but I can't help but think that distraction introduces other biases and stupidities. The architectural implications are much more interesting.


"This suggests an interesting way to avoid bias": a specific type of bias, that is, until this is replicated for other known biases. If that hasn't already been done, somebody do it!


If the bias is "seeing oneself as acting more fairly", what difference does it possibly make? None at all to action. This seems in line with making decisions and justifying them rationally after the fact; if there is anything to be avoided, it is asking people why they do what they do.
