Category Archives: Standard Biases

Pretending To Be What You Are

Which is harder: pretending to be what you are, or pretending to be what you are not?   For example, imagine you are a news reporter, and want to, via your style and manners, convince typical folks that you are a) a reporter, or b) a stuntman.  Which task would be easier?   Which task would be easier for the stuntman?  We could ask such questions about not just reporters and stuntmen, but about a wide range of other roles.

The way to convince the public that you are an X is to act the way the public thinks that X folks act.  And the more vivid an image X folks have in the public mind, and the fewer real X the public know in person, the more the way X folks are will diverge from how the public thinks they are.  And so the more work it would be for X folks to convince the public, via their manner and style, that they are in fact X.

So while it is probably easier for a shoe salesman to convince folks that they sell shoes than that they are a private investigator, I'm guessing that it is harder for a P.I. to convince folks they are a P.I. than that they sell shoes. 


Telephone Game With Functions


In the old telephone game each person would pass on a phrase to the next person in the chain; the final phrase might little resemble the first.  An interesting variation appeared in the Phil. Trans. Royal Society last November:

[Figure from the paper: the function-learning chains, one per row]

Here each row is a chain of people passing along a function relating X to Y.  Each person first guesses and is corrected on 50 (X,Y) cases, then just guesses on 100 more cases.  Each person's final guesses become the data for the next person in the chain.  The final relations are all basically lines, 7/8 with a positive slope, 1/8 with a negative slope.
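The dynamic is easy to simulate.  Below is a minimal sketch, not the authors' protocol: the learner model, its bias weight, the noise level, the chain length, and the reuse of one X grid (rather than 50 training plus 100 test cases) are all illustrative assumptions.  Each simulated participant partly reproduces the data it was trained on, partly regresses toward its prior expectation of a straight line, and its noisy guesses become the next participant's training data.

```python
import numpy as np

rng = np.random.default_rng(0)

def learner(train_x, train_y, test_x, bias_weight=0.3, noise_sd=0.05):
    """One simulated participant: a mix of memory for the training data
    and a prior that X-Y relations are straight lines, plus noise.
    bias_weight and noise_sd are made-up knobs, not estimates."""
    slope, intercept = np.polyfit(train_x, train_y, 1)          # prior: best-fit line
    prior = slope * test_x + intercept
    order = np.argsort(train_x)
    memory = np.interp(test_x, train_x[order], train_y[order])  # recall of the data
    guesses = (1 - bias_weight) * memory + bias_weight * prior
    return guesses + rng.normal(0.0, noise_sd, size=test_x.shape)

# Start the chain with a clearly non-linear relation.
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x)

for _ in range(8):
    y = learner(x, y, x)   # this person's guesses are the next person's data

# After a few links the chain has collapsed onto a line.
print(np.round(np.polyfit(x, y, 1), 2))       # fitted (slope, intercept)
print(np.round(np.corrcoef(x, y)[0, 1], 2))   # correlation near +/-1, i.e. a line
```

Whatever shape the chain starts with, a few links are enough for the learners' shared prior to dominate.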

The lesson?  When we are mainly rewarded for predicting what others will say on a topic, rather than predicting a more basic reality, our answers become dominated by typical prior expectations; reality has little influence.  HT to Jef Allbright. More from that paper:

Continue reading "Telephone Game With Functions" »


Rational Paranoia

The Post reminds us paranoia can be quite rational:

"People from North Korea are very paranoid," said Kim Heekyung, a clinical psychologist at Hanowan [in South Korea] who supervises group therapy for defectors.  Paranoia, she added, is a rational response to reality in North Korea.

A new U.N. human rights report describes North Korea as a place where ordinary people "live in fear and are pressed to inform on each other. The state practices extensive surveillance of its inhabitants. . . . Authorities have bred a culture of pervasive mistrust."

When defectors arrive at Hanowan, they whisper. They are reluctant to disclose their names or dates of birth. They question the motives of people who want to help them. They say South Koreans look down on them. On field trips from Hanowan to get their first checking accounts, they find bank tellers to be terrifying. …

"Paranoia in North Korea helped people survive, but here in South Korea, it is an obstacle to assimilation," Kim said. "Many defectors are scared to do anything."

Our problem isn't a capacity for paranoia, but misreading clues about when to invoke that capacity.   We say someone has a mental problem if they are more paranoid than we think makes sense in our society.  But of course personal circumstances will vary, so we should beware of overconfident paternalism in judging when others are excessively paranoid.

Among North Korean defectors the opposite mental problem of insufficient paranoia is probably more common.  Alas we don't see many of those folks for obvious reasons.


Incentives, Allies Cut Bias

This paper reports the results of a series of experiments designed to test whether and to what extent individuals succumb to the conjunction fallacy. Using an experimental design of Kahneman and Tversky (1983), it finds that given mild incentives, the proportion of individuals who violate the conjunction principle is significantly lower than that reported by Kahneman and Tversky. Moreover, when subjects are allowed to consult with other subjects, these proportions fall dramatically, particularly when the size of the group rises from two to three. These findings cast serious doubts about the importance and robustness of such violations for the understanding of real-life economic decisions.

More here.  Hat tip to Dan Houser.
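For readers who don't have the terms handy: the conjunction principle violated in these experiments is simply that a conjunction can never be more probable than either of its conjuncts.  A minimal sketch, with event labels echoing the classic Linda example and a wholly made-up joint distribution:

```python
# A = "is a bank teller", B = "is an active feminist"; the numbers are arbitrary.
joint = {
    (True,  True):  0.05,
    (True,  False): 0.05,
    (False, True):  0.40,
    (False, False): 0.50,
}

p_A       = sum(p for (a, b), p in joint.items() if a)   # P(A) = 0.10
p_A_and_B = joint[(True, True)]                          # P(A and B) = 0.05

# Conjunction principle: P(A and B) <= P(A) holds for any joint distribution.
assert p_A_and_B <= p_A

# The fallacy is ranking the conjunction as *more* probable than A alone,
# typically because the conjunction sounds more representative of the person.
```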


Write Your Hypothetical Apostasy

Let's say you have been promoting some view (on some complex or fraught topic – e.g. politics, religion; or any "cause" or "-ism") for some time.  When somebody criticizes this view, you spring to its defense.  You find that you can easily refute most objections, and this increases your confidence.  The view might originally have represented your best understanding of the topic.  Subsequently you have gained more evidence, experience, and insight; yet the original view is never seriously reconsidered.  You tell yourself that you remain objective and open-minded, but in fact your brain has stopped looking and listening for alternatives.

Here is a debiasing technique one might try: writing a hypothetical apostasy.  Remind yourself before you start that unless you later choose to do so, you will never have to show this text to anyone.

Imagine, if you will, that the world's destruction is at stake and the only way to save it is for you to write a one-pager that convinces a jury that your old cherished view is mistaken or at least seriously incomplete.  The more inadequate the jury thinks your old cherished view is, the greater the chances that the world is saved.  The catch is that the jury consists of earlier stages of yourself (such as yourself as you were one year ago).  Moreover, the jury believes that you have been bribed to write your apostasy; so any assurances of the form "trust me, I am older and know better" will be ineffective.  Your only hope of saving the world is by writing an apostasy that will make the jury recognize how flawed/partial/shallow/juvenile/crude/irresponsible/incomplete and generally inadequate your old cherished view is.

(If anybody tries this, feel free to comment below on whether you found the exercise fruitful or not – but no need to state which specific view you were considering or how it changed.)


The surprising power of rote cognition

Even if you're familiar with the ideas that are presented on this blog, it can be surprising just how strong the forces of habit and rote cognition and behavior can be.

One of the schools of cognitive psychology that addresses biases describes "system 1" and "system 2" thinking, where "system 1" is everyday automatic processing, deciding by intuition, relying on heuristics, and totally filled with biases, and "system 2" is thoughtful and careful consideration, logical and methodical. But this seems inadequate, because we can slip into automatic cognitive patterns even when we are consciously trying to be careful.

A few examples from personal experience below the fold…

Continue reading "The surprising power of rote cognition" »


Why We Like Middle Options, Small Menus

Many say that consumers are biased to prefer the middle of three options, and to buy less when offered more options.  In the latest American Economic Review, Emir Kamenica shows these need not be biases:

Numerous studies demonstrate that seemingly irrelevant factors influence people's decisions. … when three alternatives are available, the middle alternative is chosen more often than when it is paired with only one other option. … In choice overload experiments, customers are less likely to make a purchase if more products are added to the choice set. …

In this paper, I develop a model where uninformed consumers learn payoff-relevant information by observing what goods are available. The tendency to select the middle option thus naturally arises when there are consumers who are unsure which option is best for them, but know their tastes are middlebrow. Choice overload comes as no surprise if excessive product lines reduce consumers' information about which varieties are likely to suit them. …

Continue reading "Why We Like Middle Options, Small Menus" »


Friendship is Relative

Violet: I like him.
Mary: You like every boy.
Violet: What's wrong with that?
                     It's a Wonderful Life

In places like Sweden, folks are more reserved and less "friendly" than in the U.S.   When reserved and friendly cultures meet, the reserved folks often say they were initially fooled into thinking others liked them in particular.  It took time to realize that their acting "friendly" did not actually indicate that they were more likely to end up being friends in deeper ways.  Eventually they learned to gauge how much foreigners from that friendly culture like them by comparing how those foreigners treat them, relative to how they treat others.  Friendliness, as a signal of deeper interest and loyalty, is relative.

The movie quote above describes a common insight, that some people are "too easy" as friends.  But salesmen, politicians, etc. seem to usually act extra friendly to everyone; do we discount them enough for their being too easily "friendly"?


Experts Are For Certainty

Chris Dillow:

It's a bad day for experts. The Times complains that economic forecasters are as blind as ancient soothsayers, whilst proof that Colin Stagg was innocent discredits Paul Britton's expertise as a forensic psychologist.  To point out that experts are wrong, however, is to misunderstand the purpose of them. Their function is not to provide knowledge, and still less clear thinking. Instead, it is to provide certainty. People hate dissonance, doubt and uncertainty. Experts help dispel these.

So, Paul Britton's function was to tell the police that they had the right man, whilst economic forecasters' job is to provide an impression that the future is knowable; no-one wants to hear about standard errors, parameter uncertainty or the Lucas critique.  What's so pernicious here, though, is that people have ways of achieving an illusory certainty anyway. … 1. The confirmation bias. … 2. The halo effect. … 3. Groupthink. … 4. Ego-involvement. Admitting that we are wrong … [is] a sign that we are not the infallible, uber-competent professionals we think. We'll do anything to squirm out of facing this. …

And herein lies the purpose of experts. It's to reinforce these mechanisms, to help people avoid the uncomfortable facts that the world is uncertain, that mistakes are inevitable, and that we are not as in control of things as we think. Blaming experts for being wrong is like complaining that the economy is not yellow. It's a category error so howling as to be nonsensical.

OK, maybe Chris exaggerates; we do sometimes want experts more for info than certainty.  How can we tell when?  And how much does this contribute to resistance to prediction markets that give honest certainty estimates?  Hat tip Mark Thoma via Tyler Cowen.


Distrusting Drama

Imagine someone made an unlikely claim to you, i.e., a claim to which you would have assigned a low probability.  For many kinds of unlikely claims, such as that your mother was just in a car accident, you usually just believe them.  But if the claim was suspiciously dramatic, you may well suspect they had fallen prey to common human biases toward dramatic claims.  Here are four reasons for such distrust:

Incentives If a stockbroker said you should buy a certain stock because it will double in a month, you might suspect he was just lying because he gets paid a commission on each trade.  You have similar incentive reasons to suspect emails from Nigerian diplomats, and I’ll-love-you-forever promises from would-be seducers.  This doubt might be overcome for those who show clear enough evidence, or who show they actually had little to gain.

Craziness If someone told you they were abducted by aliens, talked with God, or saw a chair levitate, you might suspect a serious mental malfunction, whereby he just could not reliably see or remember what he saw.  This doubt might be overcome if he showed you reliable sight and memory, and that he was not then in some altered more susceptible state of mind (e.g., trance).  Adding more similarly qualified observers would help too.

Continue reading "Distrusting Drama" »
