Monthly Archives: May 2007

Cheating as Status Symbol

Two years ago I posted on "Tantrums as Status Symbols":

CEOs throw more tantrums than mailboys.  Similarly, movie stars, sports stars, and politicians throw more tantrums than ordinary people in those industries.  Also famous for their tantrums: spoiled young wives, bigshot patriarchs, elite travelers, and toddlers.   … Of course, like a swagger, the signal is not so much the tantrum itself as the fact that someone can get away with it.

A related status indicator is acting like the usual rules don’t apply to you.  From the May 9 New Scientist:

John Trinkaus … One of his specialities is the study of minor acts of dishonesty and antisocial behaviour. In his 25 years of research, one demographical group has come to stand out above all others as being most likely to push boundaries and break rules. These are not disaffected teenagers nor Italian football hooligans. They are women van drivers.

Trinkaus’s important sociological finding is perhaps best illustrated by his extensive work covertly monitoring a supermarket’s "10 items or fewer" checkout over a span of nine years. As many of us may have seen for ourselves, Trinkaus found that some shoppers using this lane had more than 10 items. Some cunningly placed their items in groups of 10 and paid for each group separately. Trinkaus found that about 80 per cent of all the supermarket lane cheats were female van drivers.

This is by no means the only time that these women have been linked with small-scale social transgressions. Trinkaus has also shown that 96 per cent of women van drivers break the speed limit, compared with 86 per cent of male ones, and in one study, a staggering 99 per cent of female van drivers failed to come to a complete stop at a T-junction with a stop sign, compared with 94 per cent of the total.

Female van drivers feel like, and are, the highest status people in their social circle.  I’ll bet they throw a lot of tantrums.

Added:  One report says "Forty-three percent of cell phone users do not turn their phones off at the movies."  If rich men happened to be more guilty here, I doubt folks would be as eager to explain this away, e.g., maybe they are just extra busy.


Underconfident Experts?

In The Myth of the Rational Voter, Bryan Caplan claims people are overconfident and otherwise irrational about their political beliefs:

Voters are worse than ignorant; they are, in a word, irrational – and vote accordingly. …  emotion and ideology – not just the facts or their “processing” – powerfully sway human judgment. Protectionist thinking is hard to uproot because it feels good. When people vote under the influence of false beliefs that feel good, democracy persistently delivers bad policies. … irrationality, like ignorance, is selective. We habitually tune out unwanted information on subjects we don’t care about. In the same vein, I claim that we turn off our rational faculties on subjects where we don’t care about the truth.

Bryan finds voters irrational by comparing them to experts, such as economists or toxicologists, and to counter this irrationality he suggests that people use their discretion to push expert views (p. 199):

Continue reading "Underconfident Experts?" »


Are Almost All Investors Biased?

The data suggest yes, almost all are.  But there’s another interpretation: namely, that people mistakenly apply a rule of thumb (higher risk, higher return) to areas where it doesn’t, and can’t, hold.  Let me explain.

Risk that can be diversified away does not generate returns in theory (see the CAPM) or practice (see Ang et al.).  That is, taking more of this risk does not imply a greater average return.  Given the general massive lack of diversification, the only conclusion is that most people think they can pick stocks or funds that will rise more than average.  They must be wrong (on average, investors are average), but they are the vast majority of investors.  Thus the traditional theory of mean-variance optimization is more prescriptive than descriptive: most people are not rational about their forecasts but biased upwards, in a domain where that overconfidence seems far more costly than a trivial overestimation of one’s driving ability.  How can this massive delusion be an equilibrium?
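The claim that diversifiable risk carries no reward can be seen in a toy simulation (all parameters here are made-up illustrations, not market estimates): give every stock the same expected return and purely idiosyncratic noise, and compare holding one stock to holding an equal-weight portfolio.

```python
import numpy as np

# Toy illustration (assumed numbers): every stock has the same expected
# monthly return mu; all risk is idiosyncratic, i.e. independent noise.
rng = np.random.default_rng(0)
mu, sigma = 0.01, 0.10                 # assumed mean and stdev of monthly returns
n_stocks, n_months = 100, 10_000

returns = rng.normal(mu, sigma, size=(n_months, n_stocks))

one_stock = returns[:, 0]              # concentrated position in a single stock
portfolio = returns.mean(axis=1)       # equal-weight portfolio of 100 stocks

# Average returns are nearly identical; the portfolio's stdev is roughly
# 10x smaller (it shrinks like 1/sqrt(n_stocks) for independent risks).
print(f"mean:  single {one_stock.mean():.4f}  diversified {portfolio.mean():.4f}")
print(f"stdev: single {one_stock.std():.4f}  diversified {portfolio.std():.4f}")
```

The concentrated investor bears ten times the volatility for no extra expected return, which is exactly the risk the market does not pay you to hold.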

Continue reading "Are Almost All Investors Biased?" »


Data Management

In January I said:

Decision makers talk and act like they want more info, and prediction markets would provide such info.  But deep down I think decision makers know they really don’t need most of the info they collect; they collect it to show they are sharp and up on the latest.   

In today’s Dilbert, Dogbert advises the pointy-haired boss:

You need a dash-board application to track your key metrics.   That way you’ll have more data to ignore when you make your decisions based on company politics.


Are Any Human Cognitive Biases Genetic?

This is  important for Overcoming Bias, because overcoming genetic biases may be much more difficult than overcoming learned biases. But it is highly controversial.

Last week, economics Professor Paul Rubin proposed the hypothesis that humans have a genetic bias opposing Free Trade.

But earlier, Matt Ridley (former US editor of the Economist) proposed a genetic bias favoring Free Trade.

In Foreign Policy (March 2007) Robin Hanson proposed that Overconfidence Bias and the Fundamental Attribution Error are genetic biases. But Daniel Kahneman objected.

Is there any evidence from genetics on these hypotheses?

The only direct evidence would be finding genes for a bias. Identifying specific genes for human traits has recently become possible, and human genes currently evolving have been found for at least 45 traits (here, in Types of Genes Under Selection, paragraphs 4-11). But none have been found for cognitive biases.

Two sources of indirect evidence:

If the genes are fixed, then the trait will be universal in the species (though not all universal traits are genetic). But no one claims universality for biases about free trade or immigration, nor does Hanson claim universality for Overconfidence Bias or the Fundamental Attribution Error, so this doesn’t apply.

If the genes for the bias are not yet fixed but still evolving, then the bias should run in families: biological relatives should have similar biases on free trade, etc., more so the closer the genetic relation. But no such evidence has been found.

So is there no scientific evidence from genetics for the hypothesis that any cognitive biases are genetic?

Robin Hanson says:

… it is fine to spin hypotheses, and evaluate them on the basis of how well they fit with preconceptions and other hypotheses (personal communication, 5/15/07)

Let’s spin the hypothesis that human cognitive biases are genetic: how well does it fit with our preconceptions? And how well does it fit with other hypotheses? If it fits well with them, are we then justified in concluding that human cognitive biases are genetic?


Is Your Rationality on Standby?

Bryan Caplan’s The Myth of the Rational Voter (p. 126) on irrationality:

There is no need to posit that people start with a clear perception of the truth, then throw it away.  The only requirement is that rationality remain on "standby," ready to emerge when error is dangerous.

Bryan’s hypothesized "process of irrationality":

  1. Be rational on topics where you have no emotional attachment to a particular answer.
  2. On topics where you have an emotional attachment to a particular answer, keep a "lookout" for questions where false beliefs imply a substantial material cost for you.
  3. If you pay no substantial material costs of error, go with the flow; believe whatever makes you feel best.
  4. If there are substantial material costs of error, raise your level of intellectual self-discipline in order to become more objective. 
  5. Balance the emotional trauma of heightened objectivity – the progressive shattering of your comforting illusions – against the material costs of error. 

Bryan’s theory suggests we might make ourselves more rational on a topic by imagining that our beliefs actually had large personal costs, and then checking to see how tempted we are to reconsider those beliefs.  Unfortunately, I suspect our imaginations are especially unreliable about such things.  This is why I want more betting markets on important topics, where large personal costs to being wrong could tell us what we really think. 


Joke

Here is a (slightly shortened) joke from the book The Curious Incident of The Dog in the Night-Time:

There are three men on a train. One of them is an economist and one of them is a logician and one of them is a mathematician.  And they have just crossed the border into Scotland and they see a brown cow standing in a field from the window of the train.

And the economist says, "Look, the cows in Scotland are brown."

And the logician says, "No. There are cows in Scotland of which one at least is brown."

And the mathematician says, "No. There is at least one cow in Scotland, of which one side appears to be brown."


Opinions of the Politically Informed

Via Bryan Caplan’s The Myth of the Rational Voter (p. 27), Scott Althaus reviews what a better-informed U.S. public would think:

Fully informed opinion on foreign policy issues is relatively more interventionist than surveyed opinion but slightly more dovish when it comes to the use and maintenance of military power. … fully informed opinion … hold[s] more progressive attitudes on a wide variety of social policy topics, particularly those framed as legal issues. … [is] more ideologically conservative on the scope and applications of government power. … [it] tends to be fiscally conservative when it comes to expanding domestic programs, to prefer free market solutions over government intervention to solve policy problems, to be less supportive of additional government intervention to protect the environment, and to prefer a smaller and less powerful federal government.

Bryan elaborates:

If the public’s knowledge of politics magically increased, isolationism would be less popular.  … They want to be involved in world affairs, but see a greater downside of outright war.  … a more knowledgeable public would be more pro-choice, more supportive of gay rights, and more opposed to prayer in school.  … Beliefs about welfare and affirmative action fit the same pattern: while political knowledge increases support for equal opportunity, it decreases support for equal results.

The method here is to survey people on political facts, political opinions, and demographics, then make a model predicting opinions from demographics and fact accuracy, and finally use that model to predict average opinion given high fact accuracy.   All else equal, shouldn’t learning this make you move toward these more informed opinions?

Added:  The specific questions, and average and informed opinions, are here.


The case for dangerous testing

In 1983, NASA was planning to bring Martian soil samples back to Earth. Contaminating the Earth with alien organisms was an issue, but engineers at the Jet Propulsion Laboratory had devised a "safe" capsule re-entry system to avoid that risk. However, Carl Sagan was opposed to the idea and

explained to JPL engineers that if they were so certain […] then why not put living Anthrax germs inside it, launch it into space, then [crash the capsule back to earth] exactly like the Mars Sample Return capsule would.

The engineers helpfully responded by labeling Sagan an alarmist and extremist. But why were they so unwilling to do the test, if they were so sure of their system? The answer is probably that they feared that if the test failed, their careers would be over and they would have caused a catastrophe. But an out-of-control Martian virus, however unlikely, would have been just as catastrophic. That vague threat, though, didn’t concentrate their minds like the specific example of anthrax.

Imagine for a moment that those engineers had been forced to do Sagan’s test. Fear of specific disaster would have erased their overconfidence, and they would have moved from ‘being sure that things will go right’ to ‘imagining all the ways things could go wrong’ – and preventing them. The more dangerous the test, the more the engineers would have worked to overcome every contingency.

Continue reading "The case for dangerous testing" »


I Had the Same Idea as David Brin! (Sort Of)

A few days ago I wrote a post about how a much more defensible position regarding religion can be disadvantaged in debate against a much less defensible one because the defensible position is a complicated and partial truth while the indefensible one is a simple and snappy falsehood.  David Brin has a similar idea on a different topic.

Oh, there is something you are now hearing over and over. The BIG ROVEAN TACTIC is this. Demand that their opponents choose a simple, one sentence strategy for Iraq.

"Well? What would YOU do?"

It is horrendous and a "Have you stopped beating your wife?" question. Because no one-sentence answer will sound mature or sage, given the horrific political, social, military, and moral quagmire that we are inheriting. Moreover, any attempt to avoid giving a one-sentence answer sounds equivocating and mealy-mouthed.
