Author Archives: Bruce Britton

Two Meanings of ‘Overcoming Bias’ – For the First, Focus is Fundamental. For the Second: ?

‘Overcoming Bias’ has two meanings.

First: Right Now, as in ‘You have a mistaken belief, caused by a cognitive bias you don’t know you have, and I will cause you to correct that belief by pointing out the cognitive bias which caused it.’

Almost always, these claims are disguised injunctions to change your Focus.

Usually to Expand your Focus:

  • Availability Bias – Expand your Focus to include information besides the striking and vivid information that is carrying you away;
  • Confirmation Bias – Expand your Focus to include information that lessens the force of the information (you cherish) that confirms your existing belief;
  • Disconfirmation Bias – Expand your Focus to include information that heightens the force of information (you despise) that is inconsistent with your existing belief;
  • Fundamental Attribution Error – Expand your Focus to see Situations as possible causes of others’ behavior, besides the Personality characteristics you are using now;
  • Status Quo Bias – Expand your Focus to see alternatives besides the status quo;
  • Déformation Professionnelle – Expand your Focus beyond the conventions of your own profession;
  • Illusion of Control – Expand your Focus to see that you may not be able to influence the outcomes of interest.

And maybe 15 others (of 67), but who’s counting?

Rarely an injunction to Narrow your Focus:
  • Information Bias – Narrow your Focus to seek only information that can affect action.

Second meaning of ‘Overcoming Bias’: In the Future, as in ‘How can I avoid being influenced by my own (not yet known to me) biases in the future?’
The only effective way I have found is to invite criticism of my ideas by others – present them in seminars, send them to journals for blind review, bring them up with colleagues at lunch, on blogs, etc. – because I am blind to the biases that I have, by definition: if I were not blind to them I wouldn’t ‘have’ them. Of course, this only works if I am free of the Bias Blind Spot Bias. (Some biases I can prevent by avoiding the occasion of bias, as by not gambling to forestall the probability biases.)

Overcoming Bias Sometimes Makes us Change our Minds, but Sometimes Not.

Suppose our opponents Overcome their Cognitive Biases? What then?

Take Global Warming. The only good reason for Believers in Global Warming to advocate expensive actions to reduce greenhouse gases is that the Benefits of prevented harm are greater than the Costs of reducing the gases. 

But suppose the vividness of climate catastrophe scenarios activates Believers’ Availability Bias, which in turn causes Focusing effects, narrowing the Believers’ focus to exclude that:

a)    There are other concurrent global crises – Malnutrition, Communicable Diseases, Access to Clean Water, Access to Education, Poverty, etc. – which are arguably as Harmful as Warming, but
b)    not enough economic resources (moneys) are available to deal with both Warming and all the other global crises, so
c)    priorities must be set, with the better cost/benefit ratios winning.

Now suppose Believers Overcome their Availability and Focusing Biases.

Then economists tell Believers that the Cost/Benefit ratios for action on Warming are 1 to 5 – e.g., Costs of $10 billion to reduce greenhouse gases give $50 billion in Benefits of prevented harm – as Nordhaus and Cline do.

But suppose Confirmation/Disconfirmation Bias causes the Believers to avert their eyes from the Cost/Benefit ratios that economists give us for Malnutrition of 1 to 10, Communicable Disease of 1 to 50, etc.

Now suppose Believers Overcome their Confirmation/Disconfirmation Bias too.

Wouldn’t they have to change their minds about the relative priority of expensive actions to reduce greenhouse gases?
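To see why, here is a minimal sketch (mine, not from the original post) of the priority-setting arithmetic: rank the interventions by the benefit/cost ratios quoted above. The labels, the budget framing, and the exact figures are illustrative placeholders only.

```python
# Rank interventions by benefit/cost ratio; with a fixed budget, spend first
# where each dollar prevents the most harm. The ratios are the rough figures
# quoted above; everything else is an illustrative placeholder.

interventions = {
    "Reduce greenhouse gases": 5,   # roughly 1 to 5 (Nordhaus, Cline)
    "Malnutrition": 10,             # roughly 1 to 10
    "Communicable disease": 50,     # roughly 1 to 50
}

for name, ratio in sorted(interventions.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: each $1 spent prevents about ${ratio} of harm")
```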

(It did for some Nobel Prize-winning economists, UN Ambassadors, and Youth Groups. But it seems that changing their minds also required seeking the greatest good for the greatest number of people, which is not a universal value.)

On the other hand, charges of ad hominem bias against Believers (e.g., Believers just want government grant money) won’t change our minds, because ad hominem arguments have no weight.

What about the other charges of Cognitive Bias against the Believers:
  • Overconfidence in the models that predict Harm;
  • Loss Aversion and Endowment Effects causing overestimates of Costs;
  • Illusion of Control causing overestimates of the avoidability of Harm.

Under what circumstances would these give good reasons to change our minds?

Global Warming Skeptics Charge Believers with more Cognitive Biases than Believers do Skeptics: Why the asymmetry?

Skeptics accuse Believers of 9 cognitive biases.  Believers and Skeptics mutually accuse each other of 4 more. Why don’t Believers accuse Skeptics of any others?

Skeptics accuse Believers of:

  • Overconfidence — in the predictions of their computer models.
  • Hindsight Bias — Because the computer models have (admittedly) been tweaked to post-dict past climate changes, Believers assume wrongly that past climate events were more ‘predictable’ than they really were, according to Skeptics.
  • Illusion of Control — Believers think that human reductions of greenhouse gases will make a large enough contribution to reduce global warming, but Skeptics think that’s an illusion.
  • Loss Aversion, exacerbated by Endowment Effects — Skeptics claim Believers overestimate the costs of warming (compared to the benefits). 
  • Bandwagon Effects
  • Appeal to Authority Fallacy
  • Availability Bias with Focusing Effects — due to the vividness of climate catastrophe scenarios.

Mutual accusations include:

  • Ad Hominem claims — by Believers that Skeptics are beholden to oil company money; by Skeptics that Believers are seeking grant money, are anti-capitalist, anti-corporation, anti-free trade, anti-development/growth, anti-consumer, or are socialist, communist, anarchist, etc.
  • Status Quo Bias — Skeptics claim Believers want to keep the climate stabilized at its present level, and Believers claim Skeptics want stability for present manufacturing processes, distribution of wealth, SUVs, etc.
  • Confirmation/Disconfirmation Biases — leading to irrational belief persistence.

Finally, I accuse the whole gang of being subject to Polarization Effects.

But where are the Believers’ accusations of bias in the Skeptics?

Are any Human Cognitive Biases Genetically Universal?

Genetically universal human traits are such things as the eye, appendix, having two legs and two arms and a head, etc. There are no exceptions to universal traits (embryological accidents don’t count).

Of 67 Cognitive Biases, 62 are not claimed to be universal: the published evidence for them shows only that ‘many’ or ‘most’ subjects show a ‘tendency’ ‘toward’ them ‘often’, to a ‘statistically significant’ degree, etc.

Of the 5 exceptions, the Anthropic Bias and the Adaptive Bias are defined as universal, so empirical evidence for universality is superfluous. The Contrast Effect is perceptual (e.g., a hefted weight is perceived as heavier when contrasted with a lighter weight, lighter when contrasted with a heavier weight) and ‘ubiquity’ is claimed. The other two are memory limitations — in the Primacy and Recency Effects, items at the beginning and end of a long list are remembered better.

However, the other biases could still be genetically ‘universal’ if present in all persons without special training (as children or uneducated adults) or in all persons in genetically isolated communities.

Failing evidence of universality, a cognitive bias could be shown to be genetic if it runs in families (one-egg twins more similar in a particular bias than two-egg twins, etc.).
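The usual way to quantify ‘runs in families’ is the classical twin comparison; the post doesn’t spell it out, so here is a rough sketch using Falconer’s formula, with made-up correlations standing in for real data.

```python
# Twin-study logic: if one-egg (MZ) twins resemble each other more on a
# measured bias than two-egg (DZ) twins do, Falconer's formula gives a crude
# heritability estimate. The correlations below are invented placeholders.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Crude heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

print(falconer_heritability(r_mz=0.50, r_dz=0.30))  # ~0.4 -> some genetic component
print(falconer_heritability(r_mz=0.40, r_dz=0.40))  # 0.0 -> no evidence the bias is genetic
```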

Is there any other evidence for universality of Cognitive Biases?

Are Any Human Cognitive Biases Genetic?

This is important for Overcoming Bias, because overcoming genetic biases may be much more difficult than overcoming learned biases. But it is highly controversial.

Last week, economics Professor Paul Rubin proposed the hypothesis that humans have a genetic bias opposing Free Trade.

But earlier, Matt Ridley (former US editor of the Economist) proposed a genetic bias favoring Free Trade.

In Foreign Policy (March 2007) Robin Hanson proposed that Overconfidence Bias and the Fundamental Attribution Error are genetic biases. But Daniel Kahneman objected.

Is there any evidence from genetics on these hypotheses?

The only direct evidence would be finding genes for a bias. Identifying specific genes for human traits has recently become possible, and human genes currently evolving have been found for at least 45 traits (here, in Types of Genes Under Selection, paragraphs 4-11) – but not for cognitive biases.

Two sources of indirect evidence:

If the genes are fixed (have gone to fixation), then the trait will be universal in the species (though not all universal traits are genetic). But no one claims universality for biases about free trade or immigration, nor does Hanson claim universality for Overconfidence Bias or the Fundamental Attribution Error, so this doesn’t apply.

If the genes for the bias are not yet fixed but are still evolving, then the bias should run in families: biological relatives should have similar biases on free trade, etc., the more so the closer the genetic relationship. But no such evidence has been found.

So is there no scientific evidence from genetics for the hypothesis that any cognitive biases are genetic?

Robin Hanson says:

… it is fine to spin hypotheses, and evaluate them on the basis of how well they fit with preconceptions and other hypotheses (personal communication, 5/15/07).

Let’s spin the hypothesis that human cognitive biases are genetic: how well does this fit with our preconceptions? And how well does it fit with what other hypotheses? If it fits well with them, then are we justified in concluding that human cognitive biases are genetic?

Overcome Cognitive Bias with Multiple Selves

Four Cognitive Biases can be overcome in part by powering up some of our multiple selves: déformation professionnelle, hyperbolic discounting, and myside bias (taken to include confirmation bias and disconfirmation bias).

Our tendency to look at an issue only according to the conventions of our own profession can be countered by intentionally switching to another self, for example, from one’s Inner Economist to Mr. Dad.

Our Hyperbolic Discounting Bias explains aspects of addiction, not saving for retirement, borrowing on credit cards, buying health club memberships (vs. using them), and procrastination. We may be able to counter it by unleashing our long-term (responsible) self against our short-term (impulsive) self — confronting Dr. Jekyll with Mr. Hyde, or setting The Very Reverend Doctor SuperEgo against Master-Monster Id (as in this Dual-self Model of Impulse Control).
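For concreteness, here is a minimal sketch (mine, not from the post or the linked model) of the standard one-parameter hyperbolic discounting formula, V = A / (1 + kD), and the preference reversal that pits the impulsive short-term self against the responsible long-term self; the amounts, delays, and k value are illustrative placeholders.

```python
# Hyperbolic discounting, V = A / (1 + k*D), and the preference reversal it
# produces. All numbers below are illustrative placeholders.

def hyperbolic_value(amount: float, delay: float, k: float = 1.0) -> float:
    """Present value of `amount` received after `delay` periods."""
    return amount / (1 + k * delay)

def preferred(small_soon, large_late, k=1.0):
    """Which of two (amount, delay) options does the hyperbolic discounter pick?"""
    v_small = hyperbolic_value(*small_soon, k)
    v_large = hyperbolic_value(*large_late, k)
    return "smaller-sooner" if v_small > v_large else "larger-later"

# Viewed from far off, the patient choice wins...
print(preferred((100, 10), (110, 11)))  # larger-later
# ...but once the small reward is imminent, the impulsive choice wins.
print(preferred((100, 0), (110, 1)))    # smaller-sooner
```

The reversal is the point: planning ahead, the long-term self picks the larger-later reward, while at the moment of temptation the short-term self grabs the smaller-sooner one.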

Our Myside Bias makes us actively search for only confirmatory evidence for our existing beliefs and, should we happen to stumble across disconfirming evidence, to scrutinize it intensively for shortcomings. It’s as if we had fallen in love with our own belief, thus admitting our ‘…intellectual affections to the place that should be dominated by impartial intellectual rectitude.’

But suppose we free the multiple Lovers within our self, so that each of Us can caress his favorite one of a whole bevy of Multiple Working Hypotheses, each of whom one of us loves the best of all when we are with them? Then, for our whole Pride of Lovers within, ‘Having thus neutralized the partialities of [our] emotional nature … proceeds with a certain natural and unforced erectness of mental attitude to the investigation…’

Finally, ‘Personality’ is arguably situation-dependent; indeed many of us already know we can switch selves simply by putting ourselves in a different situation (in fantasy seems to work as well as in reality). Is this the accurate part of our trait ascription/actor-observer bias? And is the wrong part our denial of the same mutability to others? My answer to that question starts with another question: Which do I know best, myself or others? And one good answer seems to be: Do I know anything at all outside of my own selves?

Paternal policies fight cognitive bias, slash information costs, and privilege responsible subselves

All of us are of diminished capacity when cognitive biases degrade our thinking, and one way to fight cognitive biases is to expose our thinking to the scrutiny of others, as was done in the recent debate on Paternalism. In all of our thinking, whether we are thinking of taking cocaine or Dr. Quacko’s Snake Oil Miracle Tonic, or gambling with futures options or roulette wheels, we are subject to cognitive biases. For example, in gambling we humans are subject to the following biases: the gambler’s fallacy, clustering illusion, availability heuristic, attentional bias, illusory correlation, ludic fallacy, optimism bias, overconfidence effect, positive outcome bias, rosy retrospection, and the Texas sharpshooter fallacy.

To help us avoid these biases we have hired representatives, who create agencies (like the FDA and the SEC) with committees and subcommittees that debate the issue: they put on seminars and conferences, write working papers and white papers and cost/benefit analyses, invite comments, etc. – in short, they consider the matter in depth, and then decide to ban certain drugs or activities and not others. We voters ratify this every 2 years, except when we change our minds, as with alcohol, tobacco, or thalidomide.

Besides saving our lives, minds, and fortunes, a side benefit is the saving in information costs: we citizens don’t have to read all the papers, and a good thing too, because we have to use our time to make profits, the froth from which is used to pay our representatives. (We also save on decision costs.)

All this we do because we know we have many selves within ourselves, including a short-term self, who wants to snort cocaine, guzzle Dr. Quacko’s Tonic because ‘everyone’s doing it’ (the bandwagon bias), and try his luck with pork bellies, and who has a bad case of the Bias Blind Spot (the meta-bias which makes us think we don’t have to compensate for our cognitive biases), versus a long-term self, who knows it has cognitive biases and knows it can’t overcome them alone: it needs help.

Just World Bias and Inequality

Discussions of inequality are pervaded by the Just World Bias — the tendency for people to believe that the world is or ought to be ‘just’ and that people should get what they deserve. But often equality for some entails inequality for others, like hanging lead weights on ballerinas.

But the serious general problem for overcoming cognitive bias is the case where biases come in opposed pairs, an example being the Just World Bias versus what I’ll call the Perverse World Bias (or Murphy’s bias): ‘If I take my umbrella it’s sure to be fine, but if I don’t, it’s sure to rain,’ or ‘No good deed goes unpunished,’ etc.

Many proverbs have a similar problem, as in ‘Look before you leap’ versus ‘He who hesitates is lost.’ Proverb pairs with this property are useless as guides to action, though handy for hindsight.

Opposed pairs of biases have a similar shortcoming — because they cover all the ground, they end up covering none at all, and throw us back on ‘It depends’ which is where we started anyway.

How many opposed pairs of cognitive biases are there?

Benefits of Cost-Benefit Analysis

Cass Sunstein, in ‘Cognition and Cost-Benefit Analysis’, thinks at least six cognitive biases can be overcome by pushing your mind through a cost-benefit analysis.

Loss Aversion, Endowment Effects, and Ignoring Trade-offs are obviously all about cost-benefit arithmetic — addition and subtraction, mainly. The Availability Heuristic — the tendency to base decisions on whatever memories are most vivid or quickly available — is counteracted by cost-benefit analyses’ foregrounding of less vivid and more remote but clearly probative information.

This also addresses Informational and Reputational Cascades based on Availability Bias — people would believe the sky is falling if Al Gore told them, but if he mentions only the hazards of inaction and not how much prevention would cost … you really need to look at both the costs and the benefits.

Most generally, explicit cross-comparisons of cost-benefit ratios between alternative courses of action foreground the incoherence of our belief systems; coherence is often what we seek to maintain by mobilizing our biases, but often the world doesn’t come furnished with coherence.

While I was teaching a course on Lomborg’s Copenhagen Consensus (‘Global Crises, Global Solutions’, Cambridge University Press, 2004 – it’s all cost-benefit analyses), mere exposure to the analyses caused noticeable shifts (even in elderly students) in deeply held views on topics including Climate Change, Migration, and Free Trade.
