39 Comments

The technique of attributing self-destructive thoughts to homunculi inside our brain—such as "Bruce"—is in my experience quite effective. I sometimes imagine myself as the character in the movie Inside Out, and this often helps me control my negative emotions.


Are you really dumb enough to believe that you can identify the most impacting biases by having people vote on them?

You are clearly too dumb to be doing any kind of research.


Also, the what-the-hell effect seems to be an instance of Bruce.

Bruce is the guy in your mind constantly looking for excuses to lose.

Based on the psychological truth that it's better to lose with an excuse than to do what it takes to win.

The "Bruce" idea actually originated outside LessWrong, but it's written about here: http://lesswrong.com/lw/9o/...


A completely unsubstantiated suggestion for how to ameliorate the what-the-hell effect: reframe the indulgence as a reward that you earned. The what-the-hell effect seems to derive its power from the mechanism "ah, now that I've already broken the vow and become unvirtuous, continuing to indulge won't change that," but I don't think this mechanism activates if you reframe the indulgence as a reward earned for obedient goal-tracking up to that point.


Can you tell me where you got your image? I would like to buy it and use it on my website.


Surely the particular biases that matter are the ones affecting your decision right now. And if it were easy to identify the biases you were affected by as they affected you, life would be a lot easier than it is.

I think it's dangerous to see each item on the Wikipedia list of cognitive biases as a separate, individually solvable problem. "Today I will work on my availability heuristic; tomorrow I'll do confirmation bias." It'd be lovely if it worked like that, but it's a more insidious problem.


The "What the Hell" effect sounds a lot like the broken windows theory on a personal level: buildings with graffiti get graffitied more.

Or maybe Irrational escalation?

The entirely different broken window fallacy (that destruction and waste can have a positive economic effect) would also have been nice to address, but that's probably a subset of some other bias. The focusing effect, perhaps? I'm not sure.


I'd say 'stereotyping', but it seems to be one bias people are already working to correct. There's something similar, though, which is less well addressed, where an outgroup of people is viewed as homogeneous. View any group closely enough and there's internecine strife, but we rarely see that from a distance. Americans talk about Chinese culture, for example, but not Nanjing culture or Sichuan culture.


My number one tip would be to learn to recognize when you WANT, for some reason (financial, convenience, reputation, status, group-pride, etc.), one hypothesis to be true. 

When that's the case, take extra care to correct for bias by 1) not prematurely truncating your search for alternative hypotheses, 2) looking at the issue from a variety of perspectives, 3) setting up experiments that are well-insulated against subjective judgment, 4) getting outside opinions, etc.


You should prune the identical and near-identical answers such as the fundamental attribution error and the actor-observer bias.


I don't think that giving a lecture for self-improvement purposes is a high-fixed-cost, low-marginal-cost product.


Most damaging to me, most damaging to others, or most damaging to everyone overall: most damaging to whom? The answer changes when the question is clarified. Confirmation bias in an individual versus in a group carries different levels of likely damage.


I suspect our biggest biases are universal: e.g., we tend to trust and believe what better-looking people say more than what ugly people say. There is a cultural component to pretty/ugly, but the consequences of the judgment are widely shared.


I would think the ranking depends a lot on context, learning about the gotchas in various things you're trying to pursue. The problem with a more general analysis is that evolution has already done that! That's why we have the biases that we do. Generally speaking, not being eaten by a predator is more important than getting a math problem right. If you're a mathematician living in 2013 though, maybe not so much :) Same goes for our bias towards understanding things socially, though I think people underestimate how important that is to our lives.


Kahneman’s book Thinking Fast and Slow represents an excellent summary of the literature on biases and heuristics, but risks overwhelming or demoralising the reader with the number of errors they need to avoid.

I don't know that Kahneman "risks" anything: he didn't intend to write a self-help book. But Kahneman does reveal which bias he thinks is most harmful: failure to realize that things are important to you only while you're thinking of them.

My humble impression is that the endowment effect is a particularly destructive bias, having repercussions for far-flung subjects from death to clutter. (My take on the endowment effect in relation to the writing process: http://tinyurl.com/9sw54v8 )

But from a remedial perspective, you would need to address the numerous consequences separately (unless you can come up with something very new). This goes to Christian Kleineidam's important point.


Also, Bandwagon Effect includes 'herd behavior' in its description, while Herd Instinct is listed as a separate option.
