Here is our monthly place to discuss Overcoming Bias topics that have not appeared in recent posts.
Thinking (too much) as a cognitive bias?
At first this may sound absurd, but bear with me. I wonder if thinking itself could be a bias. I mean the kind of thinking that ends in analysis paralysis. In some areas, like math, you have to think things through. But in many other areas of human endeavour, it's more important to decide, take action, and learn through experience. If you think too much, you get stuck in your own head. I could write more about this if you want. Any comments?
How about major criticisms of Bayesian theory and responses?
Here's the working link to the interview; the one above is dead.
This recent interview with Daniel Kahneman on 'Intuition and Rationality' may be of interest to readers of this blog. Kahneman even addresses the question of whether biases can be overcome:
Does your research offer insights to individuals with regard to how they might change their own behavior?

Yes, potentially it does, but they are not insights that are very easy to use. You can be aware of mistakes you make, [but] it's very difficult to learn to avoid them, because System One operates automatically. Occasionally, when a decision is very important, you can stop to think, and you have to recognize that you can stop to think. And then there is still the extra stage, which is a very painful and difficult stage, [where] even if your analysis tells you to do something, you don't want to do it. To impose the discipline of rationality on your desires is extremely difficult. I've just lived something like this in deciding whether or not to write a book: I just want to do it. When I analyze the pros and cons, it's absurd for me to do this, but I'm going to do it, I think. So, you know, it's very difficult to impose discipline on yourself.
How about an analysis of biases affecting futurists?
An easy example is the tendency to be over-optimistic in the short run and too conservative in the long run -- what other such biases can be found?
I'd like to see if there is some common explanation for several cases where large numbers of smart thinkers, who believed themselves to be rationalists, predicted a technological apocalypse/Utopia but were wrong:
Some cases include the future predicted by Socialist thinkers at the end of the 19th century, the rapid development of space travel foreseen in the mid-1960s, the Y2K warnings -- and perhaps, Singularity thinking today?
For this purpose, I am not interested in the specific errors these people made, just the meta-errors, the biases, and whether multiple historical cases had biases in common.