How paranoid should I be? The limits of ‘overcoming bias’.

If the project of ‘overcoming bias’ is the rational correction of the biases of spontaneous human cognition, it may hit a limit when it comes to ‘theory of mind’ inferences: evaluating the dispositions, motivations and intentions of other humans. Consider, for instance, the question: how paranoid should I be?

We cannot know for sure whether other people intend to harm us, given that such intentions are concealed. If we are too paranoid we will miss out on potentially valuable alliances and waste resources on pointless precautions; yet if we are not paranoid enough we will be harmed or even killed (especially in the tribal ancestral environment in which the human brain evolved).
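
To make this trade-off concrete, here is a minimal sketch (illustrative only, not from the reference below; the probabilities and costs are hypothetical) treating the question as a decision under asymmetric error costs:

```python
# Illustrative sketch: paranoia as a decision under asymmetric error costs.
# All numbers below are hypothetical.

def should_act_paranoid(p_hostile, cost_false_alarm, cost_miss):
    """Take precautions iff the expected cost of a miss (trusting someone
    hostile) exceeds the expected cost of a false alarm (wasted precautions,
    a snubbed potential ally)."""
    return p_hostile * cost_miss > (1 - p_hostile) * cost_false_alarm

# If a miss could mean injury or death while a false alarm merely wastes
# effort, even a 5% estimate of hostile intent justifies precautions:
print(should_act_paranoid(p_hostile=0.05,
                          cost_false_alarm=1.0,
                          cost_miss=100.0))  # True: 0.05*100 > 0.95*1
```

On such numbers, a strong tilt towards suspicion is the expected-cost-minimising policy, which is part of why ‘how paranoid should I be?’ has no cost-free answer.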

Such social evaluations are the very basis of human intelligence, according to the Machiavellian Intelligence theory that humans evolved big brains to deal with the vast complexities of human social living.

There is a category of psychiatric patients, those with ‘delusional disorder’, who have (so it seems to everyone else) made incorrect inferences about the dispositions, motivations and intentions of others. They falsely believe, for example, that they are pursued by hostile gangs (persecutory delusions), that their wives are being unfaithful (delusions of jealousy), or that famous men are secretly in love with them (erotomania).

Aside from this encapsulated false belief and its behavioural consequences, patients with delusional disorder are cognitively entirely normal. Yet these beliefs typically cannot be shaken by rational discussion, since they are based on theory of mind inferences; and anyway, from the patient’s perspective, who is to say that the would-be persuader is not part of the conspiracy to deceive, or themselves deceived?

I suggest that the same psychological mechanism underlies political affiliations, religious or other metaphysical beliefs, and such phenomena as nationalism and racism. These beliefs are hard to shake not so much because they are irrational as because they are non-rational: they are based on theory of mind inferences concerning the motivations, dispositions and intentions of others.

This is why the overcoming-bias strategy offers little specific help here. Indeed, in trying to weaken specific theory of mind beliefs, rational OB-like strategies such as cognitive therapy may simply weaken self-confidence and fluency in _all_ types of theory of mind inference, which would probably be socially maladaptive.

Insofar as the implicit OB aspiration is to learn a habit of applying rational filters to what are intrinsically ‘theory of mind’ inferences, involving evaluations of other people’s motivations, dispositions and intentions, this would unintentionally lead to social incompetence.

Reference: www.hedweb.com/bgcharlton/delusions

  • http://profile.typekey.com/robinhanson/ Robin Hanson

    There are two issues. 1) How different is our world from our ancestors’ world in the social function of each kind of bias? 2) If we are willing to sacrifice our social success to some degree for the noble cause of overcoming bias, with what biases or overcoming methods can we gain the most accuracy benefits at the lowest social cost?

  • http://www.jamesdmiller.blogspot.com/ James D. Miller

    Perhaps there are many people who have a tendency towards delusional disorders but use reason to convince themselves that they are just being paranoid and so are able to lead normal and productive lives.

  • zzz

    Overcoming bias does not require applying any sort of cumbersome “rational filter” over your intuitions. What it _does_ require is calibrating them so that they’re correct on average, without systematic deviations either way. I can’t see how this would “weaken self-confidence and fluency”.

  • http://profile.typekey.com/sentience/ Eliezer Yudkowsky

    Bruce, it seems to me that one of the great common roots of many failure modes of rationality is a decision on someone’s part that some subset of their beliefs does not need to be rational. They don’t lose the battle so much as surrender instantly.

    Given this point – believing some set of beliefs is specially immune from criticism is the root of much failure – I should very much like to know what causes you to think that beliefs about “theory of mind” are any less normatively subject to Bayesian updating than beliefs about atoms. Yes, other people’s minds/brains are hard to observe. So are atoms. Both theories are causal ones where different parameter settings yield different probable consequences.

  • Bruce G Charlton

    Thanks to the commentators.

    The point I wish to highlight is that Overcoming Bias may incur personal costs. For example, the treatment of delusional disorder often involves neuroleptic/antipsychotic drugs to make patients calmer and less convinced; but these drugs also blunt emotions and reduce motivation. I suspect that the same applies to cognitive therapy strategies (which try to inculcate habits of rational analysis before acting upon spontaneous thoughts).

    The point is that our attempts to change and improve personality are crude, and they have costs as well as benefits.

    Certainly, I regard spontaneous cognition – which evolved slowly during our evolutionary history – as potentially very flawed as a guide to behaviour in the modern world. I was very impressed by The Robot’s Rebellion by Keith E Stanovich, which argues for the importance of rational modes of thinking in generating acceptable and adaptive modern ethics.

    I think this happens mainly through formal education: over many years our instincts are trained by the rewards and punishments of an artificial formal education system (insofar as it inculcates systematic and abstract thinking) and reshaped in the direction of greater rationality.

    My worry is that there is often a personal cost to pay in precisely the realm of theory of mind inferences – as people learn to ignore their ‘gut feelings’ and apply rational criteria which can only be very general and on average true. I think this may lead to social incompetence, especially by ignoring negative emotional ‘warnings’ and making bad decisions such as picking the wrong marriage partners or trusting confidence tricksters.

    I have a hunch (no more than a hunch) that highly educated women may be more prone to this type of error from overriding gut instinct, because women are intrinsically more empathic (see Simon Baron-Cohen’s work on empathizers, which I have replicated); in learning to overcome this spontaneous bias, women may lose more personal social competence than men.

  • zzz

    Bruce G Charlton:

    “My worry is that there is often a personal cost to pay in precisely the realm of theory of mind inferences – as people learn to ignore their ‘gut feelings’ and apply rational criteria which can only be very general and on average true. I think this may lead to social incompetence, especially by ignoring negative emotional ‘warnings’ and making bad decisions such as picking the wrong marriage partners or trusting confidence tricksters.”

    Presumably there are benefits which compensate for these costs? If these people are making systematically bad decisions from their “overly rational” thinking, they’re not really being rational. A truly unbiased person would learn to give their gut hunches a little more weight.

  • Bruce G Charlton

    zzz says: “If these people are making systematically bad decisions from their “overly rational” thinking, they’re not really being rational. A truly unbiased person would learn to give their gut hunches a little more weight.”

    I think you are correct. I’ve read some New Age-type self-help books which argue almost exactly this. Modern and future life sure is complex, but that’s the way it’s gotta be!