

About a recent European Journal of Personality article:
The participants recorded a one minute television commercial, … then watched … themselves, having been given guidance on non-verbal cues that can reveal how extraverted or introverted a person is. … They were then asked to rate their own personality. … The participants’ extroversion scores on the implicit test showed no association with their subsequent explicit ratings of themselves, and there was no evidence either that they’d used their non-verbal behaviours (such as amount of eye contact with the camera) to inform their self-ratings.
In striking contrast, outside observers who watched the videos made ratings of the participants’ personalities that did correlate with those same participants’ implicit personality scores, and it was clear that it was the participants’ non-verbal behaviours that mediated this correlation … Two further experiments showed that this general pattern of findings held even when participants were given a financial incentive.
[Folks seem] extremely reluctant to revise their self-perceptions, even in the face of powerful objective evidence. … Participants seemed able to use the videos to inform their ratings of their “state” anxiety (their anxiety “in the moment”) even while leaving their scores for their “trait” anxiety unchanged.
(Hat tip to Michael Webster.) This sort of thing terrifies me. Let me explain why.
Any long complex design or calculation is subject to errors. And those who do such things regularly must get into the habit of testing and checking for such errors. This may take most of the effort, but it is at least manageable, because we expect that such errors are not very correlated with other features of interest. If something has worked ten times in a row in field tests, it will probably work the first time for a customer, at least if that customer’s environment is not too different from field test environments.
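The intuition that uncorrelated errors make a clean track record informative can be made concrete with Laplace's rule of succession, a textbook idealization (uniform prior, independent trials) not taken from the post itself:

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Posterior probability that the next trial succeeds, given
    `successes` out of `trials` so far, assuming trials are
    independent (errors uncorrelated) and a uniform prior."""
    return Fraction(successes + 1, trials + 2)

# Ten successes in ten independent field tests.
p_next = rule_of_succession(10, 10)
print(p_next)  # 11/12, i.e. roughly a 92% chance the next test works
```

The independence assumption is what does the work here: it is exactly the assumption that fails for spies and liars, as discussed next.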
People who have to worry about spies and liars, on the other hand, have to worry more about troublesome correlations. Liars can coordinate their lies to tell a consistent story. Spies and liars can choose carefully to betray us exactly when such defections are the hardest to detect and the most expensive. So the fact that a possible spy performed reliably ten times in a row gives less confidence that he will also perform reliably the next time, if the next time is unusually important. In these cases we rely more on private info, i.e., what the spy or liar could not plausibly know. For example, if we do not let the possible spy know which are the important cases, he can’t choose only those cases to betray us. And if we can check on him at unexpected times, we might catch him in a lie.
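A minimal numeric sketch of this point (the stakes and threshold are invented for illustration): a defector who cooperates on every routine case can show a perfect observed record while still capturing the one case that matters.

```python
# Ten routine rounds (stake 1 each) plus one critical round (stake 100).
rounds = [1] * 10 + [100]

# A strategic defector betrays only when the stakes exceed a threshold.
threshold = 50
strategic_record = [stake < threshold for stake in rounds]  # True = behaved

# Reliability as observed during the ten routine (test) rounds.
observed_reliability = sum(strategic_record[:10]) / 10

# Value lost to the betrayal on the critical round.
value_lost = sum(s for s, ok in zip(rounds, strategic_record) if not ok)

print(observed_reliability)  # 1.0 -- looked perfectly reliable in testing
print(value_lost)            # 100 -- lost the one round that mattered
```

This is why the post's remedy is private information: if the defector cannot tell which round is the critical one, he cannot condition his betrayal on it.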
We humans have many conscious beliefs, and we are built to have accurate ones in many situations, but in many other situations we are built to have misleading conscious beliefs, i.e., to be self-deceived. Evolution judged that such misleading beliefs would tend to help us fool our colleagues, and so better survive and reproduce. It created subconscious mental processes to manage this process of deciding when our beliefs should be accurate or misleading.
We seem almost completely defenseless against such manipulation. Yes we can try to check our conscious beliefs against outside standards, but our subconscious liars can not only choose carefully when to lie about what, but they probably also have access to all our conscious thoughts and info! They might even lie to us about whether we checked our beliefs, and what those checks found. So in principle our unconscious liars can execute extremely complex and subtle lying plans. For example, the study above suggests that such processes choose to make us blind to clues about our average public speaking anxiety, while letting us see momentary fluctuations about that average.
If our subconscious liars were as smart and thoughtful as our conscious minds, we would seem to be completely at their mercy. The situation may not be that bad, but it is not clear how we can tell just how bad the situation is; even if they had complete control, they would probably want us to think otherwise.
This is the context in which I find myself interested in “minimal rationality,” similar to minimal morality. In the limit of my being subject to very powerful subconscious liars, how can I best avoid their distortions? It seems I should then become especially distrustful of intuition, and especially interested in trustworthy processes outside myself, such as prediction markets and formal analysis.
If I have a choice between two ways to make an estimate, and one of them allows more discretion by subconscious mental processes, I should go with the other if possible. If the data is pretty clear while theory needs a lot of judgment calls to get an answer, I go with the data. If the data is messy and needs judgment calls while standard theory gives a pretty clear answer, I go with that theory.
Of course this minimal rationality approach leaves me subject to my subconscious lying about which estimates allow more subconscious discretion. So I need to be especially careful about those judgments. But what else can I do?
Many folks figure that if evolution planned for them to believe a lie, they might as well believe a lie; that probably helps them achieve their goals. But I want, first and foremost, to believe the truth.
Errors, Lies, and Self-Deception
Sometimes just asking yourself whether a belief is true helps. You must ask in a genuinely curious way, the way you'd ask a favorite professor whom you respect and expect to occasionally surprise you. If the answer comes back in less than a second, you can be fairly sure it is the deceptive, robotic part of your mind talking.
Your feelings have a sphere of usefulness in decision-making, or else they wouldn't be there. After you've verbalized the belief to be questioned, test how it feels to fully believe it versus how it feels to be convinced that the opposite is true. This will not determine the objective truth of the statement, but it will certainly give you insight into which choice will be easier to live with.
There are an infinity of universes that can be created with lies.
There is only one universe that can be created with truth. However, it is impossible to know this universe in a social sense, because it depends on people's perceptions of value, and people often do not control how they think of things, or do not allow their value judgments to be known outside themselves. So, again from a social standpoint, the truth only makes things simpler when people have themselves decided to be simple.
This desire for simplicity, and for a reality that is not dependent on random perceptions and judgments arising from heuristics that are often deliberately (but innocently) manipulated by individuals or organizations in their environment, is one reason to seek truth. Another reason is as a source of moral guidance, in that the overall best strategy in this world is cooperation, and it is only a matter of revealing that strategy despite attempts to obscure it.
However, the problem with this approach is that when you learn too much, you discover how much deception pervades the world, and it seems impossible to prosper, or even survive, when you consider that many possible choices are conflicts that can only end one of two ways: either you suffer, someone else benefits, and the world as a whole becomes worse; or you benefit, someone else suffers, and whether or not the world benefits, you have already done a selfish act and so cannot truthfully see yourself as contributing to the world.
Are you truly being honest with yourself? You may not want to ask who else loses out if you are not. Given the limitations of our memory, and of our ability to compare situations and predict the future, it seems inevitable that there will always be some dishonesty. Does it not?