
Sometimes just asking yourself whether a belief is true or not helps. You must ask in a genuinely curious way, the way you'd ask a favorite professor whom you respect and expect to occasionally surprise you. If the answer comes out in less than a second, you can be sure it is the deceptive, robotic part of your mind talking.

Your feelings have a sphere of usefulness in decision making, or else they wouldn't be there. After you've verbalized the belief to be questioned, test how it feels to fully believe it versus how it feels to be convinced that the opposite is true. This will not determine the objective truth of the statement, but it will certainly give you insight into which choice will be easier to live with.


Do I need to mention that "First and foremost wanting to believe the truth" is exactly the type of socially desirable character trait that your subconscious liar would be motivated to lie about?


There is an infinity of universes that can be created with lies.

There is only one universe that can be created with truth; however, it is also impossible to know this universe in a social sense, because it depends on people's perceptions of value, and people often do not control how they think of things, or do not even allow their value judgements to be known outside themselves. So, again from a social standpoint, the truth only makes things simpler when people have themselves decided to be simple.

This desire for simplicity, and for a reality that does not depend on random perceptions and judgements produced by heuristics, ones often deliberately (but innocently) manipulated by individuals or organizations in the environment, is one reason to seek truth. Another is moral guidance: the overall best strategy in this world is cooperation, and it is only a matter of revealing that strategy beneath the attempts to obscure it.

However, the problem with this approach is that when you learn too much, you discover how much deception pervades the world, and it seems impossible to prosper, or even survive, when you consider that many possible choices are conflicts with only two outcomes: either you suffer and someone else benefits, and the world as a whole becomes worse; or you benefit and someone else suffers, and whether or not the world benefits, you have already acted selfishly and so cannot truthfully see yourself as contributing to the world.

Are you truly being honest with yourself? You may not want to ask yourself who else may be losing out if you are not. Given the limitations of our memory and our ability to compare situations and predict the future, it seems inevitable that there will always be some dishonesty. Does it not?


You are trying to decide between A and B. A is correct, but your subconscious is trying to get you to choose B. You try using your judgment and choose B. Wait! Intuition is untrustworthy. You should use a trustworthy process outside yourself such as formal analysis.

You count up and find that you have a dozen processes outside yourself that you might rely on. God knows that 4 of those merit trust and indicate A. God also knows that 8 of them are "on the wrong topic" or otherwise untrustworthy. What they indicate is a toss-up: 4 for A, 4 for B. Lacking omniscience, Hanson has to use his judgment.

You use your judgment to pick one of these twelve processes. If it indicates A your subconscious undermines your confidence in its relevance. Realising that it is on the wrong topic, you switch to a different process. Rinse and repeat.

Eventually you have an external process that tells you to believe B. Your subconscious has lied to you, getting you to judge that you have picked the most trustworthy and relevant one of the twelve.
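To see how decisive this veto-and-resample dynamic is, here is a minimal sketch (entirely my own illustration, using the 4/4/4 split above): an honest one-shot pick among the twelve processes lands on A two-thirds of the time, while letting the subconscious veto every A-verdict guarantees you end on B.

```python
import random

# Sketch of the scenario above: 12 external processes; 4 trustworthy,
# all indicating A; 8 untrustworthy, splitting 4 for A and 4 for B.
VERDICTS = ["A"] * 8 + ["B"] * 4

def honest_pick():
    """Pick one process at random and accept its verdict."""
    return random.choice(VERDICTS)

def veto_and_resample():
    """The subconscious undermines any process that indicates A,
    so you 'switch to a different process' until one says B."""
    verdict = random.choice(VERDICTS)
    while verdict == "A":
        verdict = random.choice(VERDICTS)
    return verdict

trials = 100_000
print(sum(honest_pick() == "A" for _ in range(trials)) / trials)
# ~0.67: without the veto, the twelve processes favour A
print(sum(veto_and_resample() == "A" for _ in range(trials)) / trials)
# 0.0: with the veto, you always end up at B
```

The point the simulation makes is that the bias lives in the stopping rule, not in any single process: each process consulted can be perfectly respectable, and the outcome is still fixed in advance.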

When I sit at the Go board I don't think to myself "Intuition is untrustworthy, I'll spend my time in the opening reading out endgame sequences and then make a crap move." What is really going on is that my subconscious wants a timid defensive move, and it knows that getting me to read out tactical sequences is a good way to get one. So it whispers in my ear about how untrustworthy intuition is: use "minimal rationality"; formal analysis is always trustworthy. Then I read out some sequences under the banner of formal analysis, find a threat to a group, and play a timid move to defend it. I play my timid move smugly. My subconscious has lied to me, telling me that my explicit thoughts were on the right topic.


Odd thought: what distinguishes people who do, and don't, get better at seeing through self-deception?

I've noticed that there *are* many people who get better at not deceiving themselves, and who stay that way. There are *also* people who get better at self-deception.


Robin, if you are right about this, it still demands a change to signaling theory for some signals, as the theory assumes that the choice of type is known to the signaller.


I certainly won't disagree that some status-seeking in academia produces truth which is very valuable. But I don't think all or even a majority of the status-seeking is also truth-seeking. The social sciences seem to have especially weak feedback mechanisms associating status with truth, even in the long term. For example, people are still hotly divided over Keynes' GT, but not Einstein's relativity. We could say this is because economics is much harder to test than physics, but testing theories is the feedback which keeps academics honest.

I don't know what Robin is incentivized to do.


Robin is, as an academic, incentivized to reliably get to the truth; his status depends upon it. This is especially true in the long-term.


> Roko, How do you decide which psychology results to believe?

It is clear that I am a more effective thinker than most people, so if a study is performed on a fairly average group, and they are shown to be lacking, I would be foolish to think that the results automatically carry over to me.

However, Carl has shown me evidence that suggests that many biases afflict the intelligent as badly as the stupid. So perhaps I need to change strategy a touch.


It seems that intelligence is no defence against many biases.

I am suitably more scared.


In 7 different studies, the authors observed that a large number of thinking biases are uncorrelated with cognitive ability. These thinking biases include some of the most classic and well-studied biases in the heuristics and biases literature, including the conjunction effect, framing effects, anchoring effects, outcome bias, base-rate neglect, “less is more” effects, affect biases, omission bias, myside bias, sunk-cost effect, and certainty effects that violate the axioms of expected utility theory. In a further experiment, the authors nonetheless showed that cognitive ability does correlate with the tendency to avoid some rational thinking biases, specifically the tendency to display denominator neglect, probability matching rather than maximizing, belief bias, and matching bias on the 4-card selection task. The authors present a framework for predicting when cognitive ability will and will not correlate with a rational thinking tendency.


> But I want, first and foremost, to believe the truth.

I'm not trying to insult Robin or disagree with this post, but this seems like a lie to me. This is exactly the sort of thing we'd expect academics' internal liars to make them believe: that they selflessly serve the public good, while in fact being biased towards their own theories and accomplishments. Why should we believe that Overcoming Bias is really about overcoming bias?

Given the nature of evolution, expecting life (or its creations) to be truly selfless in the pursuit of truth seems impossible. Creating institutions that incentivize honesty (prediction markets, etc.) seems like a more realistic approach.


Roko, How do you decide which psychology results to believe?

Alan, yes intuitions on the right topic can beat explicit thoughts on the wrong topic.

Michael, people might be unconsciously aware of what they are consciously unaware of.


I think the findings are based on a false premise:

> [Folks seem] extremely reluctant to revise their self-perceptions, even in the face of powerful objective evidence

An analysis of a one-minute video of yourself based on non-verbal cues would not be considered by the subjects to be "powerful objective evidence", even if it is. Therefore they will ignore it. Their error, if it is indeed an error, is in thinking that the non-verbal-cue evidence is not a particularly useful measure of their level of extroversion, which is hardly a surprising conclusion for them to draw.


I thought that this was an important study, and relevant for those who study signalling. (Which is why I sent it to Robin, in the first place.)

I have a different take on experiments, even though I find Robin's observations very interesting.

Signal theorists, it appears to me, assume that if I want to signal my type, X, through message Y, to a receiver, then I must have private access to my type.

This experiment suggests otherwise: the way I learn my type is by observing what others think my type is. For certain types, there is no correct introspection into my own type available to me.

It would be like playing poker with your hand constantly changing based upon what other people thought you had, as described or inferred by their bets.

(As a philosopher, I am generally attracted to paradoxes so I may have read into the experiment this very oddity. But, I wonder if others saw the same thing?)
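A minimal sketch of that poker analogy (entirely my own illustration; the variable names and numbers are assumptions, not anything from the study): an agent whose true type is hidden even from itself can still converge on an accurate self-belief, but only by averaging other people's noisy readings of it.

```python
import random

# Illustrative sketch: the sender's true type is hidden even from the sender;
# its self-belief is built entirely out of noisy outside perceptions.
TRUE_TYPE = 0.8   # e.g. actual extroversion, with no introspective access to it
NOISE = 0.3       # how far off any single observer's reading can be

def observer_reading():
    """One observer's noisy perception of the agent's type."""
    return TRUE_TYPE + random.uniform(-NOISE, NOISE)

belief = 0.5      # an uninformed prior; the agent cannot simply look inside
for n in range(1, 201):
    # Running average of the readings: self-knowledge arrives only from outside.
    belief += (observer_reading() - belief) / n

print(round(belief, 2))  # converges near 0.8 as the readings accumulate
```

Note what happens if the observers themselves are biased: the belief converges on their bias instead of the true type, which is exactly the hand-keeps-changing problem.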
