About a recent European Journal of Personality article: The participants recorded a one minute television commercial, … then watched … themselves, having been given guidance on non-verbal cues that can reveal how extraverted or introverted a person is. … They were then asked to rate their own personality. … The participants’ extroversion scores on the implicit test showed no association with their subsequent explicit ratings of themselves, and there was no evidence either that they’d used their non-verbal behaviours (such as amount of eye contact with the camera) to inform their self-ratings.
Sometimes just asking yourself whether a belief is true or not helps. You must ask in a genuinely curious way, the way you'd ask a favorite professor whom you respect and expect to occasionally surprise you. If the answer comes out in less than a second, you can be sure it is the deceptive, robotic part of your mind talking.
Your feelings have a sphere of usefulness in decision making, or else they wouldn't be there. After you've verbalized the belief to be questioned, test how it feels to fully believe it versus how it feels to be convinced that the opposite is true. This will not determine the objective truth of the statement, but it will certainly give you insight into which choice will be easier to live with.
There is an infinity of universes that can be created with lies.
There is only one universe that can be created with truth. However, it is also impossible to know this universe in a social sense, because it depends on people's perceptions of value, and people often do not control how they think of things, or do not even allow their value judgements to be known outside themselves. So, again from a social standpoint, the truth only makes things simpler when people have themselves decided to be simple.
This desire for simplicity, and for a reality that does not depend on random perceptions and judgements produced by heuristics that are often deliberately (but innocently) manipulated by individuals or organizations in one's environment, is one reason to seek truth. Another reason is as a source of moral guidance: the overall best strategy in this world is cooperation, and it is only a matter of revealing that strategy despite the attempts to obscure it.
However, the problem with this approach is that when you learn too much, you discover how much deception pervades the world, and it then seems impossible to prosper, or even survive, when you consider that many possible choices present a conflict that can only end one of two ways: either you suffer, someone else benefits, and the world as a whole becomes worse; or you benefit, someone else suffers, and, whether or not the world benefits, you have already done a selfish action and as a result cannot truthfully see yourself as contributing to the world.
Are you truly being honest with yourself? You may not want to ask yourself who else may be losing out if you are not. Given the limitations of our memory and our ability to compare situations and predict the future, it seems inevitable that there will always be some dishonesty. Does it not?
You are trying to decide between A and B. A is correct, but your subconscious is trying to get you to choose B. You try using your judgment and choose B. Wait! Intuition is untrustworthy. You should use a trustworthy process outside yourself such as formal analysis.
You count up and find that you have a dozen processes outside yourself that you might rely on. God knows that 4 of those merit trust, and all 4 indicate A. God also knows that the other 8 are "on the wrong topic" or otherwise untrustworthy; what they indicate is a toss-up, 4 for A and 4 for B. Lacking omniscience, you have to use your judgment.
You use your judgment to pick one of these twelve processes. If it indicates A, your subconscious undermines your confidence in its relevance. Realising that it is on the wrong topic, you switch to a different process. Rinse and repeat.
Eventually you land on an external process that tells you to believe B. Your subconscious has lied to you, getting you to judge that you have picked the most trustworthy and relevant one of the twelve.
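A toy simulation (hypothetical, using only the made-up numbers from the scenario above) makes the trap concrete: picking one of the twelve processes at random favours A two to one, but a picker whose subconscious rejects every A-indicating process as "on the wrong topic" arrives at B with certainty.

```python
import random

rng = random.Random(0)  # fixed seed so the sketch is reproducible

# Verdicts of the twelve external processes, as in the scenario:
# the 4 trustworthy ones all indicate A, and the 8 untrustworthy
# ones split evenly, 4 for A and 4 for B.
PROCESSES = ["A"] * 8 + ["B"] * 4

def pick_unbiased():
    """Pick a process at random and accept its verdict."""
    return rng.choice(PROCESSES)

def pick_with_rejection():
    """The subconscious at work: whenever the chosen process indicates A,
    judge it 'on the wrong topic' and pick again. Rinse and repeat."""
    while True:
        verdict = rng.choice(PROCESSES)
        if verdict == "B":
            return verdict

trials = 10_000
share_A = sum(pick_unbiased() == "A" for _ in range(trials)) / trials
share_B = sum(pick_with_rejection() == "B" for _ in range(trials)) / trials
print(f"unbiased picks indicating A:  {share_A:.2f}")  # close to 8/12, about 0.67
print(f"rejection picks indicating B: {share_B:.2f}")  # always 1.00
```

The point of the sketch is that the bias lives entirely in the rejection rule, not in any single process: each individual pick looks like a reasonable consultation of an external authority.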
When I sit at the Go board I don't think to myself "Intuition is untrustworthy, I'll spend my time in the opening reading out end game sequences and then make a crap move." What is really going on is that my subconscious wants a timid defensive move and it knows that getting me to read out tactical sequences is a good way to get one. So it whispers in my ear about how untrustworthy intuition is: "Use minimal rationality. Formal analysis is always trustworthy." Then I read out some sequences, under the banner of formal analysis, find a threat to a group and play a timid move to defend it. I play my timid move smugly. My subconscious has lied to me, telling me that my explicit thoughts were on the right topic.
Odd thought: what distinguishes the people who do, and the people who don't, get better at seeing through self-deception?
I've noticed that there *are* many people that get better at not deceiving themselves, and continue that way. There are *also* people that get better at self-deception.
Robin, if you are right about this, it still demands a change to signaling theory for some signals, as the theory assumes the signaller's own type is known to the signaller.
I certainly won't disagree that some status-seeking in academia produces truth, which is very valuable. But I don't think all or even a majority of the status-seeking is also truth-seeking. The social sciences seem to have especially weak feedback mechanisms associating status with truth, even in the long term. For example, people are still hotly divided over Keynes' General Theory, but not Einstein's relativity. We could say this is because economics is much harder to test than physics, but testing theories is the feedback which keeps academics honest.
I don't know what Robin is incentivized to do.
Robin is, as an academic, incentivized to reliably get to the truth; his status depends upon it. This is especially true in the long-term.
> Roko, How do you decide which psychology results to believe?
It is clear that I am a more effective thinker than most people, so if a study is performed on a fairly average group, and they are shown to be lacking, I would be foolish to think that the results automatically carry over to me.
However, Carl has shown me evidence that suggests that many biases afflict the intelligent as badly as the stupid. So perhaps I need to change strategy a touch.
It seems that intelligence is no defence against many biases.
I am suitably more scared.
In 7 different studies, the authors observed that a large number of thinking biases are uncorrelated with cognitive ability. These thinking biases include some of the most classic and well-studied biases in the heuristics and biases literature, including the conjunction effect, framing effects, anchoring effects, outcome bias, base-rate neglect, “less is more” effects, affect biases, omission bias, myside bias, sunk-cost effect, and certainty effects that violate the axioms of expected utility theory. In a further experiment, the authors nonetheless showed that cognitive ability does correlate with the tendency to avoid some rational thinking biases, specifically the tendency to display denominator neglect, probability matching rather than maximizing, belief bias, and matching bias on the 4-card selection task. The authors present a framework for predicting when cognitive ability will and will not correlate with a rational thinking tendency.
But I want, first and foremost, to believe the truth.
I'm not trying to insult Robin or disagree with this post, but this seems like a lie to me. This is exactly the sort of thing we'd expect academics' internal liars to make them believe: that they selflessly serve the public good, while in fact being biased towards their own theories and accomplishments. Why should we believe that Overcoming Bias is really about overcoming bias?
Given the nature of evolution, expecting life (or its creations) to be truly selfless in the pursuit of truth seems impossible. Creating institutions which incentivize honesty (prediction markets, etc.) seems like a more realistic approach.
Roko, How do you decide which psychology results to believe?
Alan, yes intuitions on the right topic can beat explicit thoughts on the wrong topic.
Michael, people might be unconsciously aware of what they are consciously unaware.
I think the findings are based on a false premise:
[Folks seem] extremely reluctant to revise their self-perceptions, even in the face of powerful objective evidence
An analysis of a one-minute video of yourself using non-verbal cues would not be considered by the subjects as "powerful objective evidence", even if it is. Therefore they will ignore it. Their error, if it is indeed an error, lies in thinking that the non-verbal-cue evidence is not a particularly useful measure of their level of extroversion, which is hardly a surprising conclusion for them to draw.
I thought that this was an important study, and relevant for those who study signalling. (Which is why I sent it to Robin, in the first place.)
I have a different take on the experiment, even though I find Robin's observations very interesting.
Signal theorists, it appears to me, assume that if I want to signal my type, X, through message Y, to a receiver, then I must have private access to my type.
This experiment suggests otherwise: the way I learn my type is to observe what others think my type is. For certain types, there is no correct introspection available to me.
It would be like playing poker with your hand constantly changing based upon what other people thought you had, as described or inferred from their bets.
(As a philosopher, I am generally attracted to paradoxes so I may have read into the experiment this very oddity. But, I wonder if others saw the same thing?)
I don't think you can dodge the perils of biased judgment by refusing to use your judgment.
I like to play Go and have noticed my personality interfering with my play. Early in the game it is important to boldly take the big points, dealing with attacks on one's stones with counter-attacks, not defensive replies. I am a timid and anxious person, so I often make slack moves that defend a group that is already strong enough to be left. Worse, if I attempt directly to be less timid, I play rash moves rather than good ones.
Sometimes I become vividly aware of what is happening. I recognise the situation as similar to a professional game that I have studied or as similar to that discussed in an instructional book. I know what move I'm supposed to play. My fears about a group I am anxious about crowd in on me. Will it be attacked and killed? It is an emotionally fascinating experience when I pluck up my courage, play the correct, scary move, and watch as my opponent makes a poor reply and gets into difficulty.
Now wait a minute. Why do I bother with all this inner angst, facing down my fear, plucking up my courage, trying to exercise my judgment in a way that is bold without being rash? Go is a logical game. Why not just think about my moves?
The problem lies in the difference between strategy and tactics. One can certainly read out sequences: if I do this and he does that etc. It is important to do so. On the other hand the board is large and the options numerous. Strategy depends on judgment. These stones are light meaning that attacks can be dodged. These stones are thick meaning that if my opponent plays close to them I can attack him and if I play close to them I will be overconcentrated.
When I try to think logically, instead of exercising my judgment, I am trying to dodge the problem of my timid nature leading to timid judgment, but what actually happens is that I read out sequences and let tactics dictate strategy. This usually ends badly.
I draw a moral for real life. One is more likely to get crisp clear answers for tactical questions than for strategic ones. If your judgment has biases and you try to escape the bad consequences of those biases by favouring approaches that lead to clear answers that do not require the exercise of judgment, you create a new bias, towards the tactical tail wagging the strategic dog. Welcome to a new set of bad consequences.