The Most Important Topic

How purposely deluded and self-deceived are we about ourselves? If we were only rarely and mildly deluded, the subject would be of only moderate intellectual importance. Studying self-deception might offer interesting clues about human nature, but it wouldn’t help much in achieving other goals.

On the other hand, if we self-deceive more often, and on more important topics, then understanding the subject becomes more practically useful. And in the limit of being self-deceived on most important topics, the subject would be of central intellectual and practical importance. It would be hard to have much confidence in anything else without first having a handle on our self-deception. How could we trust our other thoughts until we knew how to tell where we had self-deceived?

On the very important subject of our basic purposes, i.e., the key functions we most work to achieve via the details of our behavior, I do in fact think that we are self-deceived more often than not. We are homo hypocritus, creatures built to fool ourselves in order to fool others on why we do things. While our beliefs seem reasonably reliable on near details, such as what exactly we see and do at each moment, we are quite often rather deceived about the overall far goals and purposes our behavior is designed to achieve.

I have thus become rather obsessed over the last few years with this subject of our self-deceptions. While I feel I’ve made some progress, far more remains to be done. But how is it that so few others seem to share my obsession? Do they a) think self-deception is rare, b) think it is common for most folks but not for them, c) not want to know about their self-deceptions, as they probably exist for good reasons, or d) expect it is very hard to make progress here, relative to other broadly useful subjects?

From a conversation with Russ Roberts.

  • Hey Robin,
    I think it’s e) All of the above. Most people are so good at self deception that they have no real grasp of reality. They don’t take the time to think about it because they don’t even really truly know it’s there. It’s like wearing contacts. Sometimes you just forget that they are in your eye, but the world looks different because of them. With the mind, it seems that once we put those contacts on, they never come off unless we intervene and we rarely do.

    As a hypnotist, I see it all the time and take advantage of it in my work for therapeutic or entertainment purposes. How we relate to ourselves, others, and reality in general is so plastic, that we can change it to fit pretty much any mental model and perception we want.

    Joe Homs

  • michael vassar

    People don’t use the importance of a subject when deciding whether to investigate it, only the impressiveness of the people currently investigating it. These days, regarding self-deception, that mostly means hippies.

    • Peter St. Onge

      I think you’re right about experts herding, but I think laypersons are more open to the idea of maverick truthbearers.

      • Peter St. Onge

        For example, Zeitgeist and The Venus Project. If the backers were mainstream credentialed it seems neutral-to-negative for the movement.

  • Robin, it’s your way of approaching it that’s unique. I can think of Freudians trying to discover “defense mechanisms” (habits to prevent self-knowledge) and twelve-steppers with “a searching and fearless moral inventory of ourselves.”

    I tend to think obsession means someone’s either found a vein of gold, or (as with some obsessions of my own) a plausible but quixotic goal to distract themself from other things.

  • Andy McKenzie

    I was hoping to disprove you via a Google Trends search, but alas, searches for “bias” show no increasing trend, and “self-deception” has too low a volume to show up. So I s’pose I have to broadly agree with your confusion. I feel an obligation to report this to fight against publication bias.

    In the meantime, these are the top five cities for searches about bias: 1. Brisbane, Australia; 2. Sydney, Australia; 3. Washington, DC, USA; 4. Melbourne, Australia; 5. Singapore, Singapore. Interesting to see Aussies so high.


  • Robin writes: “But how is it that so few others seem to share my obsession? [with self deception]”

    You have deceived yourself about what others are interested in.

  • Since self-deception deprives people of knowledge, it makes sense that knowledge is its remedy. But what knowledge? That’s where the real division lies, not in whether the topic is important.

    Broadly, there are three approaches to overcoming self-deception: 1) knowledge correcting specific self-deceptions, as opposed to knowledge about the process, which subdivides into 2) knowledge about the mechanism of self-deception and 3) knowledge about self-deception’s causes and characteristics. Under No. 1 come pursuers of truths that undermine human self-deception, such as regarding man’s putative centrality in the universe. Under No. 2 come bearers of methods, based on a theory of the process, that users can apply to lessen self-deception. Psychoanalysis is the most well-developed example; Buddhism perhaps the most widely practiced. Under No. 3 comes a) the search for the evolutionary sources of self-deception or b) the identification of cognitive heuristics leading (in certain applications) to self-deception.

    Many would fault No. 3 because it fails to challenge specific self-deceptions (No. 1) or aid anyone in becoming less self-deceived (No. 2). No. 3 might be perceived as the most speculative approach and at the same time the least useful (even if the speculations are true). At most (from a practical standpoint, that is), knowledge about self-deception (3), as opposed to knowledge of self-deception (1 & 2), provides opportunities to counteract or allow for various self-deceptions. But that knowledge, lacking a measure of how much correction is needed, can be worse than useless.

  • Sandeep

    I think it is either (i) almost impossible to make progress on this subject, or (ii) rather difficult to make progress on this subject *and* almost impossible to communicate it to others. Otherwise the few deviants who make progress, together with “markets in everything” (for instance through reality shows, etc.), would make “self-deception exposition” more widespread.

    That said, I hope you will prove me wrong and communicate the progress you say you have made (this is my first comment on this blog, made under the assumption that you don’t mind strangers).

  • sabril

    First of all, I have believed for a couple of years now that self-deception is THE cognitive problem. It’s fascinating to learn about halo effects, missing variables, and so forth. But all that stuff is worthless if you are self-deceived. Actually, as Eliezer pointed out, it can make you stupider, since you will be better able to nitpick other people’s arguments.

    On the other hand, most people are pretty good at critical thinking when the criticism is against thinking they dislike.

    I would say that self-deception is pretty much universal. As for me, I think I’ve managed to purge most of my self-deception on political/policy issues. However, I’m still pretty hopeless on personal/emotional issues. This is based on comparing my thinking now with what it was like before I learned how to think critically.


  • For me, (d) is the closest to being right. I’ve never had any inspired ideas on how to eliminate self-deception in general. Combating specific bad ideas (religion, say) seems to me like a more tractable project.

    There’s also the issue, though, that there are so many other things competing for my attention. Call that (f). Maybe I would become a psychologist studying self-deception, if not for my interests in neuroscience, civil liberties, helping the poor, and so on.

    • Yerenia Hernandez

      I’m currently working on a psychology project that focuses on self-deception and its effects on depressive realism. Right now I have to figure out a way to manipulate self-deception in order to attempt to find a causal relationship. Fail.

  • Jeffrey Soreff

    “Self-deception” seems to suggest that there is a truth, which is then concealed. A lot of the discussion both here and on Less Wrong suggests that we are much less coherent than that, that we execute a variety of adaptations from our evolutionary past, which don’t add up to a self-consistent utility function.

    I’m also not sure that this is all that important a question. At the moment, human nature changes slowly – the heuristics for dealing with the average amount of deception (both internal and external) in people that have developed over centuries still apply. I find your other major interest, in opportunities to improve institutions, much more promising. Changes like the rise of the scientific community over the last few centuries can improve the average person’s life, even though the people involved have the same level of self- and other-deception as usual. Perhaps WikiLeaks will add similar gains…

  • It takes many years to become even marginally aware of one’s self-deception, our focus often going through political, sexual, and spiritual incarnations, not to mention the body of knowledge that is needed to start a coherent discussion with others. The basic immediate problem is that verifiable, empirically driven baseline data about specifics isn’t easy to come by in a dynamic, immediate-response type of situation.

  • Ray

    Self-deception is the highest priority in my own intellectual pursuits, but I agree that I don’t see a lot on the subject itself.

    What I do see is mostly behavioral psych/econ stuff that all centers on Tversky and Kahneman, and only briefly touches on self-deception itself in the context you and Russ have brought up.

    What’s most annoying, however, is the tack much of it seems to take: somehow suggesting that we’re held hostage by our emotions, when in fact we self-deceive by choice – in my opinion.

    Personally, what I’ve been working on is trying not to speak certainly about things I’m not really certain about. Observing others, I’ve found that the wise and the prudent are careful to make their statements measured or balanced, even when they are very confident of what they’re saying. Russ is very good at this, I believe, as anyone who listens to the podcasts regularly should notice.

    When I hear people say something with exact certitude when in fact there’s room for other conclusions I count that as a strike against their overall wisdom, even if they’re very smart people. And even if I think they’re actually correct.

    When someone is strident in their knowledge it seems that their emotions are too tied up with their thinking, and objectivity necessarily suffers. A person can be 100% certain of something, and still present themselves in a balanced, measured fashion. This I believe shows a resistance to self-deception – or at least an awareness of its dangers.

    However a person can be 100% certain, and be 100% correct, but if they’re emotionally motivated in their certainty, then they are liable to miss something, and not even realize that this particular situation is a little different than what they think it is, and thus they are still wrong. But they’ll never know it because they were too strident to listen or look around.

  • Scott

    I’d say most people “c) not want to know about their self-deceptions, since they probably exist for good reasons” and would add that it also probably makes them very uncomfortable to think about it. Do you ever lose sleep over this “obsession”?


  • People aren’t interested because they are self-deceived about their own self-deception. Most people are more interested in having their entire world view confirmed than in learning what the truth actually is, regardless of what they believe.

  • Danny

    I’d say (c) carries the most weight for me. When I think really hard about things, I begin to suspect that I am not that smart, not that attractive, not that interesting, not that nice, not that talented, and not that worthwhile. I begin to suspect that people keep me company not because I am special, but because humans are social animals which are generally inclined to bond with peers who seek their companionship. I begin to suspect that maybe I don’t deserve all of the love I have gotten, and that maybe I am not likely to find much more.

    Stepping back a bit further, I begin to suspect that most of my beliefs are held without compelling grounds, and that most of my desires are aimed at things that would not clearly be worth getting. I begin to suspect that many things in my life would not stand up to harsh scrutiny, and that if future people learned how I spend my time, they would find it hilarious.

    With another step back, I begin to suspect that I’m just a momentary glimmer of consciousness that will soon vanish, to be replaced by another glimmer of consciousness that will have memories of being me. I begin to suspect that the concepts of “having a point,” “having meaning,” or “being worthwhile” are nothing more than constructs of a cognitive machinery built to direct an extremely complex chemical reaction towards self-replication.

    But as Hume put it, “Most fortunately it happens, that since reason is incapable of dispelling these clouds, nature herself suffices to that purpose, and cures me of this philosophical melancholy and delirium, either by relaxing this bent of mind, or by some avocation, and lively impression of my senses, which obliterate all these chimeras. I dine, I play a game of backgammon, I converse, and am merry with my friends; and when after three or four hours’ amusement, I wou’d return to these speculations, they appear so cold, and strain’d, and ridiculous, that I cannot find in my heart to enter into them any farther.”

  • The first thing that came to mind was: fish can’t see the water they swim in, right?

    But I can refine that a little bit more.

    It’s like we homo hypocritus sub-species can’t see the water, unless we look for it, then as soon as we see it, there’s a new water we don’t see. Ad infinitum.

    Try to tell the other fish to look at the water, and you’ll get mixed results. However, it seems to me that most people truly are fish that can’t see the water in the first place (similar to “a”, “b”, and “c”), and most of the rest see no great thing to be gained in analyzing their self-deception and deception of others (similar to “d”).

    Even so, I too think that study and knowledge of this subject is, as you say, of “central intellectual and practical importance”.

    And here is a counterpoint:

  • scott

    I think b) is almost entirely the case.

    Of course, that is a case of self-deception. The problems I see with the other options: a) would mean people should be surprised when exposed to an obvious case of self-deception in someone else, c) is just a rare mentality in general (curiosity killed the cat is confirmed by experience for most people), and d) should mean that people latch on to new discoveries and work in the field, which I don’t see.

  • Sort of related to bias, ESR’s most recent post is Taxonomy of the haterboy. There are some interesting comments related to self-deception there.

    Notably this one by The Monster:

    I have had people say to me “You just want to be RIGHT all the time!” in response to me pointing out some flaw in their position.

    My response is “Of course I do. It would be silly to want to be wrong. So, yes, I try to be right. Most of the time I succeed at it, but sometimes I don’t. When that happens, I want to know what ‘right’ is so I can be that instead of wrong.”

    I am also often accused of being a smartass. To confirm that diagnosis, I consistently point out that it is better to be a smartass than a dumbass.

  • gregorylent

    self is an illusion, say the yogis. maybe they are on to something.

  • Personally, I believe (a) – self-deception is rare. Most of the examples given strike me as weak, and I would ascribe them to ignorance, logical errors, or outward hypocrisy. I think most “self-deception” is self-deception only in the sense that we are choosing not to think about our motivations, though we could if we wanted to. So for example, someone psyching themselves up by thinking positive thoughts is in a sense engaging in self-deception, but the deception is accessible to him.

    Or perhaps I can turn the question around – caring about self-deception signals that you think you deceive yourself, making you untrustworthy, and it signals that you are unable to cope with people’s stated arguments, so you feel the need to invent their “true” motivations to argue with. Therefore, although people do care about self-deception, they signal otherwise. How’s that for unfalsifiable?

  • Learning Methods is good on the subject. It’s a psychological system based on the premise that people only notice that their behavior has a mismatch with reality if it’s causing them pain, and the spontaneous reaction is to find something which will shut down the pain as quickly as possible. Instead, people are taught to spend enough time observing the pain to find out what thoughts are leading to it (sometimes it’s thoughts causing behaviors which cause pain) and ways in which those thoughts don’t make sense.

    Figuring out what’s really going on is hard, meticulous work.

    I’ve been thinking about what I call group self-congratulation– the ways in which groups of people (sometimes very large ones like nations or races) tell themselves and others that they (the self-congratulating groups) are especially wonderful.

    I’m inclined to think that group self-congratulation is apt to be about real virtues and heroes, but wildly delusional that those virtues and heroes are better than what other groups are congratulating themselves about.

    I’ll add a1 to your list– people get used to the way they’re living, and are deluded about their amount of self-delusion.

    • >I’ve been thinking about what I call group self-congratulation– the ways in which groups of people (sometimes very large ones like nations or races) tell themselves and others that they (the self-congratulating groups) are especially wonderful.

      For an example of a self-congratulatory group, check the pollyanna discussion of donations in the last two entries of Less Wrong, where critical thought is abandoned in gushing self-adulation. Anyone challenging the prevalent self-delusion of moral purity gets “voted down”; that is, a mechanism is in place (“karma” – what a term for use by rationalists!) to enforce the self-congratulatory mindset. If you want to study self-congratulating groups, look next door.

  • Peter St. Onge

    My guess is some b but mostly c. People cherish their deceptions. Which is why we don’t discuss religion and ideology in polite company.

  • The “Less Wrong” website is populated by a self-selected group of people who ostensibly wish to root out bias and make a closer approach to the truth. Yet I find extraordinarily little curiosity by members of that community to investigate many of their own cherished beliefs. It is far more pleasurable to “circle the wagons” and communally heap scorn on the benighted “others” (lottery ticket buyers, non-atheists, and assorted other rabble).

    Even when a relatively well-integrated member like “Will Newsome” simply slam-dunks the silliness of cryonics, by showing in absolute clarity to anyone paying attention that the personal self is a delusion, I see him gaining absolutely no traction whatsoever with the rest of the fold. As another point against: why would the “less wrong” community use the idiotic “karma voting” scheme, which creates obvious bias and groupthink tendencies in the very structure of their platform?! The answer is simple: LW is a human socio-cognitive organization following the laws followed by every other human socio-cognitive organization, like religions, political parties, sports fan divisions, gear aficionado groups, and all the rest. The social trumps the objective. Always.

    “All lies and jest, still a man hears what he wants to hear and disregards the rest”. . .

    Robin, my only suggestion to you, if you genuinely want to “overcome bias” and become “less wrong” is to spend less time in the “rationalist” community (I would suggest it’s more of a “rationalizing” community) and start ranging a bit farther abroad — try some Psilocybin mushrooms in a supportive setting, do a regular meditation practice in a group setting over an extended period of time, maybe even go live in a rural village in Mexico or India for a few months if you can swing that. I think those activities would do far more to help you reach your goals than more of what you have been doing for the past decade. . .


  • Most people just aren’t generally inclined to think so much. Some might say those people are lucky.

  • FWIW, I’m b) – “think it is common for most folks but not for [me]”. I am *somewhat* interested in the signalling and self-deception of others – but I wouldn’t say it was an obsession of mine. People signalling unrealistic things is more irritating than anything else. If people don’t send honest signals, and lie to themselves, it just makes it harder to deal with them.

  • How about the Hayekian/Darwinian explanation? Self-deception provides advantages in spontaneous order. If that’s the case, the question becomes what advantages?

  • jc

    Richard P. Feynman: The first principle is that you must not fool yourself, and you are the easiest person to fool.

    Fwiw, I attended a conference this summer and saw a presentation suggesting a curvilinear link b/w self-deception and the cultural intelligence of ex-pat employees (some deception = high cultural IQ). The author seemed to draw on an established literature.

    So there is at least some interest…

  • There’s a certain crudity and lack of depth to your approach to the issue, compared with others who have tackled it (Freud, for instance). Picking out a phenomenon called “self-deception” or “bias” implies that you intend to arrive, after your bold slashing through the delusions, at some sort of objective truth. This seems most unlikely and most unwise. What is the objective truth – that we are infinitesimal bags of replicators in competition with billions of other such bags? That’s a truth, but not the truth.

    You can put me down as a sort of (c) – the delusions and deceptions we have about our motives are so baked in, so essential, that they stop being delusions – they are who we are. We live the fictions we create for ourselves. A saint who underneath his altruism is acting to enhance his status, and underneath that his reproductive fitness, is still an altruist, or at least the closest thing we have.

    • Mtraven, great comment. Not sure I completely agree, but it’s a good counterweight to the OP and many of the comments above yours.

    • Ray

      Picking out a phenomenon called “self-deception” or “bias” implies that you intend to arrive, after your bold slashing through the delusions, at some sort of objective truth.

      Well biases certainly do exist so if one was to attempt to arrive at an objective truth – as objective as possible anyway – how else would they begin but to question them? I know you cite Freud, but something more specific to counter would be helpful I guess.

      No offense meant, but you seem to think the asking in and of itself is crude or perhaps arrogant, but I don’t see the logic in that.

  • arch1

    I think that I put more effort than most into finding and rooting out self-deception, but acknowledge that much work remains.

    To the extent I don’t do as much as I should, I think the main reasons are c and d, and also e) avoidance of anticipated psychological pain, f) laziness.

    I suspect that for typical people, a and b play more of a role than they do for me.

  • arch1

    Incidentally, the Google Books Ngram Viewer

    seems a potentially useful way to track word and phrase usage trends over time.

    The link (if it works) reflects a search on “self deception” over the period 1800–2000. This particular search result looks a bit uneven, perhaps because the phrase is used so infrequently. YMMV.

  • cournot

    mtraven’s point is a wonderful retort to the naive and often disingenuous ways bias is discussed on this blog. Consider signalling: It may well be that the most important issue is not narrow truth about specific claims but a) about establishing identity b) finding members of a tribe you trust and c) understanding your own desires. Even from a purely econ autistic viewpoint, some “useless” signalling may be essential to establish common knowledge. Knowing that someone reads the same books or watches the same tv may mean a lot in terms of deciphering even routine “factual” statements. For example if I ask you to flip a “fair coin” are we discussing probability or are you trying to trick me and cheat me? Without signalling and establishing your appropriate identity, even the most ordinary statements are fraught with danger.


  • Pablo Stafforini

    Our tendency to self-deceive affects all of our beliefs, including our beliefs about our own self-deception. The answer to your question therefore seems to be none of the ones you listed, but rather this self-referential feature of the mechanism generating self-deception.

  • Pablo Stafforini

    Troy Camplin actually makes essentially this point in an earlier comment, which I hadn’t read when I posted my own.

  • Do we deceive ourselves about the relative importance of self-deception?

  • Ray

    Most likely b)

    Going back to Russ’ related post from about 4 days ago or so, he was using the behavior of Greenspan, Rubin, Summers, and Fed gov’r Gramlich as an example of such behavior.

    What jumped out at me was the differing motives applied to similar men for the same kind of mistakes. One was considered to have made mistakes because of his ideology, while others were led astray by their egos. None of the causes were really quantifiable however, and were completely dependent on the person making the claim. So those with an anti-market bias would see Greenspan as an ideologue and Rubin as an egoist while their pro-market counterpart would simply turn those reasons around.

    Point being that a lot of smart people doing a lot of hard work and research still walk away with their views predictably tilted by their own emotionally inspired interpretations.

    Choice b) from above fits this, since everyone has a self-deception issue except the one making the observation, of course. Approaching true objectivity requires a person to come very near something that might initially taste like cynicism, but is in fact a healthy dose of mistrust for virtually everyone and everything. I.e., I’m willing to accept that Rubin, Greenspan, et al. are all influenced somewhat equally by their incredible egos, and their ideologies don’t necessarily run that deep.

  • I think the question is imperfect.

    The assumption that truth in general is important is implicit in the question, and I think it’s a poor assumption. Truth is a nice academic idea, but it isn’t very high on most folks’ lists.

    The Hanson/Falkenstein thesis that status/envy drives everything is far more plausible than the idea that the truth matters much. And if you give up the idea that most folks care about the truth (except perhaps as a point-scoring mechanism), it’s all rather simple.

  • At a guess, research into self-deception will of course be research into the forms that self-deception takes. Much of that is going to be ex post facto moral posturing or ideological, of exactly the sort that humans won’t abide being investigated. So this is really an explanation based on careerism: it’s not in our interests to be constantly controversial.

  • Good questions.

    First, I think the mechanisms we have for NOT deceiving ourselves in a measurable way across scads of technical fields are outstanding. We build machines and software of quickly increasing complexity that do increasingly subtle tasks. We have NOT yet run into a limit on what we can do without substantially new methods.

    Second, the deceptions we keep as commonplace survive only where they can do the least damage. It may not seem like that as you look at some common mistake, but those who are wrong about investing tend not to be professional investors, those who are wrong about physics don’t build machines, etc. Even within the individual, few people hold on to deceptions that cost them much.

    Finally, I think you are the man with a hammer. OK, you have a few hammers, but you carry them around looking for things that MIGHT be nails. And you see a lot of nails around you. Are there really more nails around you than there are around a priest or an astrologer? Probably not, but you make your living with your hammer…

    You might want to do some stuff to quantify the total “deception load” on society, just as we talk about “mutation loads” on certain organisms. I suspect properly examined it is really quite small.

  • Matt Young

    Self-deception is a requirement of specialization. None of us are carpenters, and carpentry is not innate. We can only be carpenters because we self-deceive. Going deeper, it is not even ourselves doing the deceiving. Much of what we accept about ourselves is determined by external signals. When carpenters have tools and walk around a half-finished house, they have no choice but to be something not innate.
