Far is Overconfident

Since our minds are smaller than the world, the world tends to be more complicated than our mental models of it. Yes, sometimes we think things are more complex than they really are, but far more often reality is more complex than we appreciate. All else equal, since far mode makes us neglect detail, it tends to make us think things are even simpler, thus increasing our error. So far mode is a major source of human overconfidence. From the latest JPSP:

People generally tend to believe they are more competent than they actually are, and this effect is particularly pronounced among poor performers. … One striking demonstration, the illusion of explanatory depth (IOED), arises when people overestimate their ability to explain mechanical and natural processes. For example, people know that a zipper closes because it has teeth that somehow interlock, but they know very little about how the teeth actually interlock to enable the bridging mechanism. Similarly, many people know vaguely that an earthquake occurs because two geological plates collide and move relative to one another, but again they know little about the mechanism that initially produces these collisions. Nonetheless, people believe they understand these concepts quite deeply and are surprised by the shallowness of their own explanations when prompted to describe the concepts thoroughly. …

People who construe a ballpoint pen abstractly are more likely to focus on the pen’s function and perhaps its global appearance. In contrast, people who construe the pen concretely are more likely to focus on how well they understand how its parts work together to enable the pen to function—in this case, the appropriate metacognition. Accordingly, people are less likely to overestimate their understanding of how the pen works when their introspections focus appropriately on the pen’s concrete features rather than its abstract features. …

In six studies, we showed that IOEDs arise at least in part because people sometimes adopt an inappropriately broad or abstract construal style when evaluating their understanding of concrete processes. … Participants … experienced larger IOEDs the more abstractly they construed 13 basic human behaviors. … Participants rated their knowledge of how three mechanical devices worked more accurately when the devices were framed more narrowly according to their component parts. When asked to express how those devices worked, only participants in the broad construal condition were surprised by the incompleteness of their explanations. …

Participants were induced to adopt a concrete or an abstract mindset by expressing how (concrete) or why (abstract) they engage in certain everyday processes, like getting dressed in the morning. Again, participants in an abstract mindset tended to show a significantly greater IOED. … Participants … reported understanding their favored 2008 Presidential candidate’s policies better than they actually did when asked to express those policies in writing. … Participants who adopted a more abstract construal style showed a more pronounced illusion of political sophistication. …

Our findings suggest that … when adopting an abstract construal style, people might therefore be systematically overconfident about what the future holds and how well they understand themselves and others. … The IOED is both similar to and distinct from a range of overconfidence biases documented. … According to one account, egocentric over-confidence effects tend to emerge because people anchor on their own subjective experiences and fail to adequately account for the experiences and abilities of other people. … Other researchers have suggested that people are overconfident because … their memories tend to be overpopulated with successes rather than failures.


  • Hyena

    All of this makes me wonder why people are set up to be overconfident, or alternatively whether the study measures what it thinks it does.

    People might be abstractly confident but concretely unconfident without being overconfident at all. There isn’t any dissonance there; people who are pro-market often see abstract confidence paired with concrete humility as a necessary part of their justifications.

    In such a case the study might measure the fact that people don’t tend to give the two types of cognition different terms when discussing their “understanding”.

    • Even more so, all of this makes me wonder why people like confidence in others. Particularly others that will do work for them or make recommendations to them.

    • Doug

      Most decisions we make have outcomes with an exponential, not linear, distribution. Say you’re starting a business: your uncertainty is on the scale of orders of magnitude. It could be a bust, a steady income, or it could make you very rich. It’s not as if you expect it to make between $200–600k a year with 95% certainty.

      Let’s say I’m fitting a model for how to make these types of decisions: I’ll fit it on the logarithm of the dependent variable, not a linear model. Now my model gives me predictions of Y for various independent variables X, but because I fit on the log scale, the unbiased linear-scale expectation is going to be higher than what my log model says.

      Overconfidence bias is a good correction to another bias that comes from the uncertainty in the models we use. I think this is why optimism works best in areas where outcomes do tend to be the most exponential (e.g. technology or entrepreneurs) and the most disastrous where outcomes tend to be more linearly distributed (e.g. finance is the best example I can think of).
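      Doug’s point is the standard retransformation bias: if outcomes are log-normal, exponentiating the mean of log Y recovers the median, which sits below the true mean E[Y] = exp(mu + sigma²/2). A minimal numerical sketch of this (illustrative parameters, assuming log-normal outcomes; not from the comment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: business outcomes Y are log-normal,
# i.e. log(Y) ~ Normal(mu, sigma^2). Parameters are hypothetical.
mu, sigma = 11.0, 1.5
y = rng.lognormal(mu, sigma, 1_000_000)

# A model fit on log(Y) predicts the mean of log(Y); naively
# exponentiating that prediction recovers the median of Y, not its mean.
naive = np.exp(np.log(y).mean())   # ~ exp(mu): the median
actual = y.mean()                  # ~ exp(mu + sigma**2 / 2): the mean

# The log-fit back-transform understates the expectation by a factor
# of roughly exp(sigma**2 / 2), about 3.08 for sigma = 1.5.
print(naive < actual)
print(actual / naive)
```

      So a decision rule calibrated on the log scale will look “underconfident” about the linear-scale payoff, which is one way to read Doug’s suggestion that overconfidence can act as a correction.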

      • Hyena

        Actually, I’m not really sure your priors are right here. Turnover in small business is about 10%, and only part of that represents the actual collapse of companies; bankruptcy occurs in about 6% of businesses. There are other complicating factors that suggest low numbers, because businesses aren’t people, and so “business survival” isn’t such a concrete term.

        Add in the fact that most businesses seem to be started by people who know how the business operates and what the market is like. So a pretty certain expectation of income and success doesn’t seem like a bad assumption.

        I think that a lot of our discussions about confidence in business stem primarily from a misinformed view of what the business population is actually like. For example, a lot of studies of overconfidence test CEOs; but that title carries prestige and signals ambition, so it’s perfectly possible that such studies are hugely skewed by their own selection effects. But this basic objection, that a lot of this seems to ignore any… economic anthropology, is my standard issue with a lot of what is said here and in economics generally.

  • Aron

    Far mode is really lazy mode. We are in it by default. It gives us fast conclusions on few details, but that’s what we ask of it. It’s not the lack of details considered that creates overconfidence, it’s the disinterest we have in discovering overconfidence that gives us overconfidence.

    It’s all a bit like vision. We are overconfident in the quality of our perception of the entire field in front of us. But we can actually divert the fovea to a target, and we know that we need to do this, if we are motivated to do so.

    The biggest question in practice is whether we realize the need to focus on a piece of the field, prior to what’s in front of us changing and removing that possibility.

    • Hyena

      Could we even be sure of that? One issue with the lack of resolution in modes of thought is that many people have never learned to increase their mental resolution to begin with.

      You make an analogy with vision. I took an awesome amount of drawing and painting in college. One thing you learn is how to observe objects so that you can draw them with a high degree of fidelity. Likewise, conceptual resolution is a learned capacity. People do not naturally envision complex systems with many moving parts or have a way of incorporating the limitations of knowledge.

      It might not be that people are “lazy” but rather that they are untrained.

    • mjgeddes

      The vision analogy is another good analogy providing support for my ideas!

      If far mode is like wide-field vision and near mode is like *focused* vision, that means that near mode is just a special case of far mode. If far mode is analogical/narrative reasoning and near mode is Bayesian reasoning, it follows that Bayesian reasoning is just a special case of analogical/narrative reasoning (a case of especially *focused* analogical inference).

      Based on this (far-mode) ‘vision’ analogy, my conclusion seems obvious. But perhaps I am suffering huge over-confidence caused by my shallow understanding of the (near-mode) details of Bayesian inference? 😉

  • We propose regulation of social science journals so that readers are not subjected to superfluous acronym proliferation (SAP). Cox and Hardy (2004) showed that SAP-induced fatigue causes greater impairment of reading comprehension than does that associated with legalistic writing, where long phrases are truncated to a single key word or two.

    These critics would, for example, render “the illusion of explanatory depth” as The Illusion rather than IOED.

    Although it may seem counterintuitive that lawyerly writing proved easier to understand than writing with SAP, their results were significant at the 0.0001 level. Still, SAP defenders Higglefirth and Blumenstein (forthcoming) have prepared a methodological improvement on the basic model, tentatively titled not quite so superfluous acronym proliferation (NQSSAP). Details remain to be worked out.

  • Self plug, but here’s a discussion I had with Slavisa Tasic on the Illusion of Explanatory Depth (mp3) for the Critical Review journal’s alumni site.

  • Hi – For me the premise is problematic. Complex and complicated are not interchangeable concepts. They are not the same. Briefly, complicated things are deterministic; they are well suited to analytical reductionism. People lack an IOED for complicated things. Complex systems are non-deterministic; reductionism does not work on them. People must use abstraction and holism to explain complex systems. For a better world, most people should not worry about ‘…their ability to explain mechanical and natural processes.’ Rather, greater focus on abstract complex systems, something humans do rather well when given the chance, should be the priority. Many, many problems are caused by Newtonian reductionism and process IOED, so I say, forgetaboutit! -j

  • nzc

    I think the illusion of explanatory depth can serve as a kind of bet that when called to understand the details, we will be able to. The problem is, most folks don’t have much experience actually making that transition. This is mostly another aspect of rational ignorance — if the lazy/far theory works well enough, why waste time fleshing it out?

    What I’m trying to say is, IOED is just part of the natural human strategy for survival in the world, and luckily enough, it enables some folks (perhaps by luck, perhaps by dint of native skill in analysis) to make large cognitive leaps which can then be justified by analysis and synthesis, or by science.

    Of course, many (most?) leaps end up on the sharp rocks.

    Also, repeat mjgeddes’ final thought here.