Back in November I read this Science review by Nira Liberman and Yaacov Trope on their awkwardly named "construal level theory", and wrote a post I estimated "to be the most dense with useful info on identifying our biases I've ever written":
- [NEAR] All of these bring each other more to mind: here, now, me, us; trend-deviating likely real local events; concrete, context-dependent, unstructured, detailed, goal-irrelevant incidental features; feasible safe acts; secondary local concerns; socially close folks with unstable traits.
- [FAR] Conversely, all these bring each other more to mind: there, then, them; trend-following unlikely hypothetical global events; abstract, schematic, context-freer, core, coarse, goal-related features; desirable risk-taking acts; central global symbolic concerns; confident predictions; polarized evaluations; socially distant people with stable traits.
Since then I've become even more impressed with it, as it explains most biases I know and care about, including muddled thinking about economics and the future. For example, Ross's famous "fundamental attribution error" is a trivial application.
The key idea is that when we consider the same thing from near versus far, different features become salient, leading our minds to different conclusions. This is now my best account of disagreement. We disagree because we explain our own conclusions via detailed context (e.g., arguments, analysis, and evidence), but explain others' conclusions via coarse stable traits (e.g., demographics, interests, biases). While we know abstractly that we also have stable relevant traits, and that they also have detailed context, we simply assume we have taken that into account, when in fact we have done no such thing.
For example, imagine I am well-educated and you are not, and I argue for the value of education while you argue against it. I find it easy to dismiss your view as denigrating something you lack, but I do not find it plausible that I am mainly just celebrating something I have. I can see all the detailed reasons for my belief, but I cannot easily see and appreciate your detailed reasons.
And this is the key error: our minds often assure us that they have taken certain factors into account when they have done no such thing. I tell myself that of course I realize that I might be biased by my interests; I'm not that stupid. So I must have already taken that possible bias into account, and so my conclusion must be valid even after correcting for that bias. But in fact I haven't corrected for it much at all; I've just assumed that I did so.