Where You Are Most Wrong
What are you the most wrong about? You know the least about stuff far away from you in distant galaxies, but as you have few opinions about that, and it hardly affects you, who cares?
But what are you the most wrong about where you do have opinions, and where they are consequential for you? Seven relevant factors say when you are BLINDED:
[B]ound: When you are judged by your group on your confident and unthinking belief in and loyalty to particular claims, you won’t study them well.
[L]ow-Impact: When you are wrong about factors relevant for collective choices, your vote barely moves them, and so you have little incentive to think about them to make them better.
[I]ndefinite: When concepts come from a high-dimensional space where it seems hard to pin them down, separate them, or define or measure them.
[N]on-Connected: When you see relevant concepts as coming from a whole separate realm that has no logical connections to all the usual realms where you know things.
[D]evalued: When you declare yourself to be largely indifferent to the consequences for you, as something else matters much more to you.
[E]vidence-Poor: When you actually have little relevant data to draw on, and the best data that you have supporting your opinion is the mere fact that some groups like yours have continued to exist while holding this opinion.
[D]ynamic: When the topic is about what changes to make to your group’s collective choices, either recently or in the near future, the mere fact that your group exists no longer offers even weak evidence for those choices.
The max mistake topic area, with all of these relevant factors, each of which suggests that you are wrong, is: the adaptiveness of your morals.
Your group suspects that you are evil if you do not see their morals as obvious, and even suspects you if you had to think to come to agree with them. Morality is a collective choice, where you are punished for deviating, so to have an impact you’d have to change your group’s shared moral opinions. Moral concepts tend to be hard to pin down, and today most see moral claims as sitting in a disconnected realm where all our usual non-moral claims are not relevant.
On the topic of the cultural and DNA adaptiveness of your group’s morality (and norms and status markers), most people say they care much less about the adaptiveness of their morals than about the “moral truth” of their morals. Figuring out theoretically which morals are more adaptive is actually quite hard, and so our best evidence is empirical: which successful societies have had which morals. But the fact that your society seems inclined to change its morals lately in a particular direction is far weaker evidence for the adaptiveness of that direction.
The topic where you most need careful thought is also where your community most punishes such thought. This is our big blind spot, on which our civilization will likely fall.


"The topic where you most need careful thought is also the topic where your community most punishes such thought. And this is our big blind spot that will likely make our civilization fall."
This is a memorable quote that deserves wide circulation.
Why would we not think our modern morals are adaptive? After all, it's the most advanced, modern societies who are building the AIs who may be our "descendants" (and it's these same modern societies who are shaping the values of those AIs). Furthermore, the more modern societies are also likely to be the first to create large space populations, which increases adaptiveness in the long run.