"The topic where you most need careful thought is also the topic where your community most punishes such thought. And this is our big blind spot that will likely make our civilization fall."
This is a memorable quote that deserves wide circulation.
This piece omits the adversarial case — you’re wrong because there is an entity that is working against you, making it hard to know the truth. This could be said to be true, for example, of smokers in the 80s.
I often I see people believing wrong things simply because it benefits them to do so. This might fit with your "Bound" case but this is more self-serving self deception than community pressure.
I will predict right now that Mark Zuckerberg will never say on the record that social media could be harmful to young children. Regardless what the truth may be.
When it is more important to you to agree with the consensus than to be right, and you do agree with the consensus, you are more than usually likely to be wrong. (The consensus presumably includes both obvious truths and non-obvious tenets; the latter are in question here.)
Why would we not think our modern morals are adaptive? After all, it's the most advanced, modern, societies who are building the AIs who may be our "descendants" (and it's these same modern societies who are shaping the values of the AIs) . Furthermore, the more modern societies are also likely to be those who will be the first to create large space populations, which increases adaptivity in the long run.
Our morals were likely adaptive centuries ago when our society rose above the rest. But we've been changing them fast since then, without obvious confirmations of their adaptiveness.
If we define adaptive as "apt to grow in frequency", then surely the rise of the Asian countries – under a largely Western economic framework – is confirmation of Western culture's adaptiveness.
Pick a domain – legal standards, IP protections, styles of advertising, forms of birth control, styles of clothing, music formats, technology applications, programming languages, etc. – and over the last 10 years you will see more culture imported from West to East than vice versa. If that isn't adaptive success then what is?
I guess I see lots of upcoming selection pressures that confirm the adaptiveness of our new morals. Any society that can get to space in a sustainable way gets a big adaption boost, and the only ones that have the chance of doing that have very modern pro-science, pro-growth, pro-meritocracy values. Similarly, the creation of AI descendants (which is pro-adaption if those descendants to some extent retain our values) is primarily going to be done by the most modern-valued societies.
I'm not sure I agree with the idea that our 'modern morals' are the ones that contain pro-science, pro-growth, pro-meritocracy values. Indeed, I'd posit the load-bearing cultural infrastructure for those values is sitting firmly within the values of the previous few centuries. It is not obvious to me that the changes in the recent decades are the changes that encode those values. On the contrary, it seems obvious to me based on the historical trajectory of technological/economic development, we had much of the load-bearing cultural infrastructure prior to these mutations.
And these mutations are just as likely to be unadaptive as they adapative: or rather, we should get out the realm of 'just as likely'. Some changes will be adaptive and some won't. But the fact is, just because we are doing them in our society doesn't mean they are. It just isn't really strong evidence. Which is the point of this post.
"The topic where you most need careful thought is also the topic where your community most punishes such thought. And this is our big blind spot that will likely make our civilization fall."
This is a memorable quote that deserves wide circulation.
This piece omits the adversarial case: you're wrong because an entity is actively working against you, making the truth hard to discover. This was arguably true, for example, of smokers in the 1980s.
I often see people believing wrong things simply because it benefits them to do so. This might fit your "Bound" case, but it is more self-serving self-deception than community pressure.
I will predict right now that Mark Zuckerberg will never say on the record that social media could be harmful to young children, regardless of what the truth may be.
Yes, there are other factors I didn't mention, and that is one of them.
When it is more important to you to agree with the consensus than to be right, and you do agree with the consensus, you are more than usually likely to be wrong. (The consensus presumably includes both obvious truths and non-obvious tenets; the latter are in question here.)
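A toy simulation (entirely my own illustration, not from the post) makes this concrete: when most of a community copies the prevailing view rather than judging independently, the consensus is no more reliable than its first few independent judges, however large the crowd that agrees with it.

```python
import random

def consensus_accuracy(n_agents=100, n_independent=3, p_correct=0.6, trials=10000):
    """Hypothetical cascade model: the first n_independent agents judge a
    binary claim independently (each right with prob p_correct); every
    later agent simply copies the current majority view (tie -> coin flip).
    Returns how often the final majority lands on the truth."""
    hits = 0
    for _ in range(trials):
        votes = [random.random() < p_correct for _ in range(n_independent)]
        for _ in range(n_agents - n_independent):
            correct = sum(votes)
            if 2 * correct > len(votes):
                votes.append(True)       # copy the (correct) majority
            elif 2 * correct < len(votes):
                votes.append(False)      # copy the (incorrect) majority
            else:
                votes.append(random.random() < 0.5)
        hits += sum(votes) > n_agents / 2
    return hits / trials

def independent_accuracy(n_agents=100, p_correct=0.6, trials=10000):
    """Baseline: all agents judge independently, majority rules."""
    hits = 0
    for _ in range(trials):
        correct = sum(random.random() < p_correct for _ in range(n_agents))
        hits += correct > n_agents / 2
    return hits / trials
```

With 100 agents who are each 60% accurate, an independent-majority consensus is right the vast majority of the time, while the conformist cascade stays stuck near the accuracy of a majority of just three judges. Agreeing with the conformist consensus therefore tells you much less than the headcount suggests.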
Why would we not think our modern morals are adaptive? After all, it's the most advanced, modern societies that are building the AIs who may be our "descendants" (and these same societies are shaping the AIs' values). Furthermore, modern societies are also the most likely to be the first to create large space populations, which increases adaptiveness in the long run.
Our morals were likely adaptive centuries ago when our society rose above the rest. But we've been changing them fast since then, without obvious confirmations of their adaptiveness.
If we define adaptive as "apt to grow in frequency", then surely the rise of the Asian countries – under a largely Western economic framework – is confirmation of Western culture's adaptiveness.
Pick a domain – legal standards, IP protections, styles of advertising, forms of birth control, styles of clothing, music formats, technology applications, programming languages, etc. – and over the last 10 years you will see more culture imported from West to East than vice versa. If that isn't adaptive success then what is?
It may not remain this way of course.
A successful society will get others to copy it, and even to copy its changes. Doesn't mean those changes were adaptive.
I guess I see lots of upcoming selection pressures that would confirm the adaptiveness of our new morals. Any society that can get to space in a sustainable way gets a big adaptation boost, and the only ones with a chance of doing that have very modern pro-science, pro-growth, pro-meritocracy values. Similarly, the creation of AI descendants (which is pro-adaptation if those descendants retain our values to some extent) will primarily be done by the most modern-valued societies.
I'm not sure I agree that our 'modern morals' are the ones that contain pro-science, pro-growth, pro-meritocracy values. I'd posit that the load-bearing cultural infrastructure for those values sits firmly within the values of the previous few centuries. It is not obvious to me that the changes of recent decades are the ones that encode those values; on the contrary, the historical trajectory of technological and economic development suggests we had much of that infrastructure in place before these mutations.
And these mutations are just as likely to be unadaptive as adaptive. Or rather, we should get out of the realm of 'just as likely': some changes will be adaptive and some won't. But the fact that our society is making them doesn't mean they are adaptive; on its own, it just isn't strong evidence. Which is the point of this post.