Compared to religions before writing, religions with sacred texts were better able to resist changes to their dogmas and to the social rules those dogmas enforced. The more explicit the sacred texts and their widely accepted interpretive texts, and the more concrete yet robust their concepts were to social change, the longer such resistance could last.
Shakespeare’s last play was The Tempest, in 1611. Halfway between then and now was 1818, when Mary Shelley wrote Frankenstein and Jane Austen wrote Persuasion. That first period saw much bigger changes in the English language than did the second, as we developed and spread formal theories of English grammar and dictionaries that defined vocabulary. People who have tried to change the language more recently have faced much stronger policing, with others telling them they were doing it wrong and pointing to these official definitions.
Together with the rise of mass production, the systems developed over the last few centuries for accounting, architecture, and civil and mechanical engineering also locked in many ways to calculate plans for businesses, buildings, and machines, as well as many standard sizes and features of parts. Furthermore, standard programming languages, software libraries, and operating systems have locked in many software practices.
We have plausibly been suffering from cultural drift for several centuries. Could we use systems somehow to prevent such drift? In principle yes, but our current culture doesn’t seem inclined to do so.
Over the last few centuries our cultures have purposely rejected many prior sacred texts as no longer binding. They’ve directly rejected religious texts and liberally reinterpreted constitutions and laws. And the main thing that modernist cultural communities have agreed on is to reject traditional cultural elements and celebrate exploration of the widest possible range of alternatives. Cultural activists, who purposely change our norms and values, are now our most celebrated heroes.
So, yes, we might in principle develop explicit new systems of norms and values, and encode them not only in language but also in law. Surveillance and AI would even let us define and measure more things more clearly and verifiably, such as what emotions people expressed, or how much people were hurt by what others did. All of which could produce a system that locked in our norms and values far more strongly than did any ancient religious texts, and kept them from drifting.
Of course such systems might well rot, as our current legal and regulatory systems seem to be doing, along with many of our technical systems. But English doesn’t seem to be rotting, nor do most of our math systems, so it does seem possible to keep systems simple and robust enough to keep rot to a minimum.
But first we’d need a huge cultural change to want to lock them in. At the moment, that’s the main limiting factor.
The concept of heaven (putting aside its metaphysical reality) functions (functioned?) as a social technology that modern frameworks don’t replicate. The usual ills mentioned – political and cultural polarization, short-term thinking, family disintegration, crises of meaning – are more easily addressed with the heaven concept. It’s not a sacred text, yet it functions as one.
Heaven is the anchor tenant for an accountability framework that extends beyond all other large and small social institutions, offering a psychological balm to those suffering injustice and (potentially) constraining those in power. It offers deferred justice as a correction to imperfect (or corrupt) earthly systems. It transforms our relationship with mortality. It softens grief, creating intergenerational continuity. It both grounds and transcends rule-based ethics, privileging thoughtfulness and character development (internalized values) over simple compliance. It counterbalances short-term biases, enabling more thoughtful engagement with multigenerational challenges. It encourages humility. Most importantly, it provides a framework for reconciliation, which absolutely no current value system offers.
If we want an adaptive culture, we want to allow *some* wiggle room for new ideas - most new ideas are bad, but if we don't have a mechanism for testing them and adopting only the good ones, we don't adapt. F. A. Hayek wrote about this.
BTW, for a while I (and I suspect others of your readers) thought you were worried about *humanity* becoming maladapted, which seemed a bit pointless because humans don't have competitor species. I realized you're actually worried about your preferred culture (Western liberal democracy) becoming maladapted and losing out to other cultures.
I share the concern; you might want to make that a little more explicit for those of us who are slow.