16 Comments

I don't think it actually presents a challenge to standard decision theory -- only to the assumption that what we call our values and beliefs actually correspond to the decision-theoretic concepts.

I mean, you could model us as actually having values that highly prize signalling that we aren't weird or divergent from our community. And looking at people's willingness to bet on their supposed beliefs in high-stakes situations suggests that in some sense we don't really believe them.

Though we should keep in mind (especially in AI discussions, as people often make this error) that beliefs/values aren't real things out there in the world; they are just heuristic idealizations used to predict behavior within a certain range of cases.

I mean, this can trivially be seen to be true with extreme torture, where people can be made to say things just to make the immediate pain stop, even knowing that in the long term this will result in worse pain. Or in the fact that what people profess to believe in religious contexts about life after death doesn't actually impact behavior in the way one might assume.

Ultimately, if you cut open a person or a computer, you don't find beliefs or values, just algorithms for behavior. It's just that within certain parameters we can use these theoretical ascriptions to make predictions about that behavior.

Been loving these explorations; really thought-provoking. My supposition is that we evolved without respect to explicit values or beliefs, in an environment in which the feedback loops for cultural change were fairly swift and the rate of drift fairly slow. We shouldn't expect ourselves to be able to deal with this well. Hopefully the cautionary tales that accompany the physical ruins of our civilization will at least be entertaining and well written. Another Homer awaits.

“This difficult situation seems a deep challenge to standard decision theory, in which one never changes values, only beliefs. If one accepts this standard framework, as I do, and also that one is in fact choosing [whether] to accept one’s culture’s new values, then one must find a way to see these “values” as actually subgoals that we change as a way to preserve our deeper constant values. But then what could be these deeper values?”

I have always found this assumption of decision theory to be highly dubious. The complete distinction between values and beliefs is not ultimately coherent. That is, our values are a subset of our beliefs: we must have beliefs that certain things are valuable. But if we encounter an argument that we perceive as successfully challenging one of those values, then we cease to believe that it is valuable (and this does not appear to be a choice). Perhaps the sole deepest a priori value, which cannot be changed, is that “we should prefer whatever appears to be for the best”. Hence, even more controversially, there is neither akrasia nor intentional immorality (as Socrates/Plato realised).
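
For concreteness, the standard framework being questioned here can be sketched as follows (a minimal expected-utility formulation; the notation is mine, not the post's). Beliefs are the probabilities, values are the utility function, and only the probabilities are revised as evidence arrives:

$$\mathrm{EU}(a) \;=\; \sum_{s} p(s \mid e)\, U\big(o(a,s)\big), \qquad p(s \mid e) \;=\; \frac{p(e \mid s)\, p(s)}{\sum_{s'} p(e \mid s')\, p(s')}$$

On that picture, $p$ changes by conditioning on evidence $e$ while $U$ stays fixed; the objection above is that, for humans, what gets reported as $U$ seems to shift as well, and those shifts feel more like belief revision than like choice.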

I wonder whether a sudden shift in deep values might be explained as what Joseph Tainter calls “scanning behavior.” If one anticipates some sort of collapse, or even just diminishing returns on investment in old beliefs, new ones, even risky ones, may seem more attractive.

Attaining social status is a balancing act between mirroring the norms around you (especially as displayed by those with high status), and becoming a leader by departing from those norms and getting others to follow.

Walking this tightrope is a subtle game, but directionally a few things are clear: as you gain status, you shift from mirroring to leading, and as you lose status (or simply drift too far from the cultural center), you need to shift back to mirroring. That is, if you want to keep playing the game.

Also, I question how firmly held "core values" truly are. We have a rationalization engine in our minds that excels at post-hoc justifications for whatever it is we commit ourselves to, which makes us better moral salespeople, among other things. Put that same rationalization engine into a different historical/cultural context and, voilà, things like human sacrifice become core values. (How else are we going to appease Baal?)

You state that you accept standard decision theory, according to which one never changes values, only beliefs.

I'm skeptical of that theory. One reason follows: I can't think of a deeper value than the value one places on his own life, for in the absence of life no values are possible. People who continue to live apparently value their lives, since every day they perform acts that sustain their lives. Yet some people eventually commit suicide, which suggests they no longer value their lives. If you claim that doesn't count as a value change, then what does? If your answer is "nothing", then your theory is unfalsifiable and hence unscientific.

> the US won’t consider changing from English to metric units, from presidential to parliamentary democracy, or stop daylight savings time, even though these changes seem clearly better to most analysts.

The US has considered stopping daylight savings time. E.g. https://www.reuters.com/world/us/us-senate-approves-bill-that-would-make-daylight-savings-time-permanent-2023-2022-03-15/

Good questions to ponder.

Another might be: how do we develop values in the first place without being exposed to alternative values?

Are values ours if we simply accept the values that we are presented with?

And if our values cannot adjust to accommodate alternatives, isn't that what might cause conflict?

If we only accept the values of the past, might that not, in itself, be a form of mind control?

But if values can shift, are they values at all?

You point out that the arguments being used in favor of values changes were known in the past. This argument, itself, was also known in the past. Values are always a matter of optimization under constraints of time, physical resources, mental resources, and social resources. In some sense we've all been arguing over the same few frameworks for evaluating and selecting among what we say are our values (consequentialism, legalism, virtue ethics, and variations under these) for as long as there has been writing. All the other differences are downstream of that, and whatever we might think ideal values *should* be, practicalities constrain what we are willing to espouse as the values a person needs to adhere to in order to remain in good standing in a community.

"We all fall short of our ideals, but some fall short of better ideals than others." It's easy to get people to acknowledge that the actual ideals Jesus preached were good ideals; much harder to love thy neighbor as thyself when at least one of you is going to starve after a bad harvest. It's easy to persuade people that slavery is bad; much harder to abolish when the bad people who set it up have built their entire identities and social edifice around it.

Whatever you think of his idea of "natural slavery," Aristotle was clearly reasoning in this mold when pointing out that without slaves providing their masters with leisure time, there would be no one in society at liberty to focus on higher goals, and therefore it was a necessity. The industrial revolution, in different times and places, has raised or lowered the apparent necessity of slavery. How much of the cause of the US Civil War derives from the fact that we managed to automate processing cotton a century and a half before we managed to automate picking cotton? I'd say quite a lot. If Eli Whitney had somehow built a cotton harvester in the same decade as the cotton gin, the whole 19th century could have looked completely different.

I think everything else you're describing is downstream of that. It's not too different from your own discussions of the dream time, and of farmer and forager values. The past few decades, the past century really, have been characterized by a massive increase in available wealth and leisure among a rapidly growing number of people who could be considered elites. This has meant they finally *can* accept the arguments that in some sense "should" have always been convincing. They've gotten a lot wrong too, because they're human. But aside from practicalities and familiarity built up stepwise over time, I don't think there needs to be more explanation than this.

I do think there has been a sense in recent decades that elites are moving so fast in favor of new values, and so quickly to demonize what they see as past values, that a lot of society is getting cast into the outer darkness as irredeemably evil for no fault except moving slightly slower. This is not good for many reasons, but I do think it is an expression of the same underlying phenomenon.

Really interesting line of thought. I would make the point that the values of the prestigious elite are endogenous. The article seems to imply that they are exogenous, although you don't explicitly claim that. Nevertheless, I believe they are endogenous because prestige depends on people's approval. My perception is that elites face strong incentives to express values that resonate with people. Indeed, elites often become elite by being unusually well-attuned to the zeitgeist.

This would mean our culture's evolution isn't random. Rather, its direction at each moment is tightly influenced by its current state. People adopt the values of elites that resonate with them, and elites express the values that will earn them the most prestige by resonating with people.

If correct, this wouldn't compromise your key points. No-one is in control of this process, and there is no guarantee that it is leading us in the "right" direction.
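
To illustrate, here is a toy sketch of the feedback loop described above (the names and parameters are made up for illustration, not a claim about real dynamics): if elites gain prestige by resonating with the current popular mean, and the population drifts toward the most prestigious elites, then the direction of change at each step is a function of the current state rather than random.

```python
import random

# Toy model of the elite/population feedback loop described above.
# All names and parameters here are illustrative, not empirical.
random.seed(0)

# Each agent's "value" is a point on a single axis.
population = [random.gauss(0.0, 1.0) for _ in range(1000)]
elites = [random.gauss(0.0, 1.0) for _ in range(20)]

for step in range(50):
    pop_mean = sum(population) / len(population)

    # Elites earn prestige by resonating with the current popular mean.
    prestige = [1.0 / (1.0 + abs(v - pop_mean)) for v in elites]

    # The population drifts toward the prestige-weighted elite mean.
    elite_mean = sum(p * v for p, v in zip(prestige, elites)) / sum(prestige)
    population = [v + 0.1 * (elite_mean - v) for v in population]

    # Elites, chasing prestige, nudge what they express toward the popular
    # mean, plus some idiosyncratic drift.
    elites = [v + 0.2 * (pop_mean - v) + random.gauss(0.0, 0.05) for v in elites]

# The trajectory is path-dependent but fully determined by the current state
# at each step; no one controls where it ends up.
print(f"popular mean after 50 steps: {sum(population) / len(population):.3f}")
```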

I think you have misconceived *basic values* in your conceptual scheme of rational action. There is just one *basic* rational value, namely The Good, which the rational agent seeks to maximize. Subsidiary “values” are merely instrumental; however firmly held, they are not really basic. For example, the rational agent will support the legal punishment of homosexual acts if and only if he believes that such a regime will best promote The Good. If he once believed that, but now has come to believe that legalization of and general toleration of homosexual acts will better promote The Good, he has changed a value, but not a *basic* value. The change is fundamentally a change in *belief*, the rationality of which can be assessed by the familiar method.

The rational agent does not fear a change in his *basic values*: he is forever committed to *maximizing The Good*.
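
One way to make this concrete (a sketch under the above assumptions, with notation of my own): the worth of any subsidiary "value" x is just the agent's current estimate of how well adopting x promotes The Good,

$$V(x) \;=\; \mathbb{E}\big[\,G \mid \text{adopt } x,\ e\,\big] \;=\; \sum_{s} p(s \mid \text{adopt } x,\ e)\, G(s),$$

so when the evidence $e$ changes, the ranking of subsidiary values can flip while the basic value $G$ never moves; every apparent value change is a movement in $p$, not in $G$.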

I wonder if the cultural changes you describe represent individual value changes, or instead represent the replacement of older people, with older values, by newer people with newer values. I suspect it is a bit of both, but I don't know in what ratio.

I also wonder how one defines deep values, whether they are changeable or only their expression changes, and whether people even know what their deep values are ahead of time or just sort of work day to day at intermediate levels. I certainly notice that if you ask people what their deep values are, the response is often something trite and dripping with social desirability that also doesn't seem to line up with their behavior well. That makes me think many people never really give the question much thought, and indeed might spend a lot of time chasing things they don't really value, and only if they are lucky do they learn what really matters to them.

Someone must end up on their deathbed thinking “I should have spent more time at the office,” but you never hear it.

You could construe this as moving toward truth, reducing error as we go along. If you look back at those who came before you, they were obviously wrong about a lot, right? Some of those errors have been addressed...
