Katja:
Relatively minor technological change can move the balance of power between values that already fight within each human. [For example,] Beeminder empowers a person’s explicit, considered values over their visceral urges. … In the spontaneous urges vs. explicit values conflict …, I think technology should generally tend to push in one direction. … I’d weakly guess that explicit values will win the war. (more)
The goals we humans tend to explicitly and consciously endorse tend to be more idealistic than the goals that our unconscious actions try to achieve. So one might expect or hope that tech that empowers conscious mind parts, relative to other parts, would result in more idealistic behavior.
A relevant test of this idea may be found in the behavior of human orgs, such as firms or nations. Like humans, orgs emphasize more idealistic goals in their more explicit communications. So if we can identify the parts of orgs that are most like the conscious parts of human minds, and if we can imagine ways to increase the resources or capacities of those org parts, then we can ask if increasing such capacities would move orgs to more idealistic behavior.
A standard story is that human consciousness functions primarily to manage the image we present to the world. Conscious minds are aware of the actions we may need to explain to others, and are good at spinning good-looking explanations for our own behavior, and bad-looking explanations for the behavior of rivals.
Marketing, public relations, legal, and diplomatic departments seem to be analogous parts of orgs. They attend more to how the org is seen by others, and to managing org actions that are especially influential to such appearances. If so, our test question becomes: if the relative resources and capacities of these org parts were increased, would such orgs act more idealistically? For example, would a nation live up to its self-proclaimed ideals more if the budget of its diplomatic corps were doubled?
I’d guess that such changes would tend to make org actions more consistent, but not more idealistic. That is, the mean level of idealism would stay about the same, but inconsistencies would be reduced and deviations of unusually idealistic or non-idealistic actions would move toward the mean. Similarly, I suspect humans with more empowered conscious minds do not on average act more idealistically.
But that is just my guess. Does anyone know better how the behavior of real orgs would change under this hypothetical?
"An example: after the introduction of airplanes, there was a lot more high-speed travel - but did people suddenly start valuing high-speed travel a great deal after the invention of airplanes, or did they always want to travel fast and simply couldn't do it before?"
With technology it's often a third option: people don't really miss it when it hasn't been invented yet, or they dream about using it in some way that's not the way it mainly ends up being used. People thought the cellphone would save working time and thus give them more leisure time, but it ended up just making people more productive in the same amount of working time. These are group dynamics that people tend to forget about. People don't value punctuality more than they used to; people are punctual because they have to be, since one business once upon a time discovered it could make a bigger profit by using alarm clocks to enforce punctuality and so forced all the other businesses to do the same or go bankrupt.
"Marketing, public relation, legal, and diplomatic departments seem to be analogous parts of orgs. They attend more to how the org is seen by others, and to managing org actions that are especially influential to such appearances. If so, our test question becomes: if the relative resources and capacities of these org parts were increased, would such orgs act more idealistically? For example, would a nation live up to its self-proclaimed ideals more if the budget of its diplomatic corps were doubled?"
Marketing and diplomacy are about cloaking/sweeping-under-the-carpet your true intentions and spinning inconvenient truths. Giving them more resources would give their parent orgs more leeway to deviate from purported noble ideals.
"The goals we humans tend to explicitly and consciously endorse tend to be more idealistic than the goals that our unconscious actions try to achieve."
Orgs aren't people; they are collections of people and therefore work very differently. And again, marketing and diplomacy don't set goals, they spin the actual goals (which are determined somewhere else) for public consumption.