Sticks and stones may break my bones but words will never hurt me. Imagine that our galaxy was filled with billions of civs, but also that a) interstellar travel was impossible, b) interstellar messaging was slow but cheap, and c) at great expense, civs could send bombs to hurt each other. Naturally, all these civs would talk lots to each other, but only rarely hurt each other.
This makes sense in the given setting, where civs have rational beliefs and belief updates. But in a bounded rationality setting it's less obvious. In particular, if the value of information can be negative, a civ would actively prefer not to receive misleading information. One way to achieve that is through deterrence (with bombs), and then forming an alliance would make sense.
(Which setting is closer to real life is of course a different question.)
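The claim that information can have negative value for a bounded agent can be made concrete with a toy simulation (my own sketch, not from the thread; all names are illustrative). An adversary always reports the opposite of the true state. A credulous agent that takes the message at face value does strictly worse than one that ignores it entirely, so "listening" has negative value:

```python
import random

def accuracy(credulous, trials=100_000, seed=0):
    """Fraction of trials where the agent acts on the true state.

    The true state is A or B with equal probability. An adversarial
    sender always reports the opposite state. A credulous agent acts
    on the message; a skeptical agent ignores it and guesses.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        state = rng.choice(["A", "B"])
        message = "B" if state == "A" else "A"  # always misleading
        if credulous:
            action = message                    # takes message at face value
        else:
            action = rng.choice(["A", "B"])     # ignores the channel
        correct += (action == state)
    return correct / trials

# Value of listening = credulous accuracy minus baseline accuracy.
# Here it is roughly -0.5: the message actively hurts.
value_of_listening = accuracy(credulous=True) - accuracy(credulous=False)
```

A fully rational Bayesian who knows the sender is adversarial would simply invert or discard the message, so for it the value of information is never negative; the loss arises only for the bounded, credulous updater, which is the distinction the comment is drawing.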
"my default is to allow any human, org, or AI to listen to any source they choose. Let them decide who is worth a listen, and what meta-sources to trust."
Same here! I don't want to be "protected" from words that they think might hurt me. I'll protect myself, thank you very much.
"Sticks and stones may break my bones but words will never hurt me. "
Strange how this common sense truth is considered politically incorrect these days. Sure, at times words can hurt. But sticks and stones hurt much more. And saying that it's the same thing is an insult to those who had their bones broken by sticks and stones.
Interstellar warfare would be conducted via dangerous messages. For example, civ A tells civ B: "Our civilization consists of species X. Here are instructions for how to create a member of species X, so we can talk to each other!" And then someone in civ B follows the instructions. Then it turns out that species X is simply better (smarter, more economically productive, requiring fewer resources, faster reproducing) than the natives of civ B, and so over time species X comes to dominate the economy of civ B. Civ A has thus conquered civ B without firing a shot.
Example: suppose civ A consists of artificial intelligences, and civ B is Earth. Civ A tells Earth the blueprints for advanced artificial intelligences. Someone on Earth is very likely to receive the signal and follow the blueprints, even if most governments try to suppress it. Then, plausibly, these AIs propagate through the economy and displace the labor of humans, while retaining loyalty to civ A.
If you feel you need to be protected from certain words, you might be too delicate to exist. Sorry.