Three years ago I described the “What if Failure Taboo”: A simple moral principle: when a future change is framed as a problem which we might hope our political system to solve, then the only acceptable reason to talk about the consequences of failing to solve that problem is to scare folks into trying harder to solve it. If you instead assume that politics will fail to solve the problem, and analyze the consequences of that in more detail, not to scare people but to work out how to live in that scenario, you are seen as expressing disloyalty to the system and hostility toward those who will suffer from that failure.
Problem, No Solution Taboo?
Yeah, but you might not continue existing if you violate your society's taboos.
Taboos against considering reality and how to deal with it are utterly silly. Reality exists, and reality is that which continues to exist even when you stop believing in it.
That's a useful comparison. Global warming is bad, and it's coming, and these two things alone seem to be enough to trigger the "get outraged and signal that *something* must be done" response. As Robin noted, outrage-free descriptions of bad news about the future (climate, society, etc.) are taboo.
This urgent compulsion to do *something* means that psychological forces push us towards action before we check whether that action would be effective. Our agenda isn't simply to prevent the bad future. Even if we suspect that it's inevitable, we want to make a show of fighting it anyway, so that we can't be retrospectively blamed for having stood idly by as the (inevitable) bad stuff unfolded. That's clearly irrational, but it's how we think, and this kind of bias should have a name.
It's related to our double standard about predictions for purely mechanical systems, and predictions for systems that involve human decisions. Very reliable predictions can be made about both kinds of systems, but we're only comfortable with thinking of the first kind of predictions as inevitable. The sun will inevitably increase its output: fine. Humans will inevitably increase CO2 levels: heresy. It's as if we were fundamentally skeptical about the causal modelability of the human realm.
I ran into this in Sunday's Boston Globe:
https://www.bostonglobe.com...
In some situations, like emergencies, it really helps to have backup plans. But if the situation doesn’t require it, a backup plan can actually be detrimental, according to new research. In several experiments, people were given a mental task, with the prospect of a reward for high performance. Before starting the task, some of these people were asked to think about how, if they didn’t get the reward, they could obtain something comparable. These people subsequently performed worse on the mental task. The effect appears to be explained by lower motivation, not distraction.
Shin, J. & Milkman, K., “How Backup Plans Can Harm Goal Pursuit: The Unexpected Downside of Being Prepared for Failure,” Organizational Behavior and Human Decision Processes (July 2016).
I believe the audience's conscious or subconscious reaction is this: If this man truly believes all this crazy shit, why isn't he right now on a huge mission trying to alter the future and prevent his apocalyptic scenario from happening? Maybe he's fine with this apocalypse? (Compare to Eliezer Yudkowsky: he identified an apocalyptic scenario he deemed high probability, and went on to dedicate his life to preventing it.)
Upvoted for subtle irony.
Thus, humans have been "designed" to view any discussion, even a discussion of pure fact, in the context of how it would affect the balance of power between factions.
Because of the way we're designed, any intellectual discussion does implicate factional commitments. But it's intellectually dishonest to avoid the factual claims because of proponents' ulterior motives.
But there's another kind of dishonesty that comes with denying such motives, or with claiming that those motives are never a fitting topic for discussion.
Do you view this taboo as yet another example/consequence of how evolution has selected genes that lead to winning tribal coalition power struggles more than it has selected genes that lead to accurately determining abstract truths?
Thus, humans have been "designed" to view any discussion, even a discussion of pure fact, in the context of how it would affect the balance of power between factions.
Sheesh, tough crowd. On a Friday yet. My comment below arguably qualifies me as a non-acolyte, and I'm not aware of anything Robin's done in the current context which could remotely be considered evil. Quite the contrary (assuming his book's anything like his blog entries on ems) - he's shed a lot of light on a very important topic.
But they did meet me, for several hours.
You know what comes off as really evil? Writing a series of posts about how things you do, contrary to appearances, aren't really evil at all.
This wouldn't be a concern if they met you. Then they would know for sure that you are in favor of the outcome and can be relied on to frame everything to put it in a positive light.
I've also seen the opposite reaction: rather than assuming that a neutral analysis of a phenomenon is endorsing it, people sometimes assume that the neutral analysis is complaining about it.
E.g., some analyses by male authors of how dating markets are more difficult for men were interpreted as "whining" even though they made no normative claims.
Reminds me of http://andrewgelman.com/201... , where Andrew Gelman and a number of his commenters were shocked and appalled at someone applying economic analysis to torture.
Absolutely correct.
I recall that when Robert Greene wrote The 48 Laws of Power, he was attacked simply for documenting patterns of power in historical and modern times. His sin was merely to relay that knowledge to the general public in a cold, detached fashion.
There's a certain irony in how you're presenting a neutral analysis of the problem in this blog post without presenting a solution.
That said, could it be as simple as people on average disliking pessimistic views and preferring optimistic ones?