This bias is easily explained: we expect (in the sense of "demand") people to treat each other decently. One can't be sapient without having this duty. Thus, people close to the victim of a killer or rapist quite reasonably believe there's no excuse for the crime to have occurred -- a specific person is responsible and needs to be punished and/or made to pay restitution, so feeling outrage toward him/her is the only appropriate response.

In contrast, car wrecks (excluding those caused deliberately or by gross dereliction) are sufficiently unforeseeable (and the precautions that might have prevented them are sufficiently non-worthwhile) that there is nobody to blame: such things inevitably happen to a few. The same reasoning applies to lung cancer, if we credit the victim's own decision that smoking is enjoyable enough to be worth the risk to him/her. (If we don't, then it is self-inflicted.) In either case there is no reasonable target for outrage, except maybe "God".

Subsidiary to repugnancy bias and commission bias (bad things committed by an entity with apparent agency are more repugnant).

Hal:

"perhaps time spent aiming to avoid disasters involving super-intelligence might be better spent working to improve institutions to prevent war"

I would go even further and say that perhaps Strong AGI is the institution to prevent war and other disasters.

There are so many risks to human civilization, and most of them could be prevented in a posthuman environment...

Nancy:

"Aging is probably the biggest non-badguy damage happening in the civilized world."

Yep. Another case for moving things ahead.

Aging is probably the biggest non-badguy damage happening in the civilized world.

That's a good question, Nancy; I was wondering about the same thing. It is certainly possible that warfare will turn out to be the overwhelmingly biggest disaster facing humanity in the 21st century, and that perhaps time spent aiming to avoid disasters involving super-intelligence might be better spent working to improve institutions to prevent war. Of course, catastrophic future wars, armageddon, post-apocalyptic survival, etc, are among the oldest tropes in speculative fiction. So I'm not sure that the bias here applies, that we under-estimate the impact of war.

Eliezer, minds do have a disproportionately large impact, but at the same time, historically humanity has arguably been more harmed by non-minds than minds. This bias makes us focus too hard on the actions of minds.

I am trying to think of what future problems we might be ignoring because of this bad-guy bias. I suppose it comes down to the kinds of things Vedantam mentions, natural disasters and accidents. Maybe the real key to happiness and prosperity in the future is encouraging more people to buy insurance. That will not only protect them against certain harms, it will send price signals to avoid risky behavior.

"these populations still show bad guy bias, even to the point of explicit beliefs that it's somehow worse to be killed than to die in other similarly violent ways" -- Michael Vassar

Michael, are you saying you shouldn't be more upset if I kicked you in the shin than if you just bumped your shin on a railing?

I think war gets treated as more like a natural disaster than the result of ill intent. The number of Jews killed in the Holocaust is quite well known. The number of non-Jews killed in the Holocaust is fairly well known. The number of people killed by Hitler's invasions is generally not mentioned.

I could make the same point with respect to unFriendly AI; the ones you have to worry about aren't "evil" in the sense of carrying out deliberately malevolent deeds, but rather, the ones who don't care about your existence one way or another (and you are made of atoms that they could use for things they do care about).

Minds are more powerful and have a larger impact than just about anything else; smarter minds have larger impacts. People concerned with large impacts on the future have a natural cause to be concerned about extremely intelligent actors. And for obvious reasons, we shouldn't be worrying about those whose desires approach our own reflective equilibria, but rather those which are neutral, twisted-good, or (in the case of augmented uploads) evil.

Robin: Only in the sense that evolution or economic attractors are ill intentions. It's VERY hard to avoid anthropomorphizing AGI unless you keep those in mind as part of your default sample set of "optimizers".

Right, I think the point at the end is that this bias makes us too concerned about unfriendly AI, because it fits into this "evil actor" model. One might also point to the many criticisms raised here over Robin's "ems" scenario along these lines, that the ems would be evil (or at least selfish and/or uncaring) and do things that would wipe out the human race or worse. Both of these concerns direct us to focus on the motivations of intelligent and powerful beings as a principal threat to our happiness. This framework is much the same as how people today view the threat of terrorism.

Even more naturalistic threats, like global warming or resource exhaustion, tend to be interpreted in a model based on "bad guy" actions. It's nobody's fault, really, that the world may be running out of oil, or capacity to absorb CO2 - it's just an unfortunate fact of nature. Kind of an odd coincidence that it is happening just as a demographic transition is reducing population growth rates, as well as when technology is almost there for cheap work-arounds to these problems. If the world had had one order of magnitude more capacity for these resources then we'd probably have a much smoother time transitioning. But the issues are often framed such that we look for someone to blame, we see the problem as a result of bad behavior.

Yes the Friendly AI concern is not about ill intended creators, but it is about ill intended creations.

If you ask how much should victims be compensated, [we feel] victims harmed through actions deserve higher compensation.

Probably they are the only ones who deserve any compensation at all. For a person to deserve compensation, another person must be obligated to give them compensation. This other person is the person morally responsible for the harm. In the case of a natural disaster there is likely no person morally responsible for the harm, therefore no person obligated to give compensation to the victim. Therefore the victim is not entitled to compensation from any particular person. Therefore the victim is not entitled to compensation, period.

Nations tend to focus far more time, money and attention on tragedies caused by human actions than on the tragedies that cause the greatest amount of human suffering or take the greatest toll in terms of lives.

There are dissimilarities which may justify this choice of focus as correct. The term "nation" might refer to government. While it may be popular nowadays for government to help people in case of natural catastrophe, the "regular job" of the government is often considered to be to enforce the law -- i.e., go after bad guys, not go after earthquakes.

something clicks in our minds that makes the tragedy seem worse than if it had been caused by an act of nature, disease or even human apathy.

This is hard to measure. "Seem worse"? The way to measure this is surely by looking at reaction. And a dissimilarity in the cause may justify a dissimilarity in the reaction, even if the harm considered in isolation from the cause is equal.

We want to punish those who act and cause harm much more than those who do nothing and cause harm.

First, we should rationally punish those who are morally culpable, not merely those whose actions contributed causally to the harm. After all, it takes two to cause a head-on collision, but often only one of the participants is morally culpable (e.g. the person whose car was in the wrong lane). There is an important distinction to be made between those who are culpable and those who are not. And generally, those whose inaction "causes" a harm (in the sense that the person, by failing to act, fails to prevent it) are not culpable.

It would be hasty to call existing moral distinctions between those who are and are not culpable "biased". There are generally good reasons for these distinctions.

Eliezer, just so long as the communists don't build it, nor the next default bad guys that come after the terrorists. Or the nazis and whoever the default bad guys were before them.

Although you must admit, if they do build an unfriendly, superintelligent, recursively improving AI with nano-replicators, then the terrorists will have already won. Ha, beat that, smart guy! (I am imagining Eliezer on a cable news network interview, on the split screen, as the person on the other side of the split makes that point with smug condescension.) (I think it works for some values of "unfriendly" and "won.")

I am surprised that the people who are "not just reporters" need to name the bad guys.

Carl, I just seriously get asked that question all the time, and not just by reporters. "Islamic terrorists" are simply the default bad guys of our time.
