12 Comments

You're right, of course, but it highlights an uncomfortable truth: we don't help other people to make them feel better, but rather to make ourselves feel better.

While I agree with the comments, I also think that we don't want to signal that we will spend money to help distant victims, since this signal has to be continually maintained by spending more money, and potentially demands money without bound. We want to signal to others that we do what's right, and more importantly we want to signal that to ourselves; but we are vague about what we mean by "do what's right," and can fairly easily be made to think that it includes helping distant victims.

"The death of one man is a tragedy. The death of millions is a statistic. ..."

--Stalin

On the other side of the coin are celebrity deaths.

Many people care about a celebrity's death. However, a common response to these people is to point out that the celebrity is only one person, and that there are many people in the world whose deaths one should care about.

Such a response can then be understood both as a way to signal one's greater scope of compassion and as a way to diminish the need for actual compassion toward the celebrity.

Maybe singularity7337 was trying to make a point about far mode?

This is my reading as well.

This is how I feel...

Although that's still a bit of craziness: a problem of 250,000 deaths feeling equal to a problem of 25,000 deaths.

I agree with the facts addressed in this statement, but unless the end goal is to make SIAI look annoying, it's probably wiser to wait for a better excuse to plug SIAI. "A fanatic is someone who can't change his mind and won't change the subject."

Seriously--the Singularity could have prevented the quake. There's no reason a 7.0 should kill anyone with modern technology. Singularity brings about more technology. Donate to SIAI.

2/3rds sounds less like random fluctuation and more like a real solution.

If a situation causes someone distress, and he can rectify that situation so as to relieve his distress, he is inclined to do so. But if his best effort will do little to relieve his distress, he is less inclined to act.

If I find out that a group of people are in danger of dying--which causes me distress--but I can save them--doing which would relieve my distress--then I am strongly inclined to save them.

On the other hand, suppose I find out that a much larger group of people are in danger of dying--again causing me distress (about the same amount, since my emotions are not geared to make fine quantitative distinctions)--and I can save only a small percent of them: the same absolute number as in the first scenario, but leaving most of the group still to die. My distress will hardly be relieved, because of the very large number that will still die, so my motivation will be much less.

In the first scenario I *solve the problem*; I can feel good about the outcome, and about myself. In the second scenario, at most I *barely make a dent*; the problem will remain unsolved, and I will still feel bad about the outcome, and about the fact that I was not able to *solve the problem*. My psychic economy is more favorable to acting in the first scenario, though in an abstract (far-mode) view this is irrational.

The issue is one of framing. Saving 20,000 lives feels like failure when it's set against 270,000 ongoing deaths. The setting underlines the seeming futility of even trying to help. Saving 10,000 lives feels like success when it's set against 5,000 remaining deaths. 2/3 of the problem is "solved" rather than less than 7% of the problem.
