Blaming The Unlucky

A recent working paper finds that we call the same decision immoral when it leads to a bad outcome, but moral when it leads to a good outcome: 

Two studies investigated the influence of outcome information on ethical judgment. Participants read a series of vignettes describing ethically-questionable behaviors. We manipulated whether those behaviors were followed by a negative or positive consequence. As hypothesized, participants judged behavior as less ethical when it was followed by a negative consequence. In addition, they judged the behavior as more blameworthy and to be punished more harshly. Participants’ ethical judgments mediated their judgments of both blame and punishment. The results of the second experiment showed again that participants rated behavior as less ethical when it led to undesirable consequences, even if they saw that behavior as acceptable before they knew its consequences. Implications for both research and practice are discussed.   …

We show that outcomes of decisions lead people to see the decisions themselves in a different light, and that this effect does not depend on misremembering their prior state of mind. In other words, people will see it as entirely appropriate to allow a decision’s outcome to determine their assessment of the decision’s quality. … The tendency demonstrated in our studies might lead people to blame others too harshly for making sensible decisions that have unlucky outcomes. … Too often, we let ethically-questionable decisions slide for a long time until they result in negative outcomes, even in cases in which such outcomes are easily predictable. 

This makes morality look more like a social convention for who we can blame for what, rather than a direct guide to decision making. 

  • In moral philosophy, we often distinguish between two senses of ‘ought’: the subjective and the objective. Someone subjectively ought to do something if it is the right thing to do given their information, while someone objectively ought to do something if it is the right thing to do in light of all the facts (including how it will actually turn out). In conversational use, ‘ought’ is ambiguous between the two senses. Thus, asking people whether they think a certain action is more or less moral is ambiguous between two questions, and to a large degree the answers seem merely to reflect this ambiguity. I agree that there may well be more going on, but the authors don’t even seem to have recognized this basic ambiguity in their question, and this makes the results much more difficult to interpret.

  • Sam B

    This would explain why “rogue traders” are only caught after they’ve made massive losses. Can you think of a single Kerviel or Leeson who was caught, fired and hung out to dry after the bank noticed he was making far too much profit for them?

  • Torben

    Maybe we judge the outcome of a morally ambiguous act as the empirical answer to whether it was moral or not?

  • michael vassar

    It seems to me that the demand for direct guides to decision making is low while the demand for social conventions for who to blame is high. I generally think that it’s best to understand morality as largely the latter, in terms of both how it actually works and how it came to exist, but to try very hard not to be overly guided, at least in final analysis, by the question “would this make me blameworthy” when seeking a direct guide to decision making.

  • Floccina

    Interesting. Wisdom is doing now what will most likely bring good results in the future. Wisdom is often mixed up with morality.

  • Floccina

    BTW, one of the great things that distinguished John Wooden from most coaches was that he would discipline players for taking bad shots that went in.

  • Matt

    This would seem to partly explain why “it’s easier to ask for forgiveness than for permission” – at least if you turn out to be right.

  • Similarly, there’s a fine line between genius and stupidity.

  • BTW, it seems to me that the point of this study has been appreciated at least since Williams introduced the idea of moral luck back in the early ’80s.

  • Douglas Knight

    This reminds me of the earlier thread on Having to Do something Wrong. That convinced me that I have little idea what people mean by morality. The current study doesn’t seem like great evidence of anything to me; the simplest explanation seems to me to be some kind of bias, that the subjects aren’t trying hard to figure things out, not that they’re trying to figure out something different from me.

    Toby Ord,
    if that’s your opinion after reading the quoted paragraphs, try reading the first few pages of the paper.

  • Larry D’Anna

    I’m not sure this is an irrational bias in all circumstances. We judge certain forms of recklessness, such as drunk driving, as immoral, and punish them accordingly. But as outside observers, we don’t really know how reckless the drunk drivers were being. Not everyone has the same level of impairment at the same BAC. If all we know is the BAC, then we know they were being reckless, but we don’t know exactly how reckless. If we also know they crashed into a school bus, then we have more information. Sure, they could have just been unlucky, but we must raise our probability that they were being especially reckless (according to Bayes’ theorem). So we punish the drunk driver who crashes more severely than the one who doesn’t, because it’s more probable that he was being especially reckless.
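    Larry’s Bayesian point can be made concrete with a quick numerical sketch. All the probabilities below are hypothetical numbers chosen for illustration, not anything from the comment or the study; the point is only that observing a crash can substantially raise the posterior probability of high recklessness.

    ```python
    # Hypothetical sketch of the Bayes'-theorem argument above.
    # Prior belief about a driver stopped at a given BAC (assumed numbers):
    p_high = 0.2   # probability the driver was highly reckless
    p_mild = 0.8   # probability the driver was only mildly reckless

    # Assumed likelihood of a crash under each level of recklessness:
    p_crash_given_high = 0.30
    p_crash_given_mild = 0.05

    # Posterior probability of high recklessness after observing a crash:
    p_crash = p_high * p_crash_given_high + p_mild * p_crash_given_mild
    posterior_high = p_high * p_crash_given_high / p_crash

    print(round(posterior_high, 2))  # 0.6 — the crash raises 0.2 to 0.6
    ```

    Under these made-up numbers, the crash triples the probability that the driver was highly reckless, which is why punishing the crasher more harshly need not be irrational.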

  • Silas

    Along the lines of what Larry said, if we accept the (IMHO reasonable) assumption that ethically questionable actions are less bad, the better the subject is able to predict the future, then taking into account the result is not so unreasonable.

    For example, I might consider it morally acceptable for an award-winning marksman to try to shoot a gunman holding a hostage, but not for an unpracticed gun holder.

    That said, I think there is a bias, but it’s in overestimating the *magnitude* of “how stupid that choice looked *at the time*”.

  • DL

    What Toby Ord said hits the spot. Your conclusion “This makes morality look more like a social convention for who we can blame for what, rather than a direct guide to decision making” doesn’t seem to follow from anything. Of course the test subjects are going to mix moral judgments of the specific action (from its outcomes) with moral judgments on the moral agent. Way too much vagueness.

  • Alan Gunn

    I don’t have a cite, but there was an interesting study a few years back comparing judges’ and jurors’ decision-making in negligence cases. The one big difference that emerged was that judges were a lot better than lay people about not concluding that a decision was bad just because it had turned out badly in a particular case.

    And, turning to a really important subject, I remember Howard Cosell dismissing Al Michaels’ sensible argument about why it was good in a particular situation to pitch to the number eight hitter with first base open and the pitcher on deck by pointing out that the batter had gotten a hit. I never understood why Michaels, who knew vastly more about baseball than Cosell, put up with the guy.

  • Constant

    Participants in the study may believe that they themselves are fallible judges of ethics. If that is the case, then it may be rational for them to update their judgment in the light of new evidence relevant to an ethical judgment. Specifically, one thing that is relevant to an ethical judgment is what it is reasonable to expect as the outcome of an action. The more reasonable it is to expect a bad outcome from the action, the more likely it is that the action is ethically suspect. And our judgment about what it is reasonable to expect is in turn informed by our experiences of what actually happens. The more often we observe that A leads to B, the more reasonable we judge it to expect A to lead to B. And in this study, the participants did in fact observe (or think they observed) one case in which the action led to a bad outcome.

  • Dave

    The most amazing part of this blog is finding out what amazing things otherwise intelligent people can find to be surprised by. That makes this place a gem beyond price.

    Can you think of a single Kerviel or Leeson who was caught, fired and hung out to dry after the bank noticed he was making far too much profit for them?

    Having worked in the futures and options trade (not, happily, as a trader), I actually have seen someone fired for following unhedged trading strategies that turned out to be temporarily profitable. He was extremely surprised, particularly since he simultaneously received a sizeable bonus from his activities, but some of the old gray heads in the office understood. Bias evaporates like first frost when you have to make a hundred bets per day, every day.