Signaling bias in philosophical intuition

Intuitions are a major source of evidence in philosophy. Intuitions are also a significant source of evidence about the person having the intuitions. In most situations where onlookers are likely to read something into a person’s behavior, people adjust their behavior to look better. If philosophical intuitions are swayed in this way, this could be quite a source of bias.

One first step to judging whether signaling motives change intuitions is to determine whether people read personal characteristics into philosophical intuitions. It seems to me that they do, at least for many intuitions. If you claim to find libertarian arguments intuitive, I think people will expect you to have other libertarian personality traits, even if on consideration you aren’t a libertarian. If consciousness doesn’t seem intuitively mysterious to you, one can’t help but wonder whether you have a particularly unnoticeable internal life. If it seems intuitively correct to push the fat man in front of the train, you will seem like a cold, calculating sort of person. If it seems intuitively fine to kill children in societies with pro-child-killing norms, but you choose to condemn it for other reasons, you will have all kinds of problems maintaining relationships with people who learn this.

So I think people treat philosophical intuitions as evidence about personality traits. Is there evidence of people responding by changing their intuitions?

People are enthusiastic to show off their better-looking intuitions. They identify with some intuitions and take pleasure in holding them. For instance, in my philosophy of science class the other morning, a classmate proudly dismissed some point, declaring, ‘my intuitions are very rigorous’. If his intuitions are different from most, and average intuitions actually indicate truth, then his are especially likely to be inaccurate. Yet he seems particularly keen to talk about them, and chooses positions based much more strongly on his own intuitions than on others’.

I see similar urges in myself sometimes. For instance, consistent answers to the Allais paradox are usually so intuitive to me that I forget which way one is supposed to err. This seems good to me. So when folks seek to change normative rationality to fit their more popular intuitions, I’m quick to snort at such a project. Really, they and I have the same evidence from intuitions, assuming we believe one another’s introspective reports. My guess is that we don’t feel like coming to agreement because they want to cheer for something like ‘human reason is complex and nuanced and can’t be captured by simplistic axioms’ and I want to cheer for something like ‘maximize expected utility in the face of all temptations’ (I don’t mean to endorse such behavior). People identify with their intuitions, so it appears they want their intuitions to be seen and associated with their identity. It is rare to hear a person claim to have an intuition that they are embarrassed by.

So it seems to me that intuitions are seen as a source of evidence about people, and that people respond at least by making their better looking intuitions more salient. Do they go further and change their stated intuitions? Introspection is an indistinct business. If there is room anywhere to unconsciously shade your beliefs one way or another, it’s in intuitions. So it’s hard to imagine there not being manipulation going on, unless you think people never change their beliefs in response to incentives other than accuracy.

Perhaps this isn’t so bad. If I say X seems intuitively correct, but only because I guess others will think seeing X as intuitively correct is morally right, then I am doing something like guessing what others find intuitively correct. Which might be a bit of a noisy way to read intuitions, but at least isn’t obviously biased. That is, if each person is biased in the direction of what others think, this shouldn’t obviously bias the consensus. But there is a difference between changing your answer toward what others would think is true, and changing your answer to what will cause others to think you are clever, impressive, virile, or moral. The latter will probably lead to bias.

I’ll elaborate on an example, for concreteness. People ask if it’s ok to push a fat man in front of a trolley to stop it from killing some others. What would you think of me if I said that it at least feels intuitively right to push the fat man? Probably you lower your estimation of my kindness a bit, and maybe suspect that I’m some kind of sociopath. So if I do feel that way, I’m less likely to tell you than if I feel the opposite way. So our reported intuitions on this case are presumably biased in the direction of not pushing the fat man. So what we should really do is likely further in the direction of pushing the fat man than we think.

  • Robin Hanson

    So, I guess we should also infer that eating babies isn’t quite as bad as we make it seem.

    • http://profiles.google.com/philoscase R S

      Yes. And mothers eating their aborted fetuses should be considered no more objectionable than mothers consuming their own placentas.

      • PJF

        Katja’s argument is about the margins, not absolutes – it is credible that the badness of baby/fetus eating might at least be slightly overstated for, ultimately, reasons of social appearance.

  • AspiringRationalist


    Perhaps this isn’t so bad. If I say X seems intuitively correct, but only because I guess others will think seeing X as intuitively correct is morally right, then I am doing something like guessing what others find intuitively correct. Which might be a bit of a noisy way to read intuitions, but at least isn’t obviously biased.
    This sounds a lot like a Keynesian beauty contest, which, although it does not have biases that are predictable to someone unfamiliar with the culture in which it takes place, tends to amplify fairly random cultural fashions.

  • Anonymous

    The obvious solution: A new philosophy-oriented 4chan board.

  • Rasmus Berggren

    A Hansonian explanation of our conflicting intuitions in the trolley problem:
    We intuitively accept the persons at the wheel/lever as leaders, and we accept that leaders will have to make hard decisions. In the situation with the fat man, there is no obvious leader. In the long run, allowing non-leaders to make hard decisions for the group leads to chaos, so it must be punished. //Rasmus

    • Michael Vassar

      That’s GREAT!

    • http://juridicalcoherence.blogspot.com/ Stephen R. Diamond

      allowing non-leaders to make hard decisions for the group leads to chaos, so it must be punished.

      This does correspond well to my dominant introspection: it’s not a decision I have the right to make.

  • Fatman

    The train-derailing fat man example is not very good.  
    What are the chances that pushing an actual man onto the tracks would derail an actual train, or whatever? Very low, and that totally colors people’s responses.

    It’s much better to present the problem about switching the train to another track with only one man tied to the tracks instead of five.

    • Alexis Gallagher

      You have a point. Interpretations are likely distorted by the unlikelihood that a collision with a fat man would derail a train. 
      However, I think the alternative you suggest introduces new distortions, or perhaps just removes distortions that are intentional in the first example: namely, that your culpability is increased by the direct physical contact used if you push the fat man with your own hands.

  • Richardsilliker

    What about a fat woman?  Just checking.

  • http://juridicalcoherence.blogspot.com/ Stephen R. Diamond

    If his intuitions are different from most, and average intuitions actually indicate truth, then his are especially likely to be inaccurate.  

    The harm from irrational beliefs spreads by analogy, so the almost universal trust in the bare intuition that we have conscious experience engenders a general uncriticalness toward intuition.

    I discuss a critical attitude toward intuitions in “The raw-experience dogma: Dissolving the qualia problem.” ( http://tinyurl.com/8gh9vbt )

  • http://juridicalcoherence.blogspot.com/ Stephen R. Diamond

    For instance consistent answers to the Allais paradox are usually so intuitive to me that I forget which way one is supposed to err. This seems good to me.

    If you’re interested in doing philosophy of ethics, I think it’s bad, in that you would seem to have lost introspective access to a conflicting intuition. Intuitions settle nothing philosophical, but they start everything. Establishing a philosophical position requires explaining conflicting intuitions, and if you don’t experience their force, you can’t grasp them. ( http://tinyurl.com/cxjqxo9 )

    The philosophical landscape varies, yet it doesn’t seem to me that philosophers are heavily biased when they report their intuitions, because reporting them subtly and articulately is an important measure of philosophical skill, and that skill is usually more important to signal.

    We should all share the same important intuitions – at least that’s the reigning presumption. You would lose “status” for failing to acknowledge, say, the contours of nonconsequentialist morality (or of consequentialist morality) as competing intuitions. There’s enough room for philosophers to suck up for status in their conclusions that there’s little reason for them to fudge their intuitions.

  • Pjd6896

    If others’ psychological reactions are not based on reality, how moral is it not to discount this fact? In a trolley problem, is your action changed if people will believe the fat man jumped of his own volition rather than that you pushed him? Seems to me that what people think should not influence the morality of a decision. If perception changes morality, then morality doesn’t exist… only psychological utilitarianism that doesn’t need reality.

  • DanielHaggard

    Really interesting post –  but not sure I understand your point correctly…  Can I try to paraphrase to see if I’m getting this?

    1) Intuitions are evidence of truth
    2) Intuitions are also evidence of the nature of the people who have them
    3) People have an incentive to misreport their intuitions in order to appear better to their peers.
    4) People likely change their intuitions unconsciously in circumstances where there is a strong incentive to do so.
    5) In circumstances where we have a strong incentive to unconsciously change our intuitions, we should assume that the truth lies further in the opposite direction than the direction suggested by our intuitions.

    If this is a fair paraphrase – then I’m struggling with 5).  Robin’s reply below about eating babies seems a straightforward counter example.  I think what you said earlier about people being biased toward consensus seems to explain why it is a counterexample.  There will be many cases where my selfish signalling motives will coincide with agreeing with the majority who happen to have settled on the right answer.  Thus, there will be many cases where there are strong incentives to signal selfishly, that don’t obscure the evidential (with respect to truth) value of the intuitions in question.  So there is no a priori reason to suppose that our intuitions are biased incorrectly.

    • Katja Grace

      Thanks. Almost a correct paraphrase. Instead of 5), we should assume the truth lies further in the less good looking direction.


  • Paul Tiffany

    So philosophy is mere sophistry…


  • Yaobviously

    You’re essentially arguing that people treat philosophical problems as strategy games and that they choose their responses to signal the personality traits they believe others will associate with their position.

    If you expanded your framework to include the techniques used to generate answers, i.e., how you come up with your answer signals traits that range from good to bad, I would say you’ve summarized exactly what’s going on.

    • Yaobviously

      The question then is, do we even have moral intuitions that are independent of the signalling game being played?

  • http://juridicalcoherence.blogspot.com/ Stephen R. Diamond

    Robin’s reply below about eating babies seems a straightforward counter example.  I think what you said earlier about people being biased toward consensus seems to explain why it is a counterexample.  There will be many cases where my selfish signalling motives will coincide with agreeing with the majority who happen to have settled on the right answer.

    Assuming you’re at the average, selfish signaling needs will bias you to overstate. Those cases where selfish signaling needs coincide with agreeing with the majority would be those cases where you’re substantially below average yourself. But since they’re a minority of the instances, the trend is toward overstatement.

    Do you have any doubt that people would overstate just how bad they think eating babies is?

  • Michael Wiebe

    This is relevant: http://en.wikipedia.org/wiki/Social_desirability_bias


  • Alexis Gallagher

    This is quite interesting and plausible. Yes, reported moral intuitions are likely distorted by the desire to look good. But does this really imply that reported moral intuitions are biased away from “moral truth”, in the direction of looking good?

    I think there are two distinctions here that it’s helpful to spell out. The first distinction is whether signaling affects our reported intuitions consciously or unconsciously:
    1. Lying: bias in reported intuition, in comparison with actually felt intuition.
    2. Self-delusion / socialization: bias in actually felt intuition, in comparison with what the felt intuition would be in the absence of social pressure.

    The second case is the interesting one. As your discussion hints, it’s a bit of a judgment call whether the second case is best interpreted as the unconscious idiocy of self-delusion or the unconscious intelligence of taking into account others’ opinions.

    And I think that highlights the second distinction here, which is whether signaling effects distort or correct intuitions that are shaped in the absence of signaling effects:
    1. Inner voice: our moral intuition functions mostly accurately outside of social pressures.
    2. Collective intelligence: moral intuitions are most accurate when they are shaped by taking into account others’ opinions of those intuitions.

    What’s interesting to me here is that this question seems independent of the first one, since both the inner voice and collective intelligence scenarios are tenable whether the social pressures act consciously (via lying) or unconsciously (via self-delusion/socialization).

  • Anonymous Coward

    I used to think people deliberately lied about their intuitions, but I now think it’s mostly unconscious.  People have evolved to actually believe irrational things they would otherwise have to pretend to believe.  It takes less effort than lying, and comes across as more genuine, because it is.  This particular kind of cognitive bias seems to correlate negatively with autism, causing some of the social difficulties associated with autism.

    If you’re the kind of person who reads Overcoming Bias, the big lesson to learn here is not that the truth is less socially acceptable than your beliefs.  Rather, it’s that you need to make more conscious effort to lie about your true beliefs in order to succeed socially among competitors who do this instinctively and unconsciously.

    Ask yourself: do you want to say rational things, or say things that it’s rational to say?  You can’t have it both ways.