I used to think people deliberately lied about their intuitions, but I now think it's mostly unconscious.  People have evolved to actually believe irrational things they would otherwise have to pretend to believe.  It takes less effort than lying, and comes across as more genuine, because it is.  This particular kind of cognitive bias seems to correlate negatively with autism, causing some of the social difficulties associated with autism.

If you're the kind of person who reads Overcoming Bias, the big lesson to learn here is not that the truth is less socially acceptable than your beliefs.  Rather, it's that you need to make more conscious effort to lie about your true beliefs in order to succeed socially among competitors who do this instinctively and unconsciously.

Ask yourself: do you want to say rational things, or say things that it's rational to say?  You can't have it both ways.

This is quite interesting and plausible. Yes, reported moral intuitions are likely distorted by the desire to look good. But does this really imply that reported moral intuitions are biased away from "moral truth", in the direction of looking good?

I think there are two distinctions here that it's helpful to spell out. The first is whether signaling affects our reported intuitions consciously or unconsciously:

1. Lying: bias in the reported intuition, in comparison with the actually felt intuition.
2. Self-delusion / socialization: bias in the actually felt intuition, in comparison with what the felt intuition would be in the absence of social pressure.

The second case is the interesting one. As your discussion hints, it's a judgment call whether it is best interpreted as the unconscious idiocy of self-delusion or the unconscious intelligence of taking others' opinions into account.

And I think that highlights the second distinction, which is whether signaling effects distort or correct the intuitions we would form in their absence:

1. Inner voice: our moral intuition functions mostly accurately outside of social pressures.
2. Collective intelligence: moral intuitions are most accurate when they are shaped by taking into account others' opinions of those intuitions.

What's interesting to me here is that this second question seems independent of the first, since both the inner-voice and collective-intelligence scenarios are tenable whether the social pressures act consciously (via lying) or unconsciously (via self-delusion/socialization).

You have a point. Interpretations are likely distorted by the unlikelihood that a collision with a fat man would actually derail a train. However, I think the alternative you suggest introduces new distortions, or perhaps just removes distortions that are intentional in the original example: namely, that the direct physical contact involved in pushing the fat man with your own hands increases your culpability.

Katja's argument is about the margins, not absolutes - it is credible that the badness of baby/fetus eating might at least be slightly overstated for, ultimately, reasons of social appearance.

Thanks. Almost a correct paraphrase. Instead of 5), we should assume the truth lies further in the less-good-looking direction.

"allowing non-leaders to make hard decisions for the group leads to chaos, so it must be punished."

This does correspond well to my dominant introspection: it's not a decision I have the right to make.

"Robin's reply below about eating babies seems a straightforward counterexample. I think what you said earlier about people being biased toward consensus seems to explain why it is a counterexample. There will be many cases where my selfish signalling motives will coincide with agreeing with the majority who happen to have settled on the right answer."

Assuming you're at the average, selfish signaling needs will bias you to overstate. Those cases where selfish signaling needs coincide with agreeing with the majority would be those cases where you're substantially below average yourself. But since they're a minority of the instances, the trend is toward overstatement.
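
A toy numerical version of this argument (the distribution, scale, and reporting rule below are invented purely for illustration):

```python
import random

random.seed(0)

# Toy model (made up for illustration, not from the post): each person has a
# true felt intuition about how bad some act is, on a 0-10 scale. Reporting
# less condemnation than the perceived consensus looks bad, so everyone
# reports at least the consensus value, and nobody understates.
N = 100_000
PERCEIVED_CONSENSUS = 5.0

true_intuitions = [random.gauss(5.0, 1.5) for _ in range(N)]
reports = [max(t, PERCEIVED_CONSENSUS) for t in true_intuitions]

print(f"mean felt intuition:     {sum(true_intuitions) / N:.2f}")  # ~5.00
print(f"mean reported intuition: {sum(reports) / N:.2f}")          # ~5.60

# Only the below-average half inflates its reports (the cases where signaling
# coincides with agreeing with the majority), yet the population-wide mean
# still drifts upward: the "trend toward overstatement" described above.
```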

Do you have any doubt that people would overstate just how bad they think eating babies is?

The question then is, do we even have moral intuitions that are independent of the signalling game being played?

You're essentially arguing that people treat philosophical problems as strategy games, and that they choose their responses to signal the personality traits they believe others will associate with their position.

If you expanded your framework to include the techniques used to generate answers, i.e., how you come up with your answer signals traits that range from good to bad, I would say you've summarized exactly what's going on.

That's GREAT!

So philosophy is mere sophistry...

Really interesting post - but I'm not sure I understand your point correctly... Can I try to paraphrase to see if I'm getting this?

1) Intuitions are evidence of truth.
2) Intuitions are also evidence of the nature of the people who have them.
3) People have an incentive to misreport their intuitions in order to appear better to their peers.
4) People likely change their intuitions unconsciously in circumstances where there is a strong incentive to do so.
5) In circumstances where we have a strong incentive to unconsciously change our intuitions, we should assume that the truth lies further in the opposite direction than the direction suggested by our intuitions.

If this is a fair paraphrase, then I'm struggling with 5). Robin's reply below about eating babies seems a straightforward counterexample. I think what you said earlier about people being biased toward consensus seems to explain why it is a counterexample. There will be many cases where my selfish signalling motives will coincide with agreeing with the majority who happen to have settled on the right answer. Thus, there will be many cases where there are strong incentives to signal selfishly that don't obscure the evidential (with respect to truth) value of the intuitions in question. So there is no a priori reason to suppose that our intuitions are biased incorrectly.

If others' psychological reactions are not based on reality, how moral is it not to discount this fact? In a trolley problem, is your action changed if people will believe the fat man jumped of his own volition instead of you pushing him? It seems to me that what people think should not influence the morality of a decision. If perception changes morality, then morality doesn't exist... only a psychological utilitarianism that doesn't need reality.

"For instance, consistent answers to the Allais paradox are usually so intuitive to me that I forget which way one is supposed to err. This seems good to me."

If you're interested in doing philosophy of ethics, I think it's bad, in that you would seem to have lost introspective access to conflicting intuitions. Intuitions settle nothing philosophical, but they start everything. Establishing a philosophical position requires explaining conflicting intuitions, and if you don't experience their force, you can't grasp them. ( http://tinyurl.com/cxjqxo9 )
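
For reference, a minimal sketch of the consistency at stake in the Allais paradox, using the standard textbook payoffs (the exact dollar figures are just the conventional ones):

```python
def expected_value(lottery):
    """Expected payoff of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in lottery)

# Choice 1: a sure thing vs. a gamble.
lottery_1a = [(1.00, 1_000_000)]
lottery_1b = [(0.89, 1_000_000), (0.10, 5_000_000), (0.01, 0)]

# Choice 2: the same pair with a common 89% chance of $1M stripped out.
lottery_2a = [(0.11, 1_000_000), (0.89, 0)]
lottery_2b = [(0.10, 5_000_000), (0.90, 0)]

for name, lot in [("1A", lottery_1a), ("1B", lottery_1b),
                  ("2A", lottery_2a), ("2B", lottery_2b)]:
    print(f"{name}: expected value = ${expected_value(lot):,.0f}")

# The gap between B and A is identical in both choices ($390,000 in expected
# value), so any expected-utility maximizer answers both questions the same
# way. "Erring" is the common pattern of picking 1A (certainty) but 2B;
# consistent answers are 1A with 2A, or 1B with 2B.
```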

The philosophical landscape varies, yet it doesn't seem to me that philosophers are heavily biased when they report their intuitions, because reporting them subtly and articulately is an important measure of philosophical skill, which is usually the more important thing to signal.

We should all share the same important intuitions--at least that's the reigning presumption. You would lose "status" for failing to acknowledge, say, the contours of nonconsequentialist morality (or of consequentialist morality) as competing intuitions. There's enough room for philosophers to suck up for status in their conclusions that there's little reason for them to fudge their intuitions.

If his intuitions are different from most, and average intuitions actually indicate truth, then his are especially likely to be inaccurate.  

The harm from irrational beliefs spreads by analogy, so the almost universal trust in the bare intuition that we have conscious experience engenders a general uncriticalness toward intuition.

I discuss a critical attitude toward intuitions in "The raw-experience dogma: Dissolving the qualia problem." ( http://tinyurl.com/8gh9vbt )
