Tyler Vid on Disagreement
We often like to ask lunch visitors what their most absurd view is (in the eyes of others). Alas, I have so many choices. On BloggingHeads, Tyler Cowen answers this for Will Wilkinson:
Tyler: My most absurd belief, perhaps, is the extent to which I think people should be truly uncertain about almost all of their beliefs. And it doesn’t sound absurd when you say it but I don’t on the other hand know anyone who agrees with it. … Take whatever your political beliefs happen to be. Obviously the view you hold you think is most likely to be true, but I think you should give that something like 60-40, whereas in reality most people will give it 95 to 5 or 99 to 1 in terms of probability that it is correct. Or if you ask people what is the chance this view of yours is wrong, very few people are willing to assign it any number at all. Or if you ask people who believe in God or are atheists, what’s the chance you’re wrong – I’ve asked atheists what’s the chance you’re wrong and they’ll say something like a trillion to one, and that to me is absurd, that even if you think all of the strongest arguments for atheism are correct, your estimate that atheism is in fact the correct point of view shouldn’t be that high, maybe you know 90-10 or 95 to 5, at most. So that maybe is my most absurd view. Most things are much more up for grabs than we like to say they are.
Will: Yeah, I agree with you.
Tyler: No, you can’t agree with me, because it’s absurd. I can agree with your absurd view, but you can’t agree with mine.
Will: I agree with you that things are more up for grabs than people think they are, but I have real problems with the idea that it’s either possible or desirable that people assign probabilities to all of their beliefs. I think it’s a weird violation of the actual computational constraints of the human mind, that we just don’t …
Tyler: Here, you are more of a philosopher than I am, and I’m more a Bayesian. I’m sure it’s possible. Now I’m not saying it’s desirable, I’m just saying I want people to do it in a lot of instances, maybe just for my aesthetic pleasure. I want to pin people down and get a sense for how sure they are, and interpret these probabilities as betting odds, if you want. Let’s say there are a lot of dying, starving children in India or sub-Saharan Africa, and you are offered a bet, and you know that the money won on these bets will go to feed these children and save their lives, and you have to name what odds you are going to bet at. And you can name a number. You want to name the best number you can because you want to save the lives of these children, so I’m not going to allow any evasion here. I don’t see why there is not always some pick of a number that’s better than a lot of other picks. You are not going to get it right, so computationally of course it’s hopeless. But look, you’ve got to give it your best guess.
Will: But why do you have to give it your best guess? There’s a practically infinite set of propositions that I could entertain, none of which I will ever entertain. But every now and then one of them will come up in a conversation. So all of a sudden I’m entertaining it. I’ve never devoted any time or energy to assigning a credal probability to that proposition. Why should I on the spot be able to say anything about it at all?
Tyler: Well, it depends on the question. If the question I’m asking is, "What are the odds that right now exactly 23 people in the city of Cleveland, no more, no less, are playing Scrabble?," that to me seems like a waste of time to consider. But if we take truly core beliefs like "Will the world end through nuclear proliferation?," "Is there a God?," "Are libertarians correct about economic policy?," and we simply want to ask people, "What do you think is the chance you’re correct?," and people are evasive, that bothers me, and I don’t think they can invoke the "Well, who ever bothers to think about that?" defense, which they could if I asked them about Aunt Milly in Cleveland playing Scrabble. But it’s not that; these are core issues.
Will: So tell me about how perfectly rational Bayesians ought to deal with disagreement. So suppose you say P and I say not P.
Tyler: That’s right, and let’s say we have equal background on the issue in question and I have no particular reason to think that I’m less biased on it than you are. I don’t think that you can make the Robin Hanson move and say that, well, because we disagree, I look at you and you look at me and we realize that we have to agree, because somehow we are equally competent truth-seekers on this issue. It seems to me that even if I can manage to agree with you, there are other people who will disagree with both of us who are equally competent truth-seekers as the two of us are, and you can’t agree with everyone. So like who’s the disagreer, am I disagreeing with them or are they disagreeing with me, who’s at fault? Philosophically, I don’t think there’s any way, using Bayesian reasoning or otherwise, to resolve that problem. I think that computationally it is insoluble. But I think we can still step back and say, look, no matter what we think, there are equally skilled people, or in reality actually smarter and more competent people on just about any question other than my personal biography, who know more about it than I do and who are smarter than I am, so my odds, from my point of view, of being correct I should winnow down really quite radically.
Will: So the implication of that then is that if you actually have a strongly idiosyncratic view, about anything, you must be irrational …
Tyler: I don’t think you must be irrational.
Will: because all these people who are at least as smart disagree with you, that ought to weigh very heavily against you. So any sort of purist in any ideology ought to see that almost everybody disagrees with them and become a moderate, right?
Tyler: If no one agrees with you, you should be quite worried. If only a small number of people agree with you, you still should be quite worried. I don’t think it’s a numbers game; whatever view you end up with doesn’t have to be a majority point of view, since reasons have weight, not just adding up whoever agrees with you. But you still ought to say at the end of the day, look, all those other people are against me; maybe I think I’m right with probability 57 to 43, but on any truly controversial question among intelligent people, you should never think it’s 95 to 5 in your favor. Most smart people I know have that kind of attitude.
Amazingly, I agree with all Tyler says in this interview except:
He says the chance our descendants survive past our Sun is less than one percent.
He mis-attributes to me the claim that disagreers must always instantly go to identical beliefs (he’d know better if he’d heard my standard talk on this).
He says Will can’t agree with his claim that we should all be a lot less certain of our opinions – if Tyler can hold the position so can Will.
Hat tip to David Killoren.
Added 27 June: Apparently Tyler thinks it OK to estimate a controversial claim at p = 1% as long as you see only a 60% chance "that is the correct p or even in the neighborhood of the correct p" and think it equally likely your "p estimate could go either up or down." Sigh – this is a loophole so large cosmologists would study it.
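To see why this kind of meta-uncertainty is a loophole rather than a defense, note that a Bayesian's effective betting probability is just the expectation of p over whatever uncertainty they have about p, so the point estimate does not stand apart from that uncertainty. A minimal sketch, with illustrative numbers of my own choosing (not Tyler's or Robin's): give p = 1% the 60% weight, and split the rest between candidates one factor of ten up and one factor of ten down.

```python
# Illustrative sketch (hypothetical numbers, not from the post): if you are
# uncertain which p is "correct," your effective betting probability is the
# expectation of p over that uncertainty, not your point estimate.

# Hypothetical distribution over candidate values of p: 60% weight on the
# point estimate of 1%, with the remainder split evenly between an
# "up" candidate (10%) and a "down" candidate (0.1%).
candidates = [(0.60, 0.01), (0.20, 0.10), (0.20, 0.001)]

effective_p = sum(weight * p for weight, p in candidates)
print(round(effective_p, 4))  # 0.0262 -- effective p is ~2.6%, not 1%
```

The point of the sketch: a spread that looks "equally likely to go either up or down" in multiplicative terms still drags the expected probability well away from the stated 1%, so stating "p = 1%, but I'm only 60% sure that's right" is not actually a coherent way to keep claiming 1%.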