14 Comments

Eliezer, what's wrong with that picture is that there is no base case to terminate the recursion. (Also, the addition of the word "sophisticated" risks the No True Scotsman fallacy.)
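(A toy illustration of the missing base case, purely hypothetical:)

```python
# Hypothetical sketch: each "sophisticated" majoritarian's opinion is defined
# only as the average of the other sophisticated majoritarians' opinions.
# No object-level judgment ever enters, so the recursion has no base case.

def spm_opinion(member, members):
    others = [m for m in members if m != member]
    # Defined purely in terms of the others' (equally ungrounded) opinions.
    return sum(spm_opinion(m, members) for m in others) / len(others)

# spm_opinion("a", ["a", "b", "c"])  # never terminates: RecursionError
```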


Good point, Vassar. Because of the nature of our political system, our most successful frauds and liars specialize in creating the impression that they represent the opinion of the majority and in influencing the opinion of the masses on pivotal issues, and many people make their living, in whole or in part, consciously or unwittingly, by helping the frauds and liars.


Doug, surely Hal's eye color would fall under his "strong reasons" exception.

Anne, I'm not sure real science lives up to your hopes for it.


I definitely don't think that an average person can improve the accuracy of their beliefs via PM, because the processes required to discover what the majority believes about some question are no less philosophically problematic than other epistemological processes. The proposed regime doesn't eliminate the possibility of bias; it just lets the PM outsource blame for his errors from himself to the majority, or more accurately, to his conception of majority opinion, which is no more constrained than his other beliefs.

If one wishes to be absolved of blame for one's errors, simply adopting physical determinism/chaos works much better than PM. If one wishes to act appropriately in a given situation, one is stuck using reason, whether one pretends to adopt PM, tries to adopt it, or not.

Presumably, for the average person in average circumstances, the degree of acceptance of consensus that is actually average is reasonably close to what has been fitness-maximizing during most of the relevant part of human evolutionary history.


I thought that the recursion/equilibrium/information cascade problem had been sufficiently belabored; of course the opinions averaged have to be object-level judgments. The members of a philosophical cabal could review the object-level evidence (including the opinions of non-cabal members as non-dispositive data points) on a particular question independently, record their answers, and then adopt the average opinion of the cabal. My claim is that those with the traits that could lead them to adopt Hal's philosophical majoritarianism would reliably outperform the population majority if they followed this procedure.
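A minimal sketch of the procedure, assuming each cabal member independently records a subjective probability for the proposition after reviewing the object-level evidence (all names and numbers below are hypothetical):

```python
# Hypothetical sketch of the cabal procedure: review the object-level evidence
# independently, record an answer, then everyone adopts the group average.

independent_estimates = {
    "member_1": 0.70,  # each value: that member's independent probability
    "member_2": 0.80,  # that the proposition is true
    "member_3": 0.60,
    "member_4": 0.74,
}

cabal_average = sum(independent_estimates.values()) / len(independent_estimates)

# Each member now discards their own number and adopts the common average.
final_credence = {member: cabal_average for member in independent_estimates}
print(f"Adopted credence: {cabal_average:.2f}")  # 0.71
```

The averaging rule itself (mean, median, majority vote) matters less here than the constraint that the inputs are independent object-level judgments rather than each other's averages.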


Could it be that perhaps Hal's proposal simply comes down to, "use science"?

Doug's comment:

we should agree with the opinions of the majority of the set of people who have analyzed enough evidence to form an independent, well-informed opinion.

sounds a lot like the kind of justification one would use for, say, accepting evolution rather than supernatural creation as the more likely explanation for the development of biological organisms on Earth.

It doesn't seem that there's any need to posit a new, special, overgeneralizing, and possibly misleading term like "majoritarianism" to describe a philosophy which has the primary feature of utilizing the net processing power of knowledgeable individuals in the world. That's what science does already -- or at least, that's what it's supposed to do.

Additionally, I don't see what's wrong with not having an opinion on a given question that comes up. If you don't have enough data, or enough high-quality data, to form a definitive (or at least best-fit) conclusion, it's probably best to leave your opinion space blank on the question at hand. It's fine to say that you think one outcome or conclusion is more likely than any other, or to assign relative probabilities to potential outcomes and conclusions based on what data you do have, but I don't see the point of "taking a side" just for the sake of being able to say something other than "insufficient data".

If all you know is that a bunch of people think something is a certain way, but you don't have any idea why they think that, I'd consider that data insufficient. Without knowing anything about where or how the people in question got their opinions, I cannot simply "decide" to agree with those opinions. I could say something like, "Well, X number of people believe this, which could very well mean that the belief is accurate", but that wouldn't mean I could convince myself to hold that same belief. Or do you not distinguish between saying you believe something and actually finding the evidence sufficiently convincing?

Now, if you are in the position of having to act based on coming to one conclusion or another in the short-term, whether you go with what you perceive as a "majority view" or not ought to be evaluated according to what the costs of being wrong are. It would be interesting to know what weight people place on information and opinions from particular sources when the situation is very high-stakes vs. lower-risk.
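One rough way to cash out that last point is an expected-cost comparison between deferring to the perceived majority and withholding judgment; the numbers below are invented purely for illustration:

```python
# Hypothetical expected-cost comparison: act on the perceived majority view,
# or withhold judgment until better data arrives. All numbers are made up.

p_majority_correct = 0.7    # assumed chance the perceived majority view is right
cost_if_wrong      = 100.0  # loss from acting on the majority view when it is wrong
cost_of_waiting    = 40.0   # loss from hedging or delaying the decision

expected_cost_defer = (1 - p_majority_correct) * cost_if_wrong  # 30.0
expected_cost_wait  = cost_of_waiting                           # 40.0

if expected_cost_defer < expected_cost_wait:
    print("Stakes are low enough to go with the perceived majority view.")
else:
    print("Stakes favor saying 'insufficient data' and waiting.")
```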


Consider this question: What color eyes does Hal Finney have?

I've never seen Hal Finney, so if I were asked to guess what color eyes Hal Finney has, I'd guess the most common eye color: brown. Most people haven't seen Hal Finney's eyes, so they'd also guess that Hal Finney's eyes are brown.

Now let's assume that Hal Finney's eyes are actually blue. Hal Finney sees that his eyes are blue when he looks in the mirror, and people who know Hal Finney see that his eyes are blue when they look at him. If you asked any of these people what color eyes Hal Finney has, they would answer blue. Should any of these people change their opinion because a majority of people would guess that Hal Finney has brown eyes? That would be silly.

Is this a counterexample to "philosophical majoritarianism"?

Maybe, maybe not. People who have seen Hal Finney's eyes have better evidence than those who have not seen them; they can reasonably deduce that the majority of people, given this better evidence, would change their opinion. Therefore, we should change our statement of philosophical majoritarianism somewhat: we should agree with the opinions of the majority of the set of people who have analyzed enough evidence to form an independent, well-informed opinion. In this case, we should agree with the majority of people who have looked at Hal Finney's eyes. In other cases, though, it can be a lot harder to decide who has the expertise required to make their opinion count.
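A minimal sketch of that restated rule, with hypothetical respondents: count only the opinions of people who have actually looked at the evidence (here, Hal Finney's eyes), and take the majority within that subset.

```python
# Hypothetical sketch of the restated rule: take the majority opinion only
# among respondents who have examined the relevant evidence first-hand.
# Names, flags, and answers are all invented for illustration.

from collections import Counter

responses = [
    # (respondent, has seen Hal Finney's eyes?, guessed eye color)
    ("stranger_1", False, "brown"),
    ("stranger_2", False, "brown"),
    ("stranger_3", False, "brown"),
    ("friend_1",   True,  "blue"),
    ("friend_2",   True,  "blue"),
]

informed = [color for _, has_seen, color in responses if has_seen]
if informed:
    answer = Counter(informed).most_common(1)[0][0]
else:
    # With no informed respondents, fall back to the overall majority guess.
    answer = Counter(color for _, _, color in responses).most_common(1)[0][0]

print(f"Informed-majority answer: {answer}")  # blue
```

The hard step, as noted above, is deciding who counts as "informed" in less clear-cut cases.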


So a Sophisticated Philosophical Majoritarian tries to adopt the average opinion of all other Sophisticated Philosophical Majoritarians.

(What's wrong with this picture?)


The sophistication of Hal's post is itself sufficient evidence to reject the assumption that Hal is average. Indeed, given the level of ability and epistemic humility required in order to become a philosophical majoritarian, PMs would be better-off relying on the average opinions of their fellows rather than those of the population as a whole.


I've added a bit to the post above.


Byrne and Eliezer, I thought Hal was clear on why he was willing to consider majoritarianism (ick, what a mouthful) itself to be an exception. But if there is a whole class of related topics, where it is not clear whether to go with the majority or with majoritarianism, yes, that seems more of a problem.


I second Byrne's question.


Obligatory Gödel-esque question: how do we react to the statement "80% of people are opposed to philosophical majoritarianism"? Or does it only apply to statements less meta than that?


Seeing that this majoritarianism rests on past empirical observation, we should simply look at past data and see what we can add to the equation to improve the fit of our opinion function on truth.

Boy, the general equilibrium effects of this would be disastrous if it ever caught on.
