Consider opinions distributed over a continuous parameter, like the chance of rain tomorrow. Averaging over many topics, accuracy is highest at the median and falls away at other percentile ranks. This is bad news for contrarians, who sit at extreme percentile ranks. To think you are right as a contrarian, you have to think your case is an exception to this overall pattern, due to some unusual feature of you or your situation, a feature that suggests you know more than they do.
But why would one presume the majoritarian position as one's prior? That's generally not how we understand priors. In particular, any argument that requires such a prior isn't compelling, because the person being addressed has no obligation to share it.
Another way to consider the argument: one should adopt the majoritarian position as one's prior for a given belief, but should update that prior based on evidence or arguments. How heavily should one weight that update? The best-case contrarian has some sort of epistemic skill: maybe they are more numerate, or deliberately study rational argument, or have some sort of intuitive rationality and a history of prior contrarian successes. If you use average, muddled common-sense thinking, you shouldn't weight any given piece of evidence all that heavily in your update.
All that probably boils down to: you wouldn't be a contrarian in the first place if you didn't believe yourself to have some particular justification in your heterodoxy, which is a rather obvious observation.
Which probably means you should only trust your own reasons for contrary beliefs if the methods used have prior validated successes, which, all things considered, is a rarer feat than it should be.
If there is a tradeoff between truth and other values, it should be easy to be more truthful than other people. Two of your three human examples of conformity traps are about such tradeoffs. The third, about journalist topics, is more murky.
It may be hard to be certain of your contrarian opinions on object-level issues, but when you argue for institutions for truth-seeking, it should be easy to be certain of that argument. Maybe everyone claims to care about the truth (though I doubt it), but that's cheap talk. Requiring academics to use lots of math is selected for something, and it's not truth.
If it's what people value, then maybe it's not a trap. But even if it is, the local optimum found by selection can probably be improved by using the truth.
Yeah, I brought that up, and he replied by noting that his link discusses the minority of salmon who survive.
Not to be too contrarian here, but most species of salmon have a 100% mortality rate during the inland river trip to spawn. For example, on the west coast of North America, all species die upon spawning. The advantage here is the survival of the species: the fertilized eggs, alevins, and fry have a much higher chance of surviving the winter in inland waters compared to the open ocean.
Not an expert but I think there has been chess contrarianism, at least to an extent. E.g. the hypermoderns who in the early 20th century popularized controlling the center remotely rather than directly via pawn occupation. Or the recent AI systems whose moves sometimes seem bizarre and unintuitive to even top human grandmasters.
Sure, but I guess I'm not seeing the relevance. They don't even actually mean to communicate that literal claim, so they might as well just be saying "our group is the best". It seems to me that this is just an issue of epistemic foundations. At some point you just need to trust your foundational epistemic judgements. You could be the crazy man preaching the end of the world because of severe schizophrenia, but no exercise of reason is able to help you there.
The point is there really isn't any going below your ability to understand and judge arguments. You can look for apparent conflicts or inconsistencies in those faculties, but you can't reason your way out of the problem if either those faculties are fundamentally broken or the world is designed to diagonalize against those who use them.
Any attempt to explain why it's OK to rely on your judgement will necessarily itself rely on it, so what's the point? You just have to hope you're the prophet, not the crazy man.
"Averaging over many topics, accuracy is highest at the median, and falls away for other percentile ranks."
What is the evidence that this is actually true? Sure, in a formal scenario the peak average accuracy of prediction markets may be at the median, but contrarianism is usually about missed questions, missing data, or other things "outside of the box". Its niche is exactly the informal scenario. Of course there's no such thing as chess contrarianism, for example.
But most everyone claims that their group just cares more about truth than the other groups do.
I guess I'm confused as to why this same analysis wouldn't apply to the problem of guessing whether a number is prime or composite. I mean, there are pseudoprime generators that produce a prime with very high probability, and for really huge pseudoprimes the best guess most people would have (if the situation were sufficiently explained) is that the number is prime.
But none of that is a relevant consideration for whether I should take the contrarian position if I discover a factorization. I've just found the clinching evidence (logical knowledge, so not well captured by a probability distribution), so who cares that most people didn't find it?
Or are you suggesting that you set out intending to be contrarian before you even see the arguments? But that then just depends on how often most people miss simple arguments.
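To make the pseudoprime point concrete, here's a minimal Python sketch (my own illustration, not from the thread): a Fermat test supplies the probabilistic "majority" verdict, while exhibiting an explicit factor is the kind of clinching logical evidence that overrides it.

```python
def fermat_probably_prime(n, bases=(2, 5, 7)):
    """Fermat test: True if a^(n-1) ≡ 1 (mod n) for all test bases."""
    return all(pow(a, n - 1, n) == 1 for a in bases)

# 561 = 3 * 11 * 17 is a Carmichael number: it fools the Fermat test
# for every base coprime to it (the bases above are chosen coprime).
n = 561
print(fermat_probably_prime(n))  # True: the probabilistic verdict says "prime"
print(n % 3 == 0)                # True: the explicit factor 3 proves composite
```

Once the factor is in hand, no number of passed Fermat tests matters; that's the sense in which a factorization is clinching rather than merely another vote.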
Seems to me that contrarians are often just people who are more willing to signal epistemic virtue at the cost of less signalling of loyalty or group membership. But that's just a cost/benefit calculation, and I don't see where this idea comes from that the median thing other people say is usually correct.
I give a supporting link where I make that claim.
Maybe a weaker effect, but it is still true even when beliefs influence each other.
Much of contrarianism is a matter of opposing the mode rather than the median. For example, a large fraction of the public believes that Columbus discovered that the world is round in 1492. In reality, the fact that the Earth is round was discovered around 2000 years earlier. The median answer to "when did people discover the world is round?" is likely to be earlier than 1492, but the mode is likely to be 1492. If contrarianism is a matter of being skeptical about the accuracy of the mode rather than the median, it's likely to be worthwhile.
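A toy numeric illustration of this mode-versus-median gap (the poll answers below are invented for illustration; negative numbers mean BC):

```python
from statistics import median, mode

# Hypothetical answers to "when did people discover the world is round?"
answers = [1492, 1492, 1492, 1492, 1492, -500, -300, -240, 150, 700, 1000]

print(mode(answers))    # 1492: the single most popular answer
print(median(answers))  # 1000: already well before 1492
```

Even with 1492 as the runaway most common answer, the scattered earlier guesses pull the median back toward the true (much earlier) date.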
> accuracy is highest at the median
This is mostly true only when people make up their minds independently. When they are influenced by others it is much less true, e.g. stock market bubbles. See e.g. Didier Sornette's work.
Any of the books about Warren Buffett talks about the need to be independent in your thinking rather than chronically contrarian.
I have no knowledge of the field, but perhaps salmon are just paying a price for finding a safe place to have their babies.
Well, the peacocks are generally supposed to be sexy to the peahens, not to the other peacocks, but... hey, close enough. :) I mean, it's not like everyone else is complaining about it, so... I wouldn't want to swim against the crowd...
My understanding was that salmon always died after mating, rather than living to mate again.