When we form opinions on topics, the depth of our efforts varies. On some topics we put in no effort, and hold no opinions. On other topics, we notice the opinions of standard authorities, and adopt those. We often go further to learn some of the arguments offered by such authorities, and mostly accept those arguments.
Sometimes we feel contrarian and make up an opinion we know to be contrary to standard ones. Sometimes we instead seek out non-standard authorities that we more respect, and adopt their opinions and maybe also arguments. Contrarian authorities often explicitly mention and rebut the arguments of standard authorities, and sometimes we also learn and adopt those counter-arguments.
Sometimes we try to learn about many arguments on a topic from many sides, and then try to compare and evaluate them more directly, paying less attention to how much we respect their sources. Sometimes we generate our own arguments to add to this mix. Sometimes we do this alone, and sometimes in collaboration with close associates. Compared to the other approaches mentioned above, this last set of approaches can be described as more “thinking for ourselves”.
In general, arguments try to draw conclusions from widely accepted claims and assumptions. So to dig deeper, we can recurse: take the claims X used to support arguments on topic T, and treat some of those X as new topics to consider in this same way.
Our associates are interested in judging how well we think, and we are eager to impress them. As all of these effort levels are appropriate in various practical cases, in principle our associates should want to judge our abilities at all of these different levels. However, we tend to see deeper thinking as harder, as the place where our thinking skills matter more, and as the more common practical task, since authorities haven’t spoken to most of the practical issues we face. So we are more eager to demonstrate, and to judge, abilities to do deeper thinking.
Thus we all tend to present ourselves as thinking more deeply than we actually do. Not arbitrarily deeply, which isn’t believable. But maybe as deep as is plausible in a given case. So we tend to present ourselves, when possible, as “thinking for ourselves”.
Note that this thinking-for-yourself approach plausibly produces less accurate and reliable beliefs on each particular topic. Most people are usually less able to integrate info and arguments into an accurate total opinion than is the collective action of the usual authorities. Even so, showing off your abilities, and improving them via practice, often matters more to us than accuracy on each topic. We might be collectively better off due to us all doing more thinking, but this isn’t obvious.
We could of course get both accuracy and practice in thinking if we did our own analysis, but then adopted authority opinions even when those disagreed with our personal analysis. But we rarely do that, as we consider it “insincere” and “two-faced”.
Thinking-for-yourself, however, has a big problem on topics where there are orthodox opinions, opinions on which all good thinking people in some community are supposed to agree. The problem is that thinking for yourself is usually noisy and context-dependent. That is, the process of thinking for ourselves doesn’t consistently produce the same outputs given the same inputs. Many random factors, re what arguments we notice and how we frame or order our thoughts, often substantially influence our conclusions. And thus people who think for themselves must be expected to reach contrarian conclusions a substantial (~5-50%) fraction of the time.
Note that people who want to create the impression that they think for themselves, without putting in the effort of actually doing so, can just randomly adopt contrarian conclusions at roughly this rate. And this does seem to be the strategy of most ordinary people, who have quite high rates of variation in their opinions, and yet who don’t seem to think very deeply. Their opinions even vary widely across time, as they usually can’t recall the random opinions that they generated even a few months before.
However, this rate of variation is a much bigger problem for people whose opinions are more prominent. If someone publicly states their think-for-themselves conclusions on twenty orthodox-adjacent topics, they should expect an average of ~1-10 heresy-adjacent opinions in that set. Yet often a prominent enough person publicly seeming to endorse even a single heresy is enough to get them cancelled in a community, such as by losing their job, or any chance for advancement or entry into that community. What to do?
One traditional solution has been for the usual authorities to present themselves as focused on particular topics associated with their positions of authority, and not thinking for themselves on most other topics. Especially re most orthodox topics. This was long the usual position of CEOs, for example. Another traditional solution was for scholars, who do often specialize as thinkers on topics at least adjacent to orthodox ones, to speak esoterically, i.e., evasively in public, and only frankly in private to other scholars.
In our society today, however, a great many people present themselves as:
- relatively prominent, and thus worth cancelling,
- largely thinking for themselves, even on orthodox-adjacent topics,
- offering their opinions in public on many such topics, and yet
- none of these public opinions are heresies.
In fact they often express outrage when they encounter another such person expressing even a single heresy. But if they offer non-heresy opinions on twenty such topics, it is quite hard to believe that all those opinions are a random sample of the opinions they would generate by thinking for themselves; the natural rate of opinion variation due to thinking for yourself is just too high to produce such a result. Such people are probably being selective in what they say, or deceiving themselves about how much they actually think for themselves.
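As a rough back-of-envelope check (my own illustration, not a claim from the post), suppose each independently formed opinion lands on a heresy with some fixed probability in the ~5-50% range assumed above, and that the twenty topics are independent. Then the expected number of heresies, and the chance that all twenty public opinions avoid heresy, look like this:

```python
# Back-of-envelope sketch: expected heresy count, and the chance of zero
# heresies, across 20 independent opinions, for a few assumed heresy rates p.
for p in (0.05, 0.25, 0.50):
    expected = 20 * p            # mean number of heresy-adjacent opinions
    prob_none = (1 - p) ** 20    # probability that none of the 20 are heresies
    print(f"p={p:.2f}: expect {expected:.0f} heresies, P(none) = {prob_none:.6f}")
```

Only at the very bottom of that range (p near 0.05) is a clean sweep of twenty non-heretical opinions at all likely; at mid-range rates it is already well under one percent, and at the top of the range it is about one in a million.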
And thus we reach the thesis in my title: thinkers must be heretics. If you see people with many opinions, none of which are heretical, this just can’t be a random sample of topics on which they are mostly thinking for themselves. And if you plan to manage a herd of deep thinkers in our world today, people who spend a lot of time showing off how well they can think for themselves, you need to either keep them away from orthodox-adjacent topics, or keep their discussions internal and private; don’t let them speak on such things in public. Or be securely insulated from cancellation, if that’s really possible.
Note that there might exist a minority of thinkers good enough that their think-for-themselves estimates are actually more accurate than the official opinions of the usual authorities. After all, existing institutions often allow entrenched powers to, for a time, resist switching to better estimates. In this case, we might coordinate to make such better estimates more visible, such as via prediction markets. But such entrenched powers have so far prevented this reform.
Note also that I’ve avoided listing particular heresies here, for fear of seeming to endorse them. Which suggests how strong social pressures regarding them may be.
Added 1Dec: Here I describe myself as a “think for myself polymath.”