Prefer Contrarian Questions

Many people are attracted to authority. They are eager to defend what authorities say against heretics who say otherwise. This lets them signal a willingness to conform, and gain status by associating with higher status authorities against lower status heretics.

Other people are tempted to be contrarians. My blog readers tend more this way. Contrarians are eager to find authorities with which they disagree, and to associate with similar others. In this way contrarians can affirm standard forager anti-dominance norms, bond more strongly to a group, and hope for glory later if their contrarian positions become standard.

I haven’t posted much on disagreement here lately, but contrarians should be disturbed by the basic result that knowing disagreement is irrational. That is, it is less accurate to knowingly disagree with others unless one has good reasons to think you are more rational than they in the sense of listening more to the info implicit in their opinions.

Today I want to point out a way that contrarians can stay contrarians, taking an authority defying position they can share with like-minded folks and which might later lead to glory, while avoiding most of the accuracy-reducing costs of disagreement: be contrarian on questions, not answers.

Academia has well known biases regarding the topics it studies. Academia is often literature-driven, clumping around a few recently-published topics and neglecting many others. Academia also prefers topics where one can show off careful mastery of difficult and thus impressive methods, and so neglects topics worse suited for such displays.

Of course academia isn’t the only possible audience when selling ideas, but the other possible customers also have known topic biases. For example, popular writings are biased toward topics which are easy to explain to their audience, which flatter that audience, and which pander to their biases.

The existence of these known topic biases suggests how to be a more accurate contrarian: disagree with academia, the popular press, etc. on what questions are worth studying. While individuals may at times disagree with you on the importance of the topics you champion, after some discussion they will usually cave and accept your claim that academia, etc. has these topic biases, and that one should expect your topic to be neglected as a result.

Some academics will argue that only standard difficult academic methods are informative, and all other methods give only random noise. But the whole rest of the world functions pretty well drawing useful conclusions without meeting standard academic standards of method or care. So it must be possible to make progress on topics not best suited for showing off mastery of difficult academic methods.

So if your topic has some initial or surface plausibility as an important topic, and is also plausibly neglected by recent topic fashion and not well suited for showing off difficult academic methods, you have a pretty plausible contrarian case for the importance of your topic. That is, you are less likely to be wrong about this emphasis, even though it is a contrarian emphasis.

Now your being tempted to be contrarian on questions suggests that you are the sort of person who is also tempted to be contrarian on answers. Because of this, for maximum accuracy you should probably bend over backwards to not be contrarian on which answers you favor to your contrarian question. Focus your enjoyment of defying authorities on defying their neglect of your questions, but submit to them on how to answer those questions. Try as far as possible to use very standard assumptions and methods, and be reluctant to disagree with others on answers to your questions. Resist the temptation to too quickly dismiss others who disagree on answers because they have not studied your questions as thoroughly as you. Once you get some others to engage your question in some detail, take what they say very seriously, even if you have studied far more detail than they.

With this approach, the main contrarian answer that you must endorse is a claim about yourself: that you don’t care as much about the rewards that attract others to the usual topics. Most people work on standard topics because those usually give the most reliable paths to academic prestige, popular press popularity, etc. And honestly, most people who think they don’t care much about such things are just wrong. So you’ll need some pretty strong evidence in support of your claim that you actually differ enough in your preferences to act differently. But fortunately, your being deluded about this can’t much infect the accuracy of your conclusions about your contrarian topic. Even if you are mistaken on why you study it, your conclusions are nearly as likely to be right.

This is the approach I’ve tried to use in my recent work on the social implications of brain emulations. This is very contrarian as a topic, in the sense that almost no one else works on it, or seems inclined that way. But it has an initial plausibility as very important, at least if one accepts standard conclusions in some tech and futurist worlds. It is plausibly neglected as having negative associations and being less well suited for impressive methods. And I try to use pretty standard assumptions and methods to infer answers to my contrarian question. Of course none of that protects me from delusions on the rewards I expect to forgo by focusing on this topic.

Added 7Mar: People are already in the habit of pleasantly tolerating a wider range of opinion on which questions are important, both because differing values contribute, and because people tend to strongly overestimate the importance of the questions they work on personally.

  • Christopher Ziemian

    How confident are you that answering mainstream questions leads to prestige, rather than it merely being the case that most prestige lands in the hands of those who answer mainstream questions? If 90% of the prestige in a discipline goes to those studying topic X, and only 10% to topic Y, but 99% of researchers focus on topic X, then career advancement should be easier for Y-researchers than for X-researchers. Sure, most of your colleagues got their positions by researching mainstream ideas, but how many contenders researched the same things and didn’t make the cut? You made a name for yourself by researching odd ideas for which there was an underserved market; do you think even one-tenth as many people would read your blog if you hadn’t?
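    Spelling out the per-capita arithmetic behind this objection, using the commenter’s hypothetical numbers (90% of prestige to topic X, 10% to Y, 99% of researchers on X):

    \[
    \text{prestige per X-researcher} = \frac{0.90}{0.99} \approx 0.91,
    \qquad
    \text{prestige per Y-researcher} = \frac{0.10}{0.01} = 10,
    \]

    so on these made-up figures the average Y-researcher captures roughly eleven times the prestige of the average X-researcher.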

    • http://overcomingbias.com RobinHanson

      I’m pretty confident, especially re academia. People at the top often counter-signal by venturing into new topics and proving that they can make people pay attention. For most others, such counter-signals will fail.

  • STINKY

    Maybe I’m just a cynic, but I highly doubt that the sort of person who would go out of their way to ask plausible, productively contrarian questions, or to pose plausible, productively contrarian ideas, would also go out of their way to seek plausible, productively contrarian answers to said questions or ideas (i.e., conspiracy theorists).

    • STINKY

      For example:

      9/11 being an act of terrorism is a mainstream, authoritarian position. 9/11 being an inside job is a contrarian position. With regard to the latter, wouldn’t the contrarian response be that… 9/11 wasn’t an inside job?

      I got very little sleep last night and my mind is soft, so maybe I’m not smelling what you’re cooking, but it’s very interesting nonetheless.

      • Peter David Jones

        Contrarianism is context-dependent. Yup.

  • brendan_r

    Complementary to the prestige-seeking explanation of topic bandwagoning/neglect is the idea that people just stink at weighting how important things are, even if no status is on the line.

    A favorite lesswrong example: Statistical Prediction Rules beat Expert Judgement in a variety of contexts, which Robyn Dawes explains ~like this:

    People are good at identifying relevant predictor variables. They get the sign of the causal relationship right. But they’re awful at integration; *weighting* the relative importance of various factors, especially if the info takes many forms.

    But maybe that’s explained by prestige seeking too; a criminal psychologist overweighting his personal impressions, relative to a criminal’s record, to prove the worth of his intuition and experience.
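    A minimal sketch of the kind of unit-weighted rule Dawes described, with invented data and variable names: the human supplies the relevant predictors and the sign of each relationship, and the rule standardizes and sums them with equal weights instead of trusting intuitive weighting.

```python
# Unit-weighted statistical prediction rule (SPR), in the spirit of Dawes:
# the expert picks the predictors and the sign of each relationship;
# the rule does the weighting. All data below are made up for illustration.
import statistics

def unit_weighted_spr(cases, signed_predictors):
    """Score each case as the sum of sign * z-score over the chosen predictors."""
    means = {p: statistics.mean(c[p] for c in cases) for p in signed_predictors}
    sds = {p: statistics.pstdev(c[p] for c in cases) or 1.0 for p in signed_predictors}
    return [
        sum(s * (c[p] - means[p]) / sds[p] for p, s in signed_predictors.items())
        for c in cases
    ]

# Hypothetical recidivism example: more prior offenses -> higher risk (+1),
# older age at first arrest -> lower risk (-1).
cases = [{"priors": 0, "age_first_arrest": 35},
         {"priors": 4, "age_first_arrest": 17}]
print(unit_weighted_spr(cases, {"priors": +1, "age_first_arrest": -1}))
```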

    • http://overcomingbias.com RobinHanson

      “People are biased” isn’t a reason to think your judgement is better than other people’s, since you are people too. Do you have a “statistical prediction rule” to decide which questions are important?

      • brendan_r

        An SPR, no. But if I were to try to build one, I’d try to formalize these ideas:

        Michael Burry on allocating his investment attention:

        “Ick investing means taking a special analytical interest in stocks that inspire a first reaction of ‘ick.’ I tend to become interested in stocks that by their very names or circumstances inspire unwillingness – and an ‘ick’ accompanied by a wrinkle of the nose – on the part of most investors to delve any further.”

        And Greg Cochran describing why the cure for C. diff – *stool transplants* – was so elusive:

        “Obviously, sheer disgust made it hard for doctors to embrace this treatment. There’s a lesson here: in the search for low-hanging fruit, reconsider approaches that are embarrassing, or offensive, or downright disgusting.

        Investigate methods that were abandoned because people hated them, rather than because of solid evidence showing that they didn’t work.

        Along those lines, no modern educational reformer utters a single syllable about corporal punishment: doesn’t that make you suspect it’s effective? I mean, why aren’t we caning kids anymore? The Egyptians said that a boy’s ears are in his back: if you do not beat him he will not listen. Maybe they knew a thing or three.

        Sometimes, we hate the idea’s authors: the more we hate them, the more likely we are to miss out on their correct insights. Even famous assholes had to be competent in some areas, or they wouldn’t have been able to cause serious trouble.”

  • IMASBA

    I wouldn’t say that being a contrarian is necessarily only about showing off what you call forager ideals. It might be a form of creativity/curiosity as well: accepting that the authorities are more often right than wrong, but still being on the prowl for angles the authorities missed, questions they forgot to ask, and occasionally a result they got wrong that wouldn’t be corrected if there were no contrarians.

    I do agree that the best contrarian is one that is mostly contrarian on questions and angles, less so on answers.

    I admire how Robin is being a contrarian by sketching alternative worlds and ways of looking at things. The contrarian in me always wants to push it further, so I like being a contrarian to contrarians as well: when Robin paints us an EM world where so many things we take for granted are different, I like to look for things that he hasn’t considered that might alter things even more, such as “why would EMs be capitalists?” or “why would EMs respect a right to unlimited procreation when that threatens their individual wealth?” Man, that sounded like I’m showing off; must be my forager ideals (I certainly have those) speaking.

  • http://juridicalcoherence.blogspot.com/ Stephen Diamond

    With this approach, the main contrarian answer that you must endorse is a claim about yourself: that you don’t care as much about the rewards that attract others to the usual topics.

    Seems you’d have a hard time making this claim, what with your taking a late degree for the purpose of achieving academic status.

  • oldoddjobs

    The initial distinction between conformists and contrarians is too woolly to be useful.

    • gjm

      It parses as “… that (knowing disagreement) is irrational”, not as “… that knowing (disagreement is irrational)”.

      It’s a reference to the Aumann agreement theorem and related results, which say roughly that, under certain circumstances, if two people are free to communicate with one another and genuinely seek to find the truth, they can’t rationally end up agreeing to disagree.

      Terse ordinary-language statements like “knowing disagreement is irrational” are arguably *not* in fact supported by these theorems, because they assume an impossible degree of “common knowledge” (A knows that B knows that A knows that B knows … that B knows that A knows B’s opinion, etc.) or because they assume perfect rationality and “work” by means of iterated reasoning of a kind that doesn’t stably work for even very-nearly-perfectly-rational agents.
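      For reference, a rough statement of the core result (Aumann 1976), where \( \mathcal{I}_1 \) and \( \mathcal{I}_2 \) are the two agents’ information partitions and \( A \) is the event in question:

      \[
      \text{common prior } P \ \text{ and common knowledge of } q_i = P(A \mid \mathcal{I}_i),\ i = 1, 2 \;\Longrightarrow\; q_1 = q_2 .
      \]

      Note this constrains disagreement over probabilities for a shared question; disagreeing over which questions matter, as the post recommends, can instead reflect differing values and attention.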

  • Cahokia

    So instead of asking “What do you believe that almost no one else believes?” you should be asking “What do you think is an important question that no one else is asking?”

    It would be nice to see a debate moderator ask presidential candidates this question.

  • http://CommonSenseAtheism.com lukeprog

    Cross-posted from my Facebook wall…

    I think it’s legitimate to worry that MIRI-FHI folk have some contrarian views about long-term AI outcomes.

    On the other hand, talking to experts in AI and forecasting has confirmed to me that approximately none of them have developed their own modular analyses (re: long-term AI outcomes) and iterated them through many rounds of criticism and fact-checking, like MIRI-FHI folk have.

    So do we really have contrarian views, or do we just have unusually well-developed views on issues about which everyone else is just “making stuff up” on the spot, with no modular analysis or fact-checking?

    In other words, do we have contrarian views, or did we simply ask contrarian *questions*?
    [link to Robin’s post above]

    • http://overcomingbias.com RobinHanson

      It sure seems to me that on the question “Will the first human-level AI foom in a weekend to take over the world?”, many people do have answers to that question, and MIRI’s answer is contrarian. Whether you can reasonably disagree with others on a contrarian answer if you have studied your answer in more detail is a different question than I’m addressing in this post.

      • http://CommonSenseAtheism.com lukeprog

        When you talk to people, do they not have their own opinions about the social impact of emulations?

      • http://overcomingbias.com RobinHanson

        Not really. The few who disagree strongly seem to place their disagreement with the field of economics as a whole. They accept that I’m mostly applying standard econ straightforwardly. I don’t think believing standard econ makes me much of a contrarian.

    • Peter David Jones

      One doesn’t have to have one’s own projection of AI outcomes to note that MIRI’s favourite doom scenario is highly conjunctive.

  • http://juridicalcoherence.blogspot.com/ Stephen Diamond

    Your argument seems to depend on this assumption: academia is more biased in topic/issue selection than in methods/means selection.

    What reason is there to think so? If academics are biased toward ways of showing off arduously acquired expertise, why expect this bias to affect issue selection more than methods selection? Why not expect that there are ignored simple solutions to the problems that have been useful for showcasing academic complexity?

    It’s easier to convince people that your problem is important (too) than it is to convince them that your solution is at least as good. This seems to be misleading you: the explanation is that there are fewer metrics for rejecting a topic as unimportant than for rejecting an explanation as inferior.

    On a related subject: your contrarian issues versus contrarian conclusions translates (more precisely, I think) as far-mode issues versus far-mode methods. These are two contrarian (or “mismatched”) ideological styles: Monomaniacalism and Demagogism. (See “A Taxonomy of political ideologies based on construal-level theory” http://tinyurl.com/6pt9eq5 )

    • http://overcomingbias.com RobinHanson

      I don’t see why I need to believe that method selection is more biased. I talked about questions being neglected because of an emphasis on impressive methods. That is talking about a method bias.

      • http://juridicalcoherence.blogspot.com/ Stephen Diamond

        Let me rephrase. We posit that academia is biased toward impressive methods. But this bias could be expressed in two ways: 1) by choosing topics suited to using high-status methods and 2) by applying high-status methods to problems that could be solved better by low-status methods.

        To claim an inherent advantage for contrarian issues over contrarian methods (or contrarian solutions, which comes to the same), you must think expression by #1 dominates over expression by #2. Otherwise, contrarians can hope to avoid the conclusion’s force about contrarians usually being wrong by capitalizing on the bias of academics for using (sometimes unsuitably) sophisticated methods (as opposed to choosing problems for which sophisticated methods are suited).

  • Handle

    I am working on a contrarianism project as well. It’s a fascinating topic.

    I think the disgust/taboo direction is most useful. There is remaining low-hanging truth fruit to be picked in topics that scare away most researchers. And there is likely serious policy folly if we are implementing concepts from disciplines with large, fenced-off areas. We will not be able to fail gracefully and assimilate corrective information because it always threatens to cross over into no-man’s-land.

    Disgust, taboo, moral outrage, and other emotional reactions to certain questions exist for reasons (e.g. evolutionary social coordination) that have little to do with the value of the subject under discussion. They will prevent us from making progress in these domains.

    Now, they are not always wrong, which makes this very confusing. A 9/11 Truther, Obama Birther, Holocaust Denier, or other conspiracy theorist usually inspires visceral reactions that are justified. However, one has to embrace the fact that one will occasionally feel the same way about contrarian things that are true.

    A good place to look for these will be social problems where we seem to have hit a very bad dead-end, but it seems like we should be able to make more progress – perhaps because there is some evidence things were better in the past. If there is a taboo-minefield somewhere in the vicinity of that policy arena, you’ve probably hit upon a useful-but-taboo contrarian question area.

  • Ronfar

    How to piss people off as an academic:

    http://en.wikipedia.org/wiki/Rind_et_al._controversy

  • alexander stanislaw

    I disagree with this.

    Sorry I had to: on a more serious note, I get the impression that in biology, researchers will investigate whatever they want, but when it comes time to apply for grants, they’ll try to link their research to cancer or some other big disease (no matter how insignificant the link is). In other words, they are contrarian about asking questions, but they try to appear conformist to grant reviewers so they can get cash. They probably are still conformist to some extent, but they are not as conformist as they appear.

  • Bruno Coelho

    The em scenario has standard methods but unusual assumptions, especially about neuro tech as a game changer. The lowering of wages seems acceptable, but the obsolescence of humans is not a very palatable theme for a book. I understand why Dvorsky calls that a dystopian future.

  • Pingback: Strategies to guard against contrarian bias | What is behavioral?

  • http://juridicalcoherence.blogspot.com/ Stephen Diamond

    “This lets them signal a willingness to confirm…”

    I think you mean conform.

    • http://overcomingbias.com RobinHanson

      I did; thanks; fixed.

  • LeaveMeOutOfYourInsaneRambling

    “That is, it is less accurate to knowingly disagree with others unless one has good reasons to think you are more rational than they in the sense of listening more to the info implicit in their opinions.”

    Except that a lot of people seem to think that their contrarian nature proves that they are, in fact, more rational than the rest of the “sheeple”. Witness 9/11 “truthers” who are smugly convinced that they’ve uncovered a conspiracy of mind-boggling proportions. The more evidence that’s presented against them, the bigger the conspiracy must be – and uncovering a conspiracy that big could only be done by the cleverest of people!

  • Pingback: Overcoming Bias : Socializers Clump

  • Pingback: Contrarian on Questions | Daily Nous

  • Jake Stevens

    Very interesting. One flaw could be that the ‘benefit’ people get from contrarianism might be connected to the outraged responses they get when they share their contrarian answers. With questions you don’t get these same outraged responses, because instead of contradicting people’s closely held beliefs you’re just talking about a subject they hadn’t thought about much. You’re more likely to get people ignoring your post (as seems to be Robin’s complaint about the EM article) rather than flocking to it to disagree.

  • Pingback: Overcoming Bias : Dust In The Wind

  • bink

    prolix, prolix, nothing a pair of scissors can’t fix!
    Nick Cave

  • http://juridicalcoherence.blogspot.com/ Stephen Diamond

    Another suspect assumption behind your advice to prefer contrarian questions: we should prefer to be correct.

    Allow me a prediction-market analogy. Whether I speculate on one or another prediction depends on the comparison between my personal odds and those assigned by the prediction market. In other words, if I rationally believe the probability of having EMs is .1 but the market says it’s .01, I should definitely invest in pro-EM predictions, despite my thinking they’re unlikely.
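    A minimal worked version of that bet, assuming a binary contract paying \$1 if EMs arrive and currently priced at the market’s \$0.01:

    \[
    \mathbb{E}[\text{profit per share}] = 0.1 \times (\$1.00 - \$0.01) + 0.9 \times (-\$0.01) = \$0.09 > 0,
    \]

    so buying “yes” has positive expected value for me even though I still think EMs are unlikely.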

    The same logic should apply to contrarian views. If I have a well-reasoned opinion that causes me to predict at odds with the consensus, I should promote that opinion, even if I think it’s unlikely. I shouldn’t be trying to be correct; I should aim for the greatest possible marginal correctness (so to speak).

    Where a marginal analysis might founder when applied to intellectual pursuits is in psychology rather than logic: it is hard to promote a view you believe is wrong. (Some will consider it intellectually dishonest; I think intellectual honesty lies in honest argument rather than sincere belief.) (To bridge the gap between value at the margin and truth, I’ve offered the distinction between opinion and belief. [See “The distinct functions of belief and opinion”–http://tinyurl.com/4r9k5g3 ])

  • http://whyarethingsthisway.com/ Nat Philosopher

    I believe a lot of contrarian positions, but I would not say I am in the least biased toward them. I believe them based on the evidence.
    But when you find your first contrarian position to be proved by examination of the scientific literature, say, and especially if it is a very surprising contrarian position in the sense that all right-thinking people think you are nuts, then that actually gives you strong evidence that your prior bias against contrarian positions was misplaced. And when you see your second contrarian position very surprisingly yet rationally proved, well then if you are rational you look for an explanation of what exactly you had been missing about the world. And this may well lead you to a rational theory that in fact combines very many more contrarian positions into a concise, and thus Occam-friendly, explanation.

    When I realized that the “climate scientists” were delusional about the climate literature and the Pediatricians were delusional about the vaccine literature, I realized that had a larger import. Our normal expectation that these collections of individuals have determined their beliefs and practices by a logical, scientific process is empirically proven wrong.

    Instead the observed facts are explained much better by the model espoused by Gustave Le Bon in his 1895 book The Crowd, the first work on group psychology, and arguably the most insightful. Although largely forgotten today, this work has had extraordinary influence. By their own accounts it was on Theodore Roosevelt’s bedside table, and dog-eared by Mussolini. Lenin and Stalin took from it, and “Hitler’s indebtedness to Le Bon bordered on plagiarism” in the words of historian and Hitler-biographer Robert G. L. Waite. Sigmund Freud wrote a book discussing Le Bon, which we will quote from below, and Edward Bernays, the father of modern public relations, acknowledged his deep debt, as Goebbels did of Bernays’ reflected insights. So this wouldn’t be the first predictive power displayed by Le Bon’s model: every one of the above luminaries was very happy with their practical applications of Le Bon.

    http://whyarethingsthisway.com/2014/03/22/why-are-the-pediatricians-so-confused-about-the-actual-state-of-the-scientific-literature/