Best To Mix Odd, Ordinary

“The best predictor of belief in a conspiracy theory is belief in other conspiracy theories.” … Psychologists say that’s because a conspiracy theory isn’t so much a response to a single event as it is an expression of an overarching worldview. (more; HT Tyler)

Some people just like to be odd. I’ve noticed that those who tend to accept unusual conclusions in one area tend to accept unusual conclusions in other areas too. They also tend to choose odd topics on which to have opinions, and to base their odd conclusions on odd methods, assumptions, and sources. So opinions on odd topics tend to be unusually diverse, and tend to be defended with an unusually wide range of methods and assumptions.

If these correlations are mainly due to differing personalities, then they are mostly mistakes, for the purpose of estimating truth. Thus relative to the typical pattern of opinion, you should guess that the truth varies less on unusual topics, and more on usual topics. You should guess that odd methods, sources, and assumptions are neglected on ordinary topics, but overused on odd topics. And you should guess that while odd conclusions are neglected on ordinary topics, on odd topics it is ordinary conclusions that are neglected.

For example, the way to establish a new method or source is to show that it usually gives the same conclusions as old methods and sources. Once it is established, one can take it seriously in the rare cases where it gives different conclusions.

A related point is that if you create a project or organization to pursue a risky unusual goal, as in a startup firm, you should try to be ordinary on most of your project design dimensions. By being conservative on all those other dimensions, you give your risky idea its best possible chance of success.

My recent work has been on a very unusual topic: the social implications of brain emulations. To avoid the above mentioned biases, I thus try to make ordinary assumptions, and to use ordinary methods and sources.

  • Eric Falkenstein

    I don’t believe in conspiracies much because I find conscious, collective action very difficult as the size of the group increases. Yet I do think there are a lot of cases where the real reason for some policy’s popularity has little to do with the explicit, noble reason, but rather the more petty ulterior motives (Bootleggers and Baptists). Are those latter situations a conspiracy?

    • dmytryl

      Yet, as a matter of historical fact, there have been a sizeable number of ridiculous conspiracies involving quite large numbers of people, which did not surface for decades or longer, e.g. MKUltra, or the US human irradiation experiments. On the other side, there were a fairly large number of communist spies everywhere, including in the upper echelons of the US government. (That wealth of intelligence may well have been the reason World War 3 never happened, though.)

      • IMASBA

        The number of people who worked on MKUltra, and the cost of the program, were relatively small, although it eventually was revealed in the 1970s. Also, the KGB probably knew about the program but didn’t reveal it because they had a similar program of their own. (This is why a 9/11 truther conspiracy is so unlikely: the Chinese would have nothing to lose and much to gain from revealing such a conspiracy to the world.) That there were Soviet spies in the US was no secret; everyone knew this. They just didn’t know the identities of the agents, just as the Soviets didn’t know the identities of Western agents in the USSR, but knew that they were there.

      • Most of the time, in order to get a large number of people to be part of a conspiracy, you need to have a case that it’s being done for a good reason. If you’re trying to keep a secret, each person that learns the secret is a potential leak, and, to a first approximation, each person you might recruit to participate in a large conspiracy has the same chance of betraying you as a member of the general population. So you can indeed have a conspiracy that involves hundreds or even thousands of people, but only if it’s a “good” conspiracy that most people would support.

        For example, the biggest “conspiracy” ever concocted was the Manhattan Project – and not one person ever leaked anything about it to the Axis powers. Why? Because almost no Americans would have been willing to perform that kind of betrayal. On the other hand, the details of the Manhattan Project were indeed leaked to the Soviet Union. After all, the Soviet Union was our ally in World War II. Why shouldn’t they be involved in the war’s most important R&D project? If you recruited an American at random, the chances of that person being a Nazi sympathizer were almost zero, but the chances of that person being a Soviet sympathizer were a lot higher.

        This is where a lot of “ridiculous” conspiracy theories, such as the 9/11 Truthers, become impossible to believe. If you don’t have a population that you can safely recruit co-conspirators from, you can’t have a conspiracy that involves more than a handful of people, and many conspiracy theories propose conspiracies that are huge violations of this rule.

  • Ironic that this was posted on the same day as

    • IMASBA

      Even a conservative watch is right twice a day…

    • How is what I said in conflict with that article?

      • Pointing out that there are many true conspiracy theories casts doubt on any meta-level discussion of conspiracy theorizing and inferences therefrom, is all.

      • Alexei Sadeski

        Actually, it does not.

      • Of course it does.

      • Stephen Diamond

        No, quite the opposite is the case. If conspiracy theories were all false, then non-conspirativists would be expressing their personalities through their beliefs less than conspirativists express theirs, since they would also be expressing their epistemic superiority. The absence of true conspiracies would vitiate the parallel that Hanson is drawing between the two stances, each more or less equally sensitive to truth.

  • Stephen Diamond

    My recent work has been on a very unusual topic: the social implications of brain emulations. To avoid the above mentioned biases, I thus try to make ordinary assumptions, and to use ordinary methods and sources.

    I contend this is the wrong way to deal with bias, and it produces another bias: overcompensation. (See “The deeper solution to the mystery of moralism—Morality and free will are hazardous to your mental health” — .)

  • dmytryl

    Ordinary or not, your predictions have only an incredibly low probability of being correct, due to their reliance on a multitude of mostly unstated assumptions. (With ordinary assumptions this reliance is perhaps more explicit, and so those who want to be persuasive pick extraordinary assumptions, which, even though even less likely to hold, look like more effort and less assuming went into them.)

    Predictions are almost always very unlikely to be true, even without any evidence toward a contrary prediction, for the simple reason that they are wild guesses. If you had extreme inference ability, there are many areas (e.g. history, where you can infer backwards and then find confirmation) in which you could get recognition for it, and far-future societies are not among them.

    You’re not using that inference ability on such topics, because you want to win some guaranteed status, and because somewhere inside, something in your head counts all those assumptions and knows that your probability of being correct is an epsilon. (It may be a larger epsilon than that of other guessers, true, but it is still an epsilon.)

    On the other hand, the non-insane speculations are of course socially useful right now, even if incredibly unlikely to be correct, simply to provide some more samples and dilute the influence of people who privilege ideas for fun and profit.

  • Philo

    “Some people just like to be odd.” And some people just *are* odd, whether or not they like to be so.

  • Excellent advice for literary persuasiveness! IIRC, studies show that partial anthropomorphism is most memorable and persuasive – it’s why Egyptian gods have the heads of animals but the bodies of humans, and so on.

    • Stephen Diamond

      Excellent advice for literary persuasiveness!

      Definitely, since you’re taking the low road.

    • So that’s why catgirls are so adorable! ^_^
