Reject Random Beliefs

A recent New Scientist article mentions a 2005 American Political Science Review paper on the genetic basis of political beliefs, which includes this key table, breaking the variation in opinions (among 30,000 Virginia twins) on 28 specific topics into three origin components: genetic (heritability), family (shared environment), and other (unshared environment):

[Table: for each of the 28 topics, the share of opinion variation attributed to genetic, shared-environment, and unshared-environment components.]

The paper shows similar results for Australian twins. 

This is a concrete occasion to revisit a general issue.  In general, if you want to believe the truth, then you should just accept the average belief on any topic unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.

So unless you have a good reason to think your genes tend to produce more informed beliefs than other genes, you should reject the genetically-caused parts of how your beliefs differ from average beliefs.  Even if you have a higher genetically-given IQ, and even if high IQ folks have more accurate beliefs, you should still reject genetic ways in which your beliefs differ from the average beliefs of high IQ folks.  After all, true beliefs are supposed to be about the world, not about the particular genes you were randomly given. 

Unless you have a good reason to think your childhood family environment was more informed than an average family environment (ignoring any genetic advantages in your family), you should reject the ways in which your beliefs differ from average beliefs due to your family background.  Similarly, you should also reject other non-genetic, non-family causes of your belief differences, if you do not have a better than average reason to think your causes are better than average.

Having an intuitive feeling that your belief causes are better is not a “good reason” if most everyone has a similar intuitive feeling.  The fact that you have specific reasons for your specific beliefs is also not good enough – most everyone has specific reasons.

A good reason must be based on some feature of you that is different from others, where you have a good reason to think this feature is correlated with being right.  The mere fact that you have a distinguishing feature, and would like to think well of yourself, is not by itself a reason to think that feature is correlated with being right! And you must be wary of the common bias to lower your evidential standards when concluding that people like you tend to be more right.

Added: Let’s say that on a scale of 0 to 100, your position on property taxes is 90 – you think such taxes are good for some standard, widely accepted mix of values, such as economic growth, inequality aversion, or good neighbors.  When you disagree with someone at the other extreme, with a position of 10, you understand it to be a disagreement about facts, not values.

Let’s assume the average position on this issue is 50, and that you can see no good reason to think the genes that lean you toward property taxes are better than average.  Since the table says that 41% of belief variation on this topic is genetic, to eliminate this genetic component of your beliefs you might reduce your position from 90 to 81, since this takes away roughly 40% of the variance of your belief relative to the average.
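
For concreteness, here is a minimal sketch of that arithmetic in Python (the function name and framing are mine, not the paper’s): removing a given share of your squared deviation from the mean amounts to shrinking the deviation by the square root of the remaining share.

```python
import math

def remove_variance_share(position, mean, share):
    """Shrink `position` toward `mean` so that `share` of its squared
    deviation from the mean is removed; the deviation itself shrinks
    by a factor of sqrt(1 - share)."""
    return mean + (position - mean) * math.sqrt(1 - share)

# The example above: position 90, average 50, ~41% of variation genetic.
print(remove_variance_share(90, 50, 0.41))  # ~80.7, i.e. roughly 81
```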

  • alex

    Sorry, this post just didn’t click with me at all. Most OB posts at least make me go “oh, that makes sense”, but I just can’t figure out what you’re trying to say here. Can you rephrase it?

  • Manon de Gaillande

    I don’t understand how to read that table.

  • josh

    alex,

    he’s saying that the way your views differ from other people’s is at least partially due to biases uncorrelated with truth (genes, environment, etc.). You should become aware of this fact and adjust your beliefs accordingly.

  • http://hanson.gmu.edu Robin Hanson

    alex, josh gets it.

  • http://www.givepeaceafrance.com Yelsgib

    Josh’s explanation didn’t help me. Can you give me an example – like, an actual solid example – of how you could “adjust your beliefs accordingly?”

  • http://hanson.gmu.edu Robin Hanson

    See my Added section above.

  • http://www.givepeaceafrance.com Yelsgib

    I suppose that probably I am unfamiliar with the intellectual underpinnings of such a tremendously reductionist/numerical approach to the question of “belief”? What does it mean to “reduce your belief from 90 to 81”? Do you think that such a thing can be done? What about beliefs which are not common enough to have an “average” (e.g. the effect of multitasking on happiness)? What culture are we looking at? Are we controlling for differences in IQ? How easy is it to justify the belief that “there is no good reason to think my genes are better”? What does it mean for genes to be better?

    Is this sort of analysis part of some broader intellectual system that I am unfamiliar with? Can you point me to some sources that state its principles/its relative merits?

    I hope this doesn’t come across as confrontational – I just really, really don’t understand what you’re trying to say.

  • Ben Jones

    Hell, why even bother having an opinion in the first place? Just ask everyone what they think, average it out and bingo. Let’s just elect a government that polls opinion on everything, ‘averages it out’ and makes it policy.

    Did the researchers take into account the fact that the ‘average belief’ on a subject has no bearing on anything, and probably has very few advocates in and of itself? Having scales purporting to measure things like ‘liberality’ or ‘astrology’ is a nonsense, and ignores the million subtleties that account for real, live people’s beliefs and opinions. Let me guess – something along the lines of ‘Rate your belief in x on a scale of 0 to 10.’ How enlightening.

    And anyway, even if there were such a thing as an ‘average belief’, why would it have any authority whatsoever? People believe all sorts of stupid things. Why should I think that my genes are leading me to be incorrect just because my beliefs are nonstandard? I can think of plenty of stupid, widespread beliefs. Ghosts? Super Happy Agent? American Idol???

    People can be genetically predisposed to intelligence, but intelligence follows a Gaussian distribution. Most people are middling, but that doesn’t make me want to not be intelligent. Many people may believe something, but that’s only a very limited amount of evidence that it’s right. Cf. pretty much every major scientific discovery ever.

    Some ‘scientific findings’ make me angry.

  • josh

    Guys,

    “Even if you have a higher genetically-given IQ, and even if high IQ folks have more accurate beliefs, you should still reject genetic ways in which your beliefs differ from the average beliefs of high IQ folks.”

    We are adjusting for genetic and environmental factors that DO correlate with having true beliefs.

  • http://profile.typekey.com/Unholysmoke/ Ben Jones

    Having reread the original post, I think my response may have been a little strong. Obviously, you should attempt to exclude all non-evidential reasons for your beliefs. However, the study appears to treat the ‘average belief’ as some sort of impartial control group from which we can readjust ourselves. What about all the stupid people with stupid beliefs, who have genes different from myself? There appear to be a lot more of them than of people like me.

    There is no doubt a correlation between your genes and your beliefs. But this isn’t a good way to go about eliminating it from one’s own thinking. Adjusting your beliefs to come into line with the ‘average’ is just stupid.

  • http://www.scottaaronson.com Scott Aaronson

    Robin, to compensate for hereditary and other biases, should Americans (for example) be adjusting their political views toward the average among Americans, or the average worldwide? The latter seems more defensible, and yet I don’t think the consequence would be Democrats becoming more Republican and vice versa. Rather, it would be almost all Americans shifting radically to the left on certain issues, and radically to the right on others.

    Or should we restrict our reference class to people who have already accepted certain axioms (women’s rights, freedom of the press…) that we now see as self-evident, even if much of the world doesn’t?

    Actually, I think I know the answer to that question: to find out which reference class you should adjust your beliefs toward the average of, poll everyone in your reference class and take the average of their beliefs. :-) (Under what assumptions does this process converge?)
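
    A toy sketch of that parenthetical question, under DeGroot-style assumptions that are mine, not Scott’s: if everyone repeatedly replaces their belief with a fixed, positively weighted average of everyone’s beliefs, the process converges to a consensus.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    beliefs = rng.uniform(0, 100, 10)   # ten hypothetical starting opinions
    W = rng.random((10, 10))            # positive "listening" weights
    W /= W.sum(axis=1, keepdims=True)   # each row sums to 1 (row-stochastic)

    for _ in range(50):
        beliefs = W @ beliefs           # everyone averages over everyone

    print(beliefs.round(2))             # entries are (nearly) equal: consensus
    ```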

  • http://michaelkenny.blogspot.com Mike Kenny

    Robin writes: “In general, if you want to believe the truth, then you should just accept the average belief on any topic unless you have a good (and better than average) reason to think the causes of your belief difference would be substantially more informed than average.”

    It seems like a culture of comformism would be most rational then. I wonder if cultures with more comformism tend to be composed of more rational individuals generally. I think of American culture as having a non-comformist streak that would lead to an odd thought pattern: “I should believe the average, but the average belief is I shouldn’t believe what is average.”

    Also, I wonder if going against our genetic beliefs is a bit like trying to walk on all fours. How might one deal with the tension between believing what one knows is likely true, and what one continually returns to despite realizing it is unlikely to be true?

  • http://michaelkenny.blogspot.com Mike Kenny

    Excuse me, “comformism” should have been “conformism”.

  • http://hanson.gmu.edu Robin Hanson

    Scott, you want to compare your opinion to the opinions of as many other creatures as you can foresee. But you’d discount their opinion if you had good reasons to think them less informed because they were further from the context of the topic in space, time, or culture. Current Virginians may know more about the wisdom of property taxes in Virginia now.

  • http://www.churchofrationality.blogspot.com LemmusLemmus

    A technical question: Aren’t the “unshared environment” estimates inflated by measurement error?

  • Al T

    “just accept the average belief on any topic” – and what if the average belief is that you should reject this theory?

  • http://profile.typekey.com/halfinney/ Hal Finney

    Most people would become more accurate by shifting their opinions towards the consensus average. But this probably would cause social harm, in that we all benefit from having a wide variation of opinion and debate in society.

    It may be one reason why society encourages individualism and thinking for yourself. It is harmful to individuals but beneficial for society.

    This is a classic public goods problem. Are you going to do what is good for yourself, suppress your individualism and accept the group consensus? Or are you going to accept the social propaganda to think for yourself, even though you will be less accurate?

  • http://hanson.gmu.edu Robin Hanson

    Al, few people are aware of any concrete analysis for or against accepting the average belief, so if you are so aware you have a good reason to think yourself better informed than average. Nevertheless, you should be somewhat influenced by the fact that many others think it unwise to take the average belief.

    Hal, it is far from obvious to me that “we all benefit from having a wide variation of opinion.” Debate and variety in action do not require variety of opinion. Do feel free to make a post arguing for your claim. :)

  • KapKool

    I found a freely available version of the paper, for anyone who is interested:

    http://www.apsanet.org/imgtest/GeneticsAPSR0505.pdf

  • http://profile.typekey.com/bayesian/ Peter McCluskey

    Robin, why is awareness of the genetic influence helpful when trying to eliminate bias? It has the obvious disadvantage of triggering desires to signal group identity. Wouldn’t a broader measure, such as the components of beliefs that are due to everything other than unshared environment, give a more useful estimate of how we should adjust?

    Mike, the approach that Robin is advocating would produce results that look in many ways like conformism. But it should not cause one feature that is implied by the word conformism: confidence that people who reject the average belief are mistaken. If you’ve gotten a belief by adopting the average belief on a topic, that implies that you haven’t found strong evidence on that topic. Your attitude toward people who hold confident beliefs that conflict with the average should be that they have a nontrivial chance of being right.

  • http://hanson.gmu.edu Robin Hanson

    Peter, I think the argument seems stronger when we can clearly see how random, and independently random, are the causes of belief. We know enough to see clearly how very randomly distributed are our genes.

  • http://profile.typekey.com/halfinney/ Hal Finney

    Robin, it seems clear that society encourages independent thought, often under the rubric of “critical thinking”. A little googling and library searches find dozens of books and hundreds of articles on the topic. It is possible though that this is a relatively modern innovation, and that past generations were encouraged to conform. That would argue against my public-goods reasoning. OTOH maybe in older, more agrarian societies there would have been less social benefit to an independently-minded citizenry.

    I believe Scott Page’s book The Difference and perhaps James Surowiecki’s The Wisdom of Crowds offer examples and case studies where a diversity of opinion improved decision making. But it would take me some time to hunt through them for citations.

    Assuming there is such a benefit, I think it would be hard to achieve diversity in action without diversity in opinion, just because of human nature. So the simplest course for social reformers would be to encourage independent thought, for the benefit of the larger society. Indeed we often hear comments to the effect that a well-informed and well-educated electorate is a public benefit.

  • http://profile.typekey.com/halfinney/ Hal Finney

    While I was browsing through some of the critical-thinking literature I discovered an amusing inverse to the paradox that Al T raises above: how can one accept the average opinion if the average opinion is to think for yourself?

    There is apparently a substantial subgroup among critical-thinking educators that warns against too much conformity in critical-thinking curricula. Students are being taught to think critically and apply logic and awareness of bias to what they read and hear. But this very uniformity of thought and approach is thought by some to create exactly the kind of conformity which the movement aimed to replace.

    So you can’t believe in following the consensus if everyone else thinks they should think for themselves, but you also can’t think differently if everyone else thinks they should think differently. In short a society in which everyone conforms to the consensus that they should think for themselves is already one founded on paradox.

  • http://hanson.gmu.edu Robin Hanson

    Hal, good point about conformity to critical thinking. On whether we can have diversity in action without diversity of opinion, I think the issue needs more analysis than just “just because of human nature.”

  • Will Pearson

    The crucial problem is knowing when you have more information about the subject than the society. If you have more information, or think that other people are ignoring information, then you should think differently.

  • Caledonian

    What is valuable is to have individuals whose positions are statistically independent of others’, not necessarily positions that end up differing.

    If you start to hold positions because people around you hold them, you fall into groupthink, and a few minor fluctuations in belief can come to dominate the memesphere very quickly. Importantly, the fact that a belief is generally held no longer constitutes evidence that it is valid or that the balance of the evidence supports it. You’re dramatically increasing your susceptibility to bias by making your positions dependent on others’.

    We must also consider that most people are irrational, hysterical primates that would rather die than think. Valuing the opinions of such entities is unwise.

  • http://entitledtoanopinion.wordpress.com TGGP

    It’s kind of ironic to find majoritarianism promoted on a blog whose contributors have such “eccentric” opinions, even relative to an educated and intelligent group like academics. I suppose it might be the case that you’d be even more eccentric if you weren’t adjusting your beliefs to be closer to the average.

  • http://michaelkenny.blogspot.com Mike Kenny

    Hal writes: “Your attitude toward people who hold confident beliefs that conflict with the average should be that they have a nontrivial chance of being right.”

    This makes sense, but what I grapple with is how much an individual confident person should affect my belief, which is the average and reflects the experience of perhaps billions of people, and the opinions of people over time too. I’m not sure how much weight an average confident person’s opinion should carry, on an average subject, in changing an opinion based on the average belief of people.

    Another difficulty is that confident people as a class aren’t representative of people in general, and might have different values than people in general. Are confident people’s values shaped by their extra knowledge, or by whatever else causes their confidence? Or do confident people simply have different values on average, so that the values they use in forming opinions differ from the average person’s? In other words, the average person, if better informed, would still prefer A over B, whereas the average confident person would prefer B over A.

  • http://michaelkenny.blogspot.com Mike Kenny

    Rather, Peter said what I quoted, not Hal.

  • http://profile.typekey.com/jhertzli/ Joseph Hertzlinger

    The classic counterexample to “the majority is reliable” is the speculative bubble. It can be financially dangerous to move your opinion on the price of an investment towards majority opinion during a speculative bubble.

    On the other hand, if you reject anything that looks like a speculative bubble out of hand, you might miss the next exponential mode.

    These odd opinions fit together…

  • Grant

    Shouldn’t we be encouraging people not to hold beliefs on topics that aren’t within their area of expertise? We don’t poll people on how to design airplanes or computer software, so I’m not sure why sociological topics (in regards to decision-making) should be any different.

    I understand what Robin is saying about rejecting irrational reasons for holding different beliefs, but doesn’t that mask a bigger problem? People are only able to hold irrational beliefs when it’s not costly for them to do so. If we are trying to teach people to hold more correct beliefs, why not try to teach them not to hold beliefs they have no stake in? If people actually conformed to this rule of thumb, then it seems to me that the “average belief” would be more accurate, because it would be related to the opinion of experts.

    It seems to me that holding no opinion is often a better alternative than accepting or rejecting a majority’s opinion.

  • Grant

    Just another thought: I’d be curious to see how much genes and environments affect people’s beliefs on topics they have stakes in (such as investments). I’d imagine there would be some adverse selection which could occur in stakes voluntarily chosen (e.g., someone suckered in by get-rich-quick schemes would necessarily be overly optimistic), but I can’t think of any measurable beliefs people have large, involuntary stakes in.

    I’d imagine the incidence of random beliefs would be greatly reduced, but by how much?

  • George S.

    Still, I don’t get how you got from 90 to 81…

    Facts:
    ———-
    a) Let’s say that on a scale of 0 to 100, your position on property taxes is 90

    b) The average position on this issue is 50

    c) the table says that 41% of belief variation on this topic is genetic

    So…
    ———–
    then to eliminate this genetic component of your beliefs, you might reduce your position from 90 to 81, since this takes away 40% of the variance of your belief relative to the average.

    ????

    How is this happening? What are the actual equations?

    Let’s say I have 90… while the average is 50… how come this 41% from “variance of your belief relative to the average” gives a 9, so that I get 81 as a result?

    Can you please elaborate on the actual equations that give this value ?

    Thanks in advance.

  • alex

    This is starting to make more sense now. But how can the mathematical method you added in the original post be useful? Surely if you followed this it would disallow positions of (at least) 0 and 100? Personally I think that’s a rather good thing, but is it mathematically sound?

  • brian

    First. There’s a quote from James Fenimore Cooper: “the tendency of democracies is, in all things, to mediocrity”. Something I’ve come to believe as well.

    Second. Define ‘Truth’. I dare you.

    Third. The question we should be asking is who is providing information to the ‘majority’ of the cows/sheep with whom we should be aligning ourselves? How did they come by their information/beliefs? I’m way out of my area of expertise, but it seems to me like most are indoctrinated by their government (including public schooling), their church/religious organization and the media. So maybe the real question is; To what extent has that group actually *thought* about what they believe and why?

  • Z. M. Davis

    George, I think variance in this context is the statistical term referring to the mean of the squares of the differences between the data you’re investigating and the mean of the data.

    So, if I am not mistaken, we calculate the difference between your opinion and the average opinion, square it, subtract the appropriate percentage from this corresponding to the heritability, then take the square root, then add that to the average opinion. In this case, we have:

    90 – 50 = 40
    40^2 = 1600
    1600 – (.4)(1600) = 960
    sqrt(960) ≈ 30.98
    50 + 30.98 = 80.98 ≈ 81, QED.

  • http://dl4.jottit.com/contact Richard Hollerith

    I agree that I do not perceive reality well enough to reach my goals, and that the referenced finding that opinions are heritable is evidence of that. I also agree that having an intuitive feeling that your belief causes are better is not a “good reason” to believe my beliefs are truer than anyone else’s. Consequently, since I want to perceive reality as well as possible, I have to start doing something different.

    But then, Robin, you go on to make a specific recommendation, namely, just accept the average belief on any topic unless certain conditions apply. Do you not agree that nothing you have said in the OP is evidence for the specific recommendation except for the very weak link that the specific recommendation is doing something? Do you not agree that the following is an invalid argument: there is no reason to believe that your beliefs are better than average, therefore you should adjust your beliefs towards the average?

    Naturally I have been exposed to thousands of voices who advocate the specific recommendation or beliefs with a high conditional dependency on it, such as the belief that the more people involved in a decision, the better the decision will turn out. But how would you persuade someone to go with those thousands of voices who does not already believe that counting heads is a good method of getting closer to the truth when the alternative of thinking long and hard about the question is available?

  • http://hanson.gmu.edu Robin Hanson

    Richard, there is math that supports the claim that taking the average is more accurate than adding a deviation which is not better informed than the average deviation.
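
    A minimal simulation sketch of that math (the setup and numbers here are illustrative assumptions, not from the post): if your deviation from the average belief is noise uncorrelated with the truth, dropping it in favor of the average reduces expected squared error.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    truth = 50.0                      # assumed true position, for illustration
    n_people, n_trials = 1000, 2000

    mse_own, mse_avg = 0.0, 0.0
    for _ in range(n_trials):
        # Each belief = truth + idiosyncratic noise uninformative about the truth.
        beliefs = truth + rng.normal(0.0, 20.0, n_people)
        avg = beliefs.mean()
        mse_own += np.mean((beliefs - truth) ** 2)  # error of keeping your own deviation
        mse_avg += (avg - truth) ** 2               # error of adopting the average

    print(mse_own / n_trials)  # ~400, the noise variance (20**2)
    print(mse_avg / n_trials)  # ~0.4, roughly variance / n_people
    ```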

    George and alex, read ZM.

    Grant, sometimes you can’t avoid having an opinion if you must take related action, but otherwise sure, avoid uninformed opinions.

  • http://dl4.jottit.com/contact Richard Hollerith

    If Bob and Charlie share the same causal model of some aspect of reality, except that the model contains parameters (integers or real numbers) and Bob and Charlie differ in their estimates of those parameters, then yes, I am willing to believe that the average of their estimates is more accurate than either of their estimates.

    Since there exists a mathematical theory of causal models (in Bayesian networks and structural equation models), it is possible that you are referring to mathematics which describes a way to average causal models having what one might call “qualitative” differences (different numbers of equations in the two models, different numbers of terms in two equations, different numbers or identities of factors in two terms) rather than merely “quantitative” differences (different integral or real coefficients in the equations). If so, I would love to learn more about this crunchy mathematics.

    But if the math you refer to assumes that Bob and Charlie have the same “qualitative” causal model and differ only in their beliefs about the correct values of the parameters of that model, do you really believe that the math is relevant to human beliefs about most things, particularly about aspects of reality as complex as politics?

  • Pingback: Overcoming Bias : Beliefs Require Reasons, or: Is the Pope Catholic? Should he be?

  • Pingback: Against Rothbard and Keynes, for Marx « Entitled to an Opinion

  • Pingback: beliefs - StartTags.com

  • Pingback: Beliefs are partly hereditary | The Thinker