Apparently, "probability estimate" is being used as a more tractable, scalar proxy for belief.

If Bob and Charlie share the same causal model of some aspect of reality, except that the model contains parameters (integers or real numbers) and Bob and Charlie differ in their estimates of those parameters, then yes, I am willing to believe that the average of their estimates is more accurate than either of their estimates.
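A minimal sketch of that case, under the added assumption (not stated above) that Bob's and Charlie's estimates are unbiased with independent errors of equal spread; the average then has roughly half the expected squared error of either individual estimate:

```python
import random

random.seed(0)
TRUE_PARAM = 3.7      # hypothetical parameter value, chosen only for illustration
NOISE_SD = 1.0        # both estimates assumed unbiased with the same error spread
N_TRIALS = 100_000

bob_sq_err = charlie_sq_err = avg_sq_err = 0.0
for _ in range(N_TRIALS):
    bob = TRUE_PARAM + random.gauss(0, NOISE_SD)       # Bob's noisy estimate
    charlie = TRUE_PARAM + random.gauss(0, NOISE_SD)   # Charlie's noisy estimate
    avg = (bob + charlie) / 2                          # the averaged estimate
    bob_sq_err += (bob - TRUE_PARAM) ** 2
    charlie_sq_err += (charlie - TRUE_PARAM) ** 2
    avg_sq_err += (avg - TRUE_PARAM) ** 2

print(bob_sq_err / N_TRIALS)      # ~1.0
print(charlie_sq_err / N_TRIALS)  # ~1.0
print(avg_sq_err / N_TRIALS)      # ~0.5: the average beats either estimate
```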

Since there exists a mathematical theory of causal models (in Bayesian networks and structural equation models), it is possible that you are referring to mathematics which describes a way to average causal models having what one might call "qualitative" differences (different number of equations in the two models, different number of terms in two equations, different number or identity of factors in two terms) rather than merely "quantitative" differences (different integral or real coefficients in the equations). If so, I would love to learn more about this crunchy mathematics.

But if the math you refer to assumes that Bob and Charlie have the same "qualitative" causal model and differ only in their beliefs about the correct values of the parameters of that model, do you really believe that the math is relevant to human beliefs about most things, particularly about aspects of reality as complex as politics?

Richard, there is math that supports the claim that taking the average is more accurate than adding a deviation which is not better informed than the average deviation.

George and alex, read ZM.

Grant, sometimes you can't avoid having an opinion if you must take related action, but otherwise sure, avoid uninformed opinions.

I agree that I do not perceive reality well enough to reach my goals, and that the referenced finding that opinions are heritable is evidence of that. I also agree that having an intuitive feeling that the causes of my beliefs are better is not a "good reason" to believe my beliefs are truer than anyone else's. Consequently, since I want to perceive reality as well as possible, I have to start doing something different.

But then, Robin, you go on to make a specific recommendation, namely, just accept the average belief on any topic unless certain conditions apply. Do you not agree that nothing you have said in the OP is evidence for the specific recommendation except for the very weak link that the specific recommendation is doing something? Do you not agree that the following is an invalid argument: there is no reason to believe that your beliefs are better than average, therefore you should adjust your beliefs towards the average?

Naturally I have been exposed to thousands of voices who advocate the specific recommendation, or beliefs with a high conditional dependency on it, such as the belief that the more people involved in a decision, the better the decision will turn out. But how would you persuade someone who does not already believe that counting heads is a good method of getting closer to the truth to go with those thousands of voices, when the alternative of thinking long and hard about the question is available?

George, I think variance in this context is the statistical term referring to the mean of the squares of the differences between the data you're investigating and the mean of the data.

So, if I am not mistaken, we calculate the difference between your opinion and the average opinion, square it, subtract from this the percentage corresponding to the heritability, then take the square root, and add that to the average opinion. In this case, we have:

90 - 50 = 40
40^2 = 1600
1600 - (0.4)(1600) = 960
sqrt(960) ≈ 30.98
50 + 30.98 = 80.98 ≈ 81, which is QED
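The same arithmetic spelled out as a short script, a sketch assuming (as in the steps above) that "taking away 40% of the variance" means shrinking the squared deviation from the average by the heritable fraction and converting back to a deviation:

```python
import math

your_position = 90
average_position = 50
heritable_fraction = 0.40   # the table's 41%, rounded to 40% as in the example

deviation = your_position - average_position              # 90 - 50 = 40
variance = deviation ** 2                                  # 40^2 = 1600
reduced_variance = variance * (1 - heritable_fraction)     # 1600 - 0.4*1600 = 960
reduced_deviation = math.sqrt(reduced_variance)            # sqrt(960) ≈ 30.98
adjusted_position = average_position + reduced_deviation   # 50 + 30.98 = 80.98
print(round(adjusted_position, 2))                         # 80.98, i.e. roughly 81
```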

First. There's a quote from James Fenimore Cooper: "the tendency of democracies is, in all things, to mediocrity". Something I've come to believe as well.

Second. Define 'Truth'. I dare you.

Third. The question we should be asking is: who is providing information to the 'majority' of the cows/sheep with whom we should be aligning ourselves? How did they come by their information/beliefs? I'm way out of my area of expertise, but it seems to me like most are indoctrinated by their government (including public schooling), their church/religious organization, and the media. So maybe the real question is: to what extent has that group actually *thought* about what they believe and why?

This is starting to make more sense now. But how can the mathematical method you added in the original post be useful? Surely if you followed this it would disallow positions of (at least) 0 and 100? Personally I think that's a rather good thing, but is it mathematically sound?

Still, I don't get how you got to 81 from 90...

Facts:
a) Let's say that on a scale of 0 to 100, your position on property taxes is 90

b) The average position on this issue is 50

c) The table says that 41% of belief variation on this topic is genetic

So:
Then to eliminate this genetic component of your beliefs, you might reduce your position from 90 to 81, since this takes away 40% of the variance of your belief relative to the average.

????

How is this happening? What are the actual equations?

Let's say I have 90... while the average is 50... how come this 41% from the "variance of your belief relative to the average" gives a 9, so that I get 81 as a result?

Can you please elaborate on the actual equations that give this value ?

Thanks in advance.

Just another thought: I'd be curious to see how much genes and environments affect people's beliefs on topics they have stakes in (such as investments). I'd imagine there would be some adverse selection which could occur in stakes voluntarily chosen (e.g., someone suckered in by get-rich-quick schemes would necessarily be overly optimistic), but I can't think of any measurable beliefs people have large, involuntary stakes in.

I'd imagine the incidence of random beliefs would be greatly reduced, but by how much?

Shouldn't we be encouraging people not to hold beliefs on topics that aren't within their area of expertise? We don't poll people on how to design airplanes or computer software, so I'm not sure why sociological topics (with regard to decision-making) should be any different.

I understand what Robin is saying about rejecting irrational reasons for holding different beliefs, but doesn't that mask a bigger problem? People are only able to hold irrational beliefs when it's not costly for them to do so. If we are trying to teach people to hold more correct beliefs, why not try to teach them not to hold beliefs they have no stake in? If people actually conformed to this rule of thumb, then it seems to me that the "average belief" would be more accurate, because it would be related to the opinion of experts.

It seems to me that holding no opinion is often a better alternative than accepting or rejecting a majority's opinion.

The classic counterexample to "the majority is reliable" is the speculative bubble. It can be financially dangerous to move your opinion on the price of an investment towards majority opinion during a speculative bubble.

On the other hand, if you reject anything that looks like a speculative bubble out of hand, you might miss the next exponential mode.

These odd opinions fit together...

Rather, Peter said what I quoted, not Hal.

Hal writes: "Your attitude toward people who hold confident beliefs that conflict with the average should be that they have a nontrivial chance of being right."

This makes sense, but what I grapple with is how much an individual confident person should affect my belief, which is the average and reflects the experience of perhaps billions of people, and the opinions of people over time too. I'm not sure I have a sense of how much weight an average confident person's opinion, on an average subject, should carry in moving my opinion away from the average belief.

Another difficulty is that confident people as a class aren't representative of people in general, and might have different values than people in general. Are confident people's values shaped by their extra knowledge (or whatever causes their confidence), or do confident people simply have different values on average than people in general, so that the values they use in forming opinions differ from those of the average person? In other words, the average person, if better informed, would still prefer A over B, whereas the average confident person would prefer B over A.

It's kind of ironic to find majoritarianism promoted on a blog whose contributors have such "eccentric" opinions, even relative to an educated and intelligent group like academics. I suppose it might be the case that you'd be even more eccentric if you weren't adjusting your beliefs to be closer to the average.

What is valuable is to have individuals whose positions are statistically independent of others', not necessarily positions that end up differing.

If you start to hold positions because people around you hold them, you fall into groupthink, and a few minor fluctuations in belief can quickly come to dominate the memesphere. Importantly, the fact that a belief is generally held no longer constitutes evidence that it is valid or that the balance of the evidence supports it. You're dramatically increasing your susceptibility to bias by making your positions dependent on others'.
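A small sketch of why that independence matters, using the assumption (mine, for illustration) that copied beliefs show up as an error component shared by everyone: with independent errors the group average homes in on the truth, but with a shared component it does not.

```python
import random

random.seed(0)
TRUTH = 0.0
N_PEOPLE = 500
N_TRIALS = 2_000

def mean_sq_error_of_average(shared_sd, private_sd):
    """Mean squared error of the group-average belief when each person's
    error is a shared component plus an independent private component."""
    total = 0.0
    for _ in range(N_TRIALS):
        shared = random.gauss(0, shared_sd)    # error everyone inherits, e.g. by copying
        beliefs = [TRUTH + shared + random.gauss(0, private_sd)
                   for _ in range(N_PEOPLE)]
        group_average = sum(beliefs) / N_PEOPLE
        total += (group_average - TRUTH) ** 2
    return total / N_TRIALS

print(mean_sq_error_of_average(0.0, 1.0))  # independent errors: tiny (~1/500)
print(mean_sq_error_of_average(1.0, 1.0))  # shared error: ~1.0, averaging stops helping
```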

We must also consider that most people are irrational, hysterical primates that would rather die than think. Valuing the opinions of such entities is unwise.

The crucial problem is knowing when you have more information about the subject than the society. If you have more information, or think that other people are ignoring information, then you should think differently.
