13 Comments

Why all but one, rather than simply all?

@Paul Ganssle

Yes, that occurred to me too.

Either a problem or an additional conclusion, depending on what you believe: the other poles picked their co-believers as the most credible too. Both individualists and hierarchicalists did so, and we seem to be running with the assumption that this generalizes well beyond those two groups. Since the poles contradict each other, all but one of them must be wrong.

You can conclude that all but one are systematically deluded, and I'm not dismissing that possibility.

I would have thought the more risk-averse would be more hostile to nanotech, regardless of what the experts said. However, there may not be much correlation between individualism and willingness to take risks at a society-wide level. (The risk inherent in individualism is to your own position in society, from your nonconformist behaviour, not to society as a whole.)

The question of trusting someone (to take major decisions wisely) because their value system is close to one's own is central to some of the disagreements there have been on this blog. Do I trust a Singularitarian or a Libertarian to make major decisions concerning the future of Humanity? How similar are their value systems to mine? It would take a more thorough discussion of value systems and their meanings to answer those questions; right now I have my doubts. Such a discussion should NOT be buried as a side issue in a presentation of Bayesian probability calculations.

Related to Michael's comment that people evaluating the study presumably knew little about the subject matter: this actually doesn't seem like a very bad way of evaluating facts. Of course, it's not a substitute for actual understanding, but when making a cursory evaluation of something, I tend to be more willing to believe people who I know think like me, simply because they are the most likely to reach the conclusions I would reach if I had enough information to make a reasonable decision. So suppose my friend who is super into astrology and elaborate conspiracy theories runs up and tells me to take all my money out of the bank before next week because something horrible is going to happen, while my other friend, who tends to think things out, tells me that this is most likely NOT going to happen. Even though both provided little useful information, if you asked me to decide right then, I would say that I am not going to withdraw my money from the bank.

Of course, I wouldn't make a decision based on two people just telling me things I don't understand, but if forced to choose, as I might be in a study situation, I would believe the person who I think shares my worldview: I consider my worldview well-reasoned, so by my estimation the person who shares it is the most likely to evaluate the data reasonably. The trick, however (and I'm not saying I can always do this), is not to let those initial impressions bias my answer.
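
To make the heuristic concrete, here is a toy Bayesian version of the bank example in Python. The reliability numbers are invented purely for illustration; the point is only the shape of the calculation.

```python
# Toy Bayesian update on testimony: how much should each friend's
# warning move my belief that "something horrible will happen"?
# All probabilities below are made up for illustration.

def posterior(prior, p_assert_if_true, p_assert_if_false):
    """P(event | friend asserts event), by Bayes' rule."""
    num = p_assert_if_true * prior
    return num / (num + p_assert_if_false * (1 - prior))

prior = 0.01  # my initial credence that the bank fails next week

# Conspiracy-minded friend: asserts disasters nearly as readily
# when none is coming as when one is.
print(posterior(prior, p_assert_if_true=0.9, p_assert_if_false=0.5))   # ~0.018

# Friend who thinks things out: rarely asserts without good evidence.
print(posterior(prior, p_assert_if_true=0.9, p_assert_if_false=0.05))  # ~0.154
```

On these made-up numbers, the careful friend's word moves my credence an order of magnitude further than the conspiracist's, which is all the "trust people who think like me" heuristic amounts to.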

What you value seems to cause you to gather facts that support your values; I think of the Academic Clusters post here at overcomingbias.com. Should you assume people with different values might know something you don't? And how much do differing values reflect different tastes, as opposed to knowledge of different facts?

Michael, thanks, fixed the link.

The link given is to a different and only vaguely related article.

Whether or not people listen to a given expert on nanotechnology might have nothing to do with how they interpret certain "facts" in general. I'm assuming the people in the study knew little about the dangers (or lack thereof) of nanotechnology, and so had to take the experts' views at face value. The less you know about something, the more easily you are swayed by who the expert is rather than by her arguments. One hopes that, in general, we can build our worldviews on topics we have become well-versed in, and make more informed decisions.

Peter, I often think of individualism as much more of a risk-averse philosophy than hierarchicalism:

I agree with what you're saying, but one could also argue that fearful people will tend to follow the herd. This is an empirical question, which could be settled by asking the people in this experiment to rate themselves as risk-tolerant or risk-averse. (Although I suppose one's risk tolerance level depends on the topic. The more I think about experiments of this type, the less clear the conclusion becomes.)
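
If anyone ran that check, the analysis itself would be trivial. A sketch in Python, where the arrays of 1-7 self-ratings are hypothetical stand-ins for the experiment's actual survey responses:

```python
# Sketch of the proposed check: correlate self-rated risk tolerance
# with individualism. The ratings below are hypothetical; the real
# test would use the study participants' responses.
from statistics import correlation  # Pearson's r, Python 3.10+

# 1-7 self-ratings: higher = more individualist / more risk-tolerant
individualism  = [6, 2, 5, 7, 3, 1, 4, 6, 2, 5]
risk_tolerance = [5, 3, 6, 7, 2, 2, 4, 5, 1, 6]

print(f"Pearson r = {correlation(individualism, risk_tolerance):.2f}")
# An r near zero would undercut the risk-aversion reading of the study.
```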

Peter, I often think of individualism as much more of a risk-averse philosophy than hierarchicalism: the idea being that if power is vested in a central authority, we are all in for a huge disaster if the authority is seriously mistaken about something, whereas if everyone is allowed to go her own way, some people will surely suffer disaster, but some others will triumph.

Granted, strong nanotech is likely to be an exception to this.

Perhaps it might be a good idea if, instead of limiting public discussion only to political views, we more frequently spent time talking about these meta-beliefs which frame our worldviews, and which lead to unbridgeable gaps in our political opinions?

This is what George Lakoff is saying:

http://en.wikipedia.org/wik...

Perhaps "individualists" are people who are willing to tolerate higher levels of risk when the potential gains are high, and "people who believed in hierarchy" are people who are more conservative and averse to risk. So, as a layperson, with little understanding of the technical issues, I look for the expert whose risk tolerance/aversion level matches my own, and I echo their decision. The conclusion we should draw from the experiment depends on the correlation between the individualist/hierarchicalist distinction and the risk-tolerant/risk-averse distinction. Intuitively, I would expect a high correlation. So perhaps this is about risk-tolerance, rather than facts versus values.

It does look like many of our long-running political disagreements are really about our philosophies: deeply held convictions that are difficult even to consciously identify and describe, but which nevertheless dictate our thoughts, opinions, and actions.

Perhaps it might be a good idea if, instead of limiting public discussion only to political views, we more frequently spent time talking about these meta-beliefs which frame our worldviews, and which lead to unbridgeable gaps in our political opinions?
