Nanotech Views Value Driven

In another experiment conducted with the Washington-based Project on Emerging Nanotechnologies at the Woodrow Wilson International Center for Scholars, Kahan found that when volunteers heard about the risks of nanotechnology from different experts, they gravitated toward the views of experts who seemed to share their personal values — individualists followed the lead of experts who appeared to be individualists, while people who believed in hierarchy were most likely to be influenced by experts who espoused similar views. Once volunteers decided which experts were most like them, it did not make a difference whether the experts said nanotechnology was risky or safe — either way, the volunteers agreed with them. … When people clash on hot-button issues, their disagreements may have more to do with clashing values than facts. One person may conclude nanotechnology is dangerous while another person concludes it is safe, but neither realizes their conclusions are being driven by underlying values that have nothing to do with nanotechnology.

That is from the Post. Of course, regarding policy conclusions, all else equal it does make sense to listen more to people who share your values. But it seems a shame if your views about facts reflect nothing more.

  • It does look like many of our long-running political disagreements tend to be disagreements that are really about our philosophies – deeply held convictions that are difficult to even consciously identify and describe, but which nevertheless dictate our thoughts, opinions, and actions.

    Perhaps it might be a good idea if, instead of limiting public discussion only to political views, we more frequently spent time talking about these meta-beliefs which frame our worldviews, and which lead to unbridgeable gaps in our political opinions?

  • Perhaps “individualists” are people who are willing to tolerate higher levels of risk when the potential gains are high, and “people who believed in hierarchy” are people who are more conservative and averse to risk. So, as a layperson, with little understanding of the technical issues, I look for the expert whose risk tolerance/aversion level matches my own, and I echo their decision. The conclusion we should draw from the experiment depends on the correlation between the individualist/hierarchicalist distinction and the risk-tolerant/risk-averse distinction. Intuitively, I would expect a high correlation. So perhaps this is about risk-tolerance, rather than facts versus values.

  • Perhaps it might be a good idea if, instead of limiting public discussion only to political views, we more frequently spent time talking about these meta-beliefs which frame our worldviews, and which lead to unbridgeable gaps in our political opinions?

    This is what George Lakoff is saying:

  • Z. M. Davis

    Peter, I often think of individualism as much more of a risk-averse philosophy than hierarchicalism: the idea being that if power is vested in a central authority, we are all in for a huge disaster if the authority is seriously mistaken about something, whereas if everyone is allowed to go her own way, some people will surely suffer disaster, but some others will triumph.

    Granted, strong nanotech is likely to be an exception to this.

  • Peter, I often think of individualism as much more of a risk-averse philosophy than hierarchicalism:

    I agree with what you’re saying, but one could also argue that fearful people will tend to follow the herd. This is an empirical question, which could be settled by asking the people in this experiment to rate themselves as risk-tolerant or risk-averse. (Although I suppose one’s risk tolerance level depends on the topic. The more I think about experiments of this type, the less clear the conclusion becomes.)

  • Michael

    This link given is for a different and only vaguely related article.

    Whether or not people listen to a given expert on nanotechnology might have nothing to do with how they interpret certain “facts” in general. I’m assuming the people in the study knew little about the dangers (or lack thereof) of nanotechnology, and so had to take the experts’ views at face value. The less you know about something, the more easily you are swayed by who the expert is, and not necessarily by her arguments. One hopes that in general we can become well-versed in many topics, build our worldview on that, and make more informed decisions.

  • Michael, thanks, fixed the link.

  • What you value seems to cause you to gather facts that support your values. I think of the Academic Clusters post here. Should you assume people with different values might know something you don’t know? How much do differing values reflect different tastes, and how much knowledge of different facts?

  • Paul Ganssle

    Related to Michael’s comment that people evaluating the study presumably knew little about the subject matter, this actually doesn’t seem like a very bad way of evaluating facts. Of course, it’s not a substitute for actual understanding, but if I am making a cursory evaluation of something, I tend to be more willing to believe people who I know think like me, simply because I think they are most likely to reach the same conclusions I would reach if I had enough information to make a reasonable decision. So suppose my friend who is super into astrology and elaborate conspiracy theories runs up and tells me to take all my money out of the bank before next week because something horrible is going to happen, while my other friend who tends to think things out tells me that this is most likely NOT going to happen. Even though both provided little useful information, if you asked me to decide right then, I would say I am not going to withdraw my money from the bank.

    Of course, I wouldn’t make a decision based on two people just telling me things that I don’t understand, but if forced to choose, as I might be in a study situation, I would believe the person who I think shares my worldview. I consider my worldview well-reasoned, and so the one who shares it is, by my estimation, most likely to evaluate the data reasonably. The trick, however (and I’m not saying I can always do this), is not to let those initial impressions bias my answer.

  • tcpkac

    The question of trusting someone (to take major decisions wisely) because their values system is close to one’s own is central to some of the disagreements that there have been on this blog.
    Do I trust a Singularitarian or a Libertarian to make major decisions concerning the future of Humanity? How similar are their values systems to mine? It would need a more thorough discussion of values systems and their meanings to answer those questions; right now I have my doubts.
    Such a discussion should NOT be buried as a side issue in a presentation of Bayesian probability calculations.

  • Colin Reid

    I would have thought the more risk-averse would be more hostile to nanotech, regardless of what the experts said. However, there may not be much correlation between individualism and willingness to take risks at a society-wide level. (The risk inherent in individualism is to your own position in society, due to your nonconformist behaviour, not to society as a whole.)

  • @Paul Ganssle

    Yes, that occurred to me too.

    Either a problem or an additional conclusion, depending on what you believe: each pole picked its co-believers as the most credible too. Both individualists and hierarchicalists did so, and we seem to be running with the assumption that this generalizes well beyond nanotechnology. All but one pole must be wrong.

    You can conclude that all but one are systematically deluded, and I’m not dismissing that possibility.

  • Unknown

    Why all but one, rather than simply all?