The Denier’s Dilemna

Another excellent Washington Post article on bias by Shankar Vedantam: 

The federal Centers for Disease Control and Prevention recently issued a flier to combat myths about the flu vaccine … When … social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that … three days later, they remembered 40 percent of the myths as factual. … Most troubling was that people … now felt that the source of their false beliefs was the respected CDC.  … The research, which has been confirmed in a number of peer-reviewed laboratory experiments, has broad implications for public policy. …


The psychological studies highlight … the potential paradox in trying to fight bad information with good information. … once an idea has been implanted in people’s minds, it can be difficult to dislodge. Denials inherently require repeating the bad information, which may be one reason they can paradoxically reinforce it. … In politics and elsewhere, this means that whoever makes the first assertion about something has a large advantage over everyone who denies it later. …

Hearing the same thing over and over again from one source can have the same effect as hearing that thing from many different people — the brain gets tricked into thinking it has heard a piece of information from multiple, independent sources, even when it has not. … People are not good at keeping track of which information came from credible sources and which came from less trustworthy ones, or even remembering that some information came from the same untrustworthy source over and over again. …

Ruth Mayo … found that for a substantial chunk of people, the "negation tag" of a denial falls off with time. … explaining why people who are accused of something but are later proved innocent find their reputations remain tarnished. … So is silence the best way to deal with myths? Unfortunately, the answer to that question also seems to be no.  Another recent study found that when accusations or assertions are met with silence, they are more likely to feel true … Myth-busters, in other words, have the odds against them.

I guess this goes some way to answering Ute’s question, "what evidence in silence."

  • conchis

    Um, isn’t a crucial piece of information missing here, viz. “what percentage of the myths did people think were true before reading the flyer”?

  • Henry V

    Disclaimer: As an economist, I may have a bias towards the research methods of economists (versus psychologists).

    One important reason for some of the success of experimental economics is that participants have an incentive (usually a monetary payoff) that coincides (hopefully) with real world payoffs.

    In the psychological study mentioned above, did the participants have any incentive to get the information right? What was their motive? Likely, they were joes off the street (or more likely from around campus), who may have little incentive to accurately understand information about flu vaccines. They may be low risk.

    I imagine that those participants from high risk groups (e.g., the elderly, pregnant women, if there were any) were much more likely to pay attention to the details and to correct their false beliefs.

    As in most things in life, people respond to incentives.

  • conchis

    Henry V, your reasoning is valid up to a point, but there’s also the potential for vicious cycles here – people’s information about whether they’re high risk must also come from somewhere, and their process of acquiring that information is also likely to be flawed. Indeed, at least one of the targeted myths (“only older people need flu vaccine”) speaks pretty directly to people’s perceived risks. For that reason I’m somewhat agnostic about whether your conjecture would necessarily hold up empirically, though I agree it would be very interesting to know.

  • http://www.hopeanon.typepad.com Hopefully Anonymous

    “Denials inherently require repeating the bad information” I don’t see why that’s inherently true. For example, fundamentalist Islam is a denial of fundamentalist Christianity without “repeating the bad information”. It’s Straussian, but one could hitch “good information” to an optimized myth/narrative/formulation and then sufficiently fill the information market with it. That wouldn’t require either repeating bad information or silence.

    Another thing: I doubt 100% of the volunteers remembered 40% of the “bad information” as factual. Like Henry V, I suspect a significant percentage, greater than 0, accurately recalled the “good information” in the flyer. This group is worthy of study and acknowledgment too. Generally, I think the effect of the flyers across the distribution of volunteers would be more interesting than a reduced, overgeneralized description of the degree to which volunteers ended up thinking the “bad information” was true.

  • http://www.aleph.se/ Anders Sandberg

    I did not find the CDC study, but Norbert Schwarz had some other interesting studies on biases on his pages: http://sitemaker.umich.edu/norbert.schwarz/social_cognition

    In particular, Skurnik, I., Yoon, C., Park, D.C., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713-724.
    http://sitemaker.umich.edu/norbert.schwarz/files/05_jcr_skurnik_et_al_warnings.pdf
    seems to be an earlier test of this tendency. However, as Henry V pointed out, we do not get to see how these false beliefs stand up in a real incentive situation.

    But even mildly held false beliefs can cause further bias by affecting information acquisition. If I think a myth is factual I might not seek out information to change the belief (unlike in the case where I know I don’t know) and I will be biased to accept further confirming evidence.

  • Henry V

    “people’s information about whether they’re high risk must also come from somewhere, and their process of acquiring that information is also likely to be flawed.” –conchis

    Very good point. There are certain risks/activities/products that we don’t know that we should know more about. Therefore our incentive to acquire information may be wrong from the start. Inefficient.

    There are other sorts of information that I think people are very good at knowing whether they need to acquire; so-called “workaday affairs.”

    As a non-pilot, I think it’s unimportant for me to know much about the differing risks associated with different types of single engine planes. If I were in a psychological study, I would have little incentive to study any flyer with information about single engine planes. Whereas, I might read with great interest a flyer about “myths” about airbags and young children (my wife won’t put them in the front seat, but I let my oldest ride in the front).

    But, yes, point taken.

  • jeff gray

    Hopefully-
    Christianity is a precise example of your ‘straussian’ phenomenon: The New Testament is tacked onto the ‘optimized myth/narrative’ structure of the Hebrew scriptures.

  • Brian Moore

    It seems like a better method would be to engage the social mechanisms of mockery and ostracism first, and provide detailed data-based denials later on in the pamphlet. I would imagine that wanting to avoid social stigma is more motivational than whatever weak motivation is not, uh, motivating people to remember the true phrasing of the myths mentioned.

    Therefore our pamphlet should read “Only Stupid Morons Believe That X Isn’t True” and then dissect opposing theories to X (the myths) in the fine print, for the nit-pickers.

    I’m not sure how tongue-in-cheek this post is. 🙂

  • JMG3Y

    An important distinction may be the state of mind of the information consumer. Are they actively seeking the information (“need to know”), or are they an innocent bystander who happens to be splashed by an information stream? In the first case, if they are seeking to solve or prevent a problem of current concern to them, they are more likely a critical consumer and connect the better pieces into a stronger memory context. In the second case, their reconstructive recall may get all tangled up because the pieces weren’t linked strongly to a concern.

    The first case is sort of “pulled information”, which is a strength of the internet; the second is sort of “pushed information”, typically provided by a mass media entity on its schedule rather than the consumer’s. So a reasonable hypothesis would seem to be that the results from surveying people who sought the information off the CDC website would be better than those from volunteers. Otherwise, pushed info fades into the noise.

  • anon

    It seems that many people are missing the point. I think that a little more thought needs to be given to the population of interest. What are your goals? To which population are you trying to optimally communicate information? I don’t believe the CDC’s goal is to target only those who have large incentives to learn the information. Instead, it is more likely to be to dispel myths among the entire population. Even if you don’t have incentives to learn the information, your beliefs can have an effect on others who are potentially high-risk and need accurate information.

    However, with that said, it would be frustrating if attempts to dispel myths among the entire population actually reinforced the myths… I agree that a very important missing piece of information is the subjects’ prior beliefs in the myths. The study should address the change in correct myth/fact categorizations. You would hope that the flyer would have a positive effect, but it is hard to actually determine the effect of the flyer with ONLY the post-test score. Certainly people did not perform as well as we would have liked after reading the flyer, but that doesn’t necessarily mean that the flyer had a negative effect.

    “Most troubling was that people … now felt that the source of their false beliefs was the respected CDC.”

    This doesn’t surprise me. People don’t like to take responsibility and admit that their memory is imperfect; the flyer makes an easy and obvious scapegoat. “Where did I get my beliefs? It must have been from that CDC flyer I just read 3 days ago… wasn’t it?”

    Obviously, there is a limit to our memory recall, especially when an effort isn’t made to transfer the information to our long-term memory. So, part of the question is what is the best medium and way to communicate information so that it is accurately recalled.

  • Stuart Armstrong

    As in most things in life, people respond to incentives.

    You’d think, but…

    According to the “Guinness Book of Flying Blunders” (pretty accurate in most things I’ve checked), in 1939, at the beginning of WW2, the British government sent out a pamphlet to every household emphasising that “IF YOU THROW A BUCKET OF WATER ON A BURNING INCENDIARY BOMB, IT WILL EXPLODE AND THROW BURNING FRAGMENTS IN ALL DIRECTIONS”. However, by the time of the Blitz (1940), most respondents when questioned said that they would deal with an incendiary bomb by, you guessed it, throwing water on it.

    If the threat of getting blown to tiny pieces isn’t enough of an incentive to get people to remember information, I don’t know what is…

  • Stuart Armstrong

    Myth-busters, in other words, have the odds against them.

    The actual Mythbusters use a particular story format to make their message more memorable. Could this be adapted to information flyers? Presenting the lie and the truth, with some sort of memorable contest between the two?

  • anon

    Thanks, Stuart. That was an interesting anecdote.

    The funny thing about Mythbusters is that people constantly contact the show saying that they made a mistake in the experiment performed and should have tried this or that. So even when people are presented with memorable evidence, some still think they know better and find ways to not change their beliefs. I’m not sure if this falls under the first-thing-they-heard bias, or an overconfidence bias in one’s ability to decide what the optimal action is in a given situation, such as an incendiary bomb in your living room. I like calling it the “I know best” bias.

  • Floccina

    Is it not better to teach the truth with no mention of the myth? I.e., flu vaccinations are safe and effective at preventing flu.

  • http://graehl.posterous.com Jonathan Graehl

    “Dilemna” -> dilemma in title 🙂