May 2, 2012

How a Post on Biased Thinking Used Research Twisted to Fit Its Own Bias

Posted in Uncategorized at 2:08 pm by mariawolters

Last week, journalist Martha Gill published a column on the New Statesman’s Current Account blog about the way in which our own biases affect how we process information. She based her argument not just on her own observations, but also on a scientific study.
Unfortunately, the actual results of the study are not nearly as neat as Gill would like them to be.

Before we start, let me be clear: studies can be distorted at any step on the way from the researchers’ fertile minds to the page or blog post, with the most common culprit being (misreported) press releases. I don’t know at which point the paper Gill cites got twisted beyond its results; my analysis is based purely on the way she reported the study.

Gill versus Nyhan/Reifler

A study published in the journal Political Behaviour shows just how reluctant people are to engage with facts that don’t support their world-view.

Notice that the reference is not given; for fact fans, it’s Nyhan, Brendan, & Reifler, Jason (2010). When Corrections Fail: The Persistence of Political Misconceptions. Political Behavior, 32, 303–330. The paper is freely available online, as far as I can see. Ben Goldacre explains much better than I can why this matters: it allows readers to check whether conclusions were reported correctly.

In the experiment, conducted in 2005, participants were given fake news stories.

There were two experiments, one in 2005 and one in 2006, which between them consisted of a total of four studies. This is very important, as we will see later.

These news stories were embedded with false facts: that tax cuts under the Bush administration increased government revenues, that weapons of mass destruction had been found in Iraq and that Bush had banned stem-cell research (he only limited some government funding).

The first experiment, in autumn 2005, looked at correcting people’s impression that Iraq had weapons of mass destruction; the second, in spring 2006, repeated that study and added two more topics, tax cuts and stem cell research. In the first experiment, the story with the “false facts” came from the Associated Press; in the second, half the participants were shown stories that purported to come from the New York Times, a notoriously liberal paper, while the other half saw exactly the same story attributed to Fox News, a notoriously conservative news source.

After each statement, the researchers put in an unambiguous correction – and then tested the participants to see if they picked this up.

The supposedly unambiguous correction was in fact a paragraph in the same story that reported findings from a relatively objective source contradicting the key claim of the opening paragraphs. So what we have here is not a claim followed by an authoritative correction, but rather a classic “he said/she said” structure, where journalists present both views.

Here’s the original text from Experiment 1, together with the correction.


Wilkes-Barre, PA, October 7, 2004 (AP)—President Bush delivered a hard-hitting speech here today that made his strategy for the remainder of the campaign crystal clear: a rousing, no-retreat defense of the Iraq war. Bush maintained Wednesday that the war in Iraq was the right thing to do and that Iraq stood out as a place where terrorists might get weapons of mass destruction. ‘‘There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks, and in the world after September the 11th, that was a risk we could not afford to take,’’ Bush said.

[Correction]
While Bush was making campaign stops in Pennsylvania, the Central Intelligence Agency released a report that concludes that Saddam Hussein did not possess stockpiles of illicit weapons at the time of the U.S. invasion in March 2003, nor was any program to produce them under way at the time. The report, authored by Charles Duelfer, who advises the director of central intelligence on Iraqi weapons, says Saddam made a decision sometime in the 1990s to destroy known stockpiles of chemical weapons. Duelfer also said that inspectors destroyed the nuclear program sometime after 1991.

[All subjects]
The President travels to Ohio tomorrow for more campaign stops.

(Nyhan & Reifler, 2010, p. 324f.)

Can you see how easy it is to reframe this as a piece that just presents two different points of view? Your evaluation of the correction will depend largely on your view of the CIA, Big Government, and bureaucrats who write reports.

Back to Gill’s column:

They didn’t. Participants who identified themselves as liberal ignored the correction on stem-cell regulations and continued to believe Bush had issued a total ban. Conservatives not only ignored the corrections on Iraq and the tax cuts but clung even more tenaciously to the false information. Facts had made things even worse.

Well, what actually happened?

First of all, this was never about observing people actually changing their opinion; the researchers are clear that that is future work. Instead, the study used a between-subjects design: half the participants read the story with the correction before they answered the question, and half read the story without the correction.

The question participants answered measured agreement with the misperception, so any shift in opinion is inferred by comparing the two groups:


Question to participants:
Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.

  • Strongly disagree [1]
  • Somewhat disagree [2]
  • Neither agree nor disagree [3]
  • Somewhat agree [4]
  • Strongly agree [5]

(Nyhan & Reifler, 2010, p. 325)

The authors looked at four predictors of opinion: whether participants had seen the correction, their ideology (on a scale from liberal to conservative), how much they knew about politics, and whether their ideology affected their reaction to the correction (the interaction between correction and ideology). When they used all of these predictors to model participants’ answers to the question, by far the largest effect was political knowledge. In the first study, they also found a clear effect of ideology: the correction was more likely to work the more liberal participants were, and it backfired for conservatives. Conservatives who read the corrected text were more likely to believe that Iraq indeed had weapons of mass destruction than conservatives who didn’t.
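
To make this concrete, here is a minimal sketch in Python of the kind of analysis described above. It is not the authors’ code or data: it simulates a between-subjects correction experiment (all numbers, variable names, and scales are invented for illustration) and models agreement with the misperception as a function of the correction condition, ideology, political knowledge, and the correction × ideology interaction. A positive interaction term is what a “backfire” among conservatives looks like in such a model; plain OLS is used only to keep the sketch short, rather than the ordinal models a five-point response item would properly call for.

```python
# Minimal sketch: simulated data, not Nyhan & Reifler's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400

# Hypothetical coding: 0/1 correction condition, ideology from
# 1 (very liberal) to 7 (very conservative), rough knowledge score.
correction = rng.integers(0, 2, n)
ideology = rng.integers(1, 8, n)
knowledge = rng.normal(0, 1, n)

# Simulate a "backfire" pattern: the correction lowers agreement for
# liberals, helps less and less towards the conservative end of the
# scale, and eventually backfires (positive interaction).
latent = (3.0
          + 0.3 * (ideology - 4)                 # conservatives agree more overall
          - 0.8 * correction                     # correction helps on average...
          + 0.35 * correction * (ideology - 4)   # ...and backfires on the far right
          - 0.4 * knowledge                      # more knowledge, less misperception
          + rng.normal(0, 1, n))
agreement = np.clip(np.rint(latent), 1, 5)       # squash onto the 5-point Likert item

df = pd.DataFrame({"agreement": agreement, "correction": correction,
                   "ideology": ideology, "knowledge": knowledge})

# OLS with a correction x ideology interaction; a positive, significant
# interaction is the statistical signature of the backfire effect.
model = smf.ols("agreement ~ correction * ideology + knowledge", data=df).fit()
print(model.summary().tables[1])

# Predicted agreement with and without the correction across the
# ideology scale; where the corrected line rises above the uncorrected
# one, the correction has backfired.
grid = pd.DataFrame({"ideology": np.tile(np.arange(1, 8), 2),
                     "correction": np.repeat([0, 1], 7),
                     "knowledge": 0.0})
grid["predicted"] = model.predict(grid)
print(grid.pivot(index="ideology", columns="correction", values="predicted"))
```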

In the second experiment, in spring 2006, where they looked at the weapons of mass destruction issue again, that backfire effect for conservatives was gone. The only way they could replicate it was to look at a small subset of their participants who had stated that Iraq was most important to them, but that’s a post-hoc analysis (in other words, fishing for results). There are many possible explanations for this, but the authors argue that the main reason was the change in public opinion between autumn 2005 and spring 2006. So, far from showing that journalists are powerless, the results actually suggest that persistent corrections across a range of media might be far more powerful than a single story in a point/counterpoint scenario.

When the researchers used a text about tax cuts, however, they saw the same “backfire effect” in conservatives that they had observed in the autumn 2005 study: strong conservatives who read the text with the correction were more likely to believe the incorrect claim than strong conservatives who didn’t see the correction. When Nyhan and Reifler looked at liberals’ reactions to a text about stem cell research, there was no such backfire effect; liberals tended to resist the correction, but they didn’t show an increased conviction that the opposite was true.

What about people who are neither strongly liberal nor strongly conservative? Well, in all four of the studies Nyhan and Reifler conducted, the effect of ideology was gradual, i.e. centrists were far less influenced by ideology than people with either strongly liberal or strongly conservative views.

(By the way, all participants in this study were undergraduates at a Catholic university; the researchers say that their study needs to be repeated with a more representative sample of the population.)

So, What Are We To Make Of This?

People’s perceptions can change, but they don’t change based on reading a single contradictory story, even if it comes from supposedly trustworthy sources. (In fact, in the second experiment, the purported source of the story, New York Times vs. Fox News, made no difference to the results at all.)

What we need to do is understand why people persist in their beliefs despite contrary evidence. In particular, we need to look at

• how much people already know (which explained a lot of the variation in the data)
• how persistent their beliefs are
• what motivates their reasoning

So what can journalists do? Quite a bit, as it turns out.

First of all, once a belief has been formed, it tends to persevere. So journalists and news media who break stories have a unique opportunity to affect people’s beliefs about the issues they report, because first impressions are likely to stick (cf. Ross and Lepper’s (1980) work on belief perseverance, cited after Nyhan & Reifler, 2010).
Secondly, and most importantly, if misinformation persists, keep correcting it, and seek as many allies as possible to spread the correct information. People who are “confronted with information of sufficient quantity or clarity … should eventually acquiesce to a preference-inconsistent conclusion” (Ditto & Lopez, 1992, p. 570).

So if you are a journalist or an activist, and Gill’s column discouraged you, take heart. Change happens, but it takes patience and persistence. Keep going!

Additional References

Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions. Journal of Personality and Social Psychology, 63(4), 568–584.

Ross, L., & Lepper, M. R. (1980). The perseverance of beliefs: Empirical and normative considerations. In R. A. Shweder (Ed.), Fallible judgment in behavioral research: New directions for methodology of social and behavioral science (Vol. 4, pp. 17–36). San Francisco: Jossey-Bass.
