Why the backfire effect does not explain the durability of political misperceptions
Brendan Nyhan
PNAS April 13, 2021 https://doi.org/10.1073/pnas.1912440117
Abstract
Previous research indicated that corrective information can sometimes provoke a so-called “backfire effect” in which respondents more strongly endorsed a misperception about a controversial political or scientific issue when their beliefs or predispositions were challenged.
I show how subsequent research and media coverage seized on this finding, distorting its generality and exaggerating its role relative to other factors in explaining the durability of political misperceptions.
To the contrary, an emerging research consensus finds that corrective information is typically at least somewhat effective at increasing belief accuracy when received by respondents. However, the research that I review suggests that the accuracy-increasing effects of corrective information like fact checks often do not last or accumulate; instead, they frequently seem to decay or be overwhelmed by cues from elites and the media promoting more congenial but less accurate claims. As a result, misperceptions typically persist in public opinion for years after they have been debunked. Given these realities, the primary challenge for scientific communication is not to prevent backfire effects but instead, to understand how to target corrective information better and to make it more effective.
Ultimately, however, the best approach is to disrupt the formation of linkages between group identities and false claims and to reduce the flow of cues reinforcing those claims from elites and the media.
Doing so will require a shift from a strategy focused on providing information to the public to one that considers the roles of intermediaries in forming and maintaining belief systems.
…
On the issue of climate change, for instance, fact checks and messaging emphasizing the scientific consensus have failed to substantially reduce belief polarization on the issue. Efforts to reduce misperceptions might instead seek to amplify credible voices who share identities or worldviews with groups whose members frequently doubt anthropogenic climate change.
Notable examples include Katharine Hayhoe, an evangelical climate scientist, and Bob Inglis, a former Republican member of Congress turned climate activist. More such advocates are needed, however, including Republican-leaning farmers and corporate leaders who could speak about how climate change is affecting their businesses or former military leaders who could discuss the threats to national security created by climate-related disruptions. While these voices may seem rare, polarization can reverse when fissures emerge in a coalition and elites disavow a previously consensus position.
This approach highlights the key dynamic in countering false beliefs about politics and other controversial issues: the configuration of information flows to the public. Even if backfire effects are rare, fact checking struggles to overcome the inertia of public opinion absent unusually strong evidence that people become aware of and find difficult to deny (e.g., an economic crisis), particularly given the countervailing effects of group identity on issues for which belief polarization is common. Providing corrective information is generally worthwhile and can often improve belief accuracy on the margin, but durably reducing misperceptions will often require changing the cues that people receive from the sources that they most trust.
Doing so will in turn require journalists and science communicators to focus less on communicating directly to the public and more on the intermediaries that are most credible to people who hold or are vulnerable to false beliefs.