
In what many are calling our “post-truth” world, emotions tend to trump facts in the public’s views on policy and politics. The way people think about issues from climate change to GMOs to the Black Lives Matter movement reflects their preexisting worldviews. As a result, people create media echo chambers filled with the beliefs, values, and information that support what they already believe, which in turn accelerates the spread of false and politicized information.

In this context, dispelling false information is one of the greatest hurdles for public interest advocates. Recent studies show that merely reading false information repeatedly can make it seem more true. Communicating the facts about climate change, public health, and human rights has never been harder. Many well-resourced special interest groups have a stake in spreading politicized information that minimizes or casts doubt on the negative effects of their industries’ activities on the public’s health and well-being.

A dive into cognitive, behavioral, and social science suggests potential tactics to counter these special interest groups’ efforts. As the research director for frank, an organization housed in the University of Florida College of Journalism and Communications, I work on a team that aggregates and shares the science of how people think and make decisions to improve strategic communication in the public’s interest. One particular tactic continues to show positive results as a way to ward off false information. Scholars refer to this tactic as “inoculation.”

Just as vaccines inoculate the human body against infectious diseases, messaging that warns audiences of false and politicized information, coupled with information that debunks the false claims, can protect audiences in the long run. I like to think of this as playing offense rather than defense. Rather than waiting to defend your cause, research suggests that we should preemptively prepare people to counter false and politicized information.

It is good practice to anchor this preemptive messaging in scientific consensus. Contrary to popular belief, people across the partisan divide consider scientific evidence when forming policy beliefs and making decisions. Telling and showing audiences that scientists are in consensus increases support for those positions.

For example, the scientific consensus that climate change is a result of human activity has been well established—although climate skeptics often deny this consensus, citing uncertainty in the scientific community about the exact impact of human activity as reason to downplay the evidence. Sander van der Linden, a social psychologist at the University of Cambridge, and his colleagues at the Yale Program on Climate Change Communication recently conducted a series of experiments to test the effectiveness of messages that countered false information questioning the climate science consensus. This research found that telling people there is consensus among scientists about the role of human activity in climate change increased the public’s belief in that consensus by 20 percent. Exposure to contradictory information weakened the belief by 9 percent, but an inoculation message delivered before the contradictory information reinforced people’s belief in the consensus, preserving up to two thirds of the consensus message’s effect. For example, a message warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists” protected the public’s support for the consensus message.

However, evidence and consensus that resoundingly support your cause and solution are not enough. It is also critical that advocates employ digestible and engaging stories. Most people cannot connect with statistics and data alone. Rather, people are more likely to engage with your cause when you couple messages that warn audiences about false information with stories about the facts or issue at hand. You can build empathy, support, and even action for your cause by telling stories that transport your audience into the experiences of those affected by the issues. To reach people who are otherwise skeptical of your data, it is useful to tell stories that resonate with your audience’s deeply held beliefs and highlight the local impact on their communities.

For example, public health advocates often have to go up against large and well-funded industries. “Powerful groups such as tobacco, soda, and pharmaceutical companies typically outspend health policy advocates by a considerable margin,” write researchers Jeff Niederdeppe, Kathryn Heley, and Colleen L. Barry. “This is likely to generate more frequent and recent exposures to industry anti-policy frames, which tend to emphasize personal responsibility for health, the threat of overzealous government intrusion, and the benefits of industry self-regulation.”

Niederdeppe and his colleagues tested the effectiveness of both inoculation and narrative messages supporting government regulatory policies: restrictions on the sale of soda in schools, graphic warning labels on cigarettes, and limits on the marketing of prescription painkillers to doctors. Some participants read an inoculation message, while others read a narrative message about either sugary beverages, cigarettes, or prescription painkillers. Inoculation messages warned the reader of threats from these industries and also refuted arguments against the policy. One message, for example, read: “Big soda companies spend all of this money … because they don’t think people can make up their minds on their own. But we all have the freedom to think for ourselves about where we stand on policies that would reduce childhood obesity.”

Narrative messages, on the other hand, told stories about a woman named Cynthia and her daughter. In the stories, Cynthia’s daughter either struggled with weight problems as a result of soda or experimented with cigarettes, or Cynthia had an addiction to prescription painkillers. “Each story ... placed Cynthia’s struggles in a broader context (‘many parents face similar challenges’), and described how three specific policies could help to address the problem. The stories explicitly acknowledged personal/parental responsibility but emphasized the role that policies could play in reducing the impact of industry marketing,” the researchers wrote.

The researchers tested the effectiveness of inoculation and narrative messages immediately and one week after participants’ exposure to them. The narrative messages offset the immediate impact of messages opposing the given policy, but their influence declined slightly over time. Participants who read the inoculation message were less likely to be persuaded by the messages against the policy and were more likely to counter them one week later. The results suggest that a preemptive combination of narrative and inoculation messages can be effective in preparing audiences to resist false or politicized information.

Research suggests that corrective messages that debunk false claims can also reinforce support for the scientific consensus. In one study, political scientist Leticia Bode and communication scholar Emily K. Vraga tested the impact of corrective messages on perceptions of the safety of genetically modified organisms (GMOs). The authors note that while the controversy and false information around GMOs are relatively new, experts agree that GMOs do not pose any health risks. They first assessed participants’ attitudes toward GMOs and then presented them with a simulated Facebook feed containing a mix of posts, including a news story that shared false information (“GMOs make you sick”). Afterward, some participants saw a series of links in the simulated “Related Stories” section that appears beneath an article on Facebook. Some participants saw stories from the American Medical Association debunking the false information, others saw stories from Snopes.com reinforcing it, and some saw a combination of both or unrelated stories. Participants who already believed that GMOs were safe did not change their beliefs when presented with false information about GMOs. However, for participants who held incorrect beliefs about GMOs before the experiment, the links debunking the claim that GMOs are unsafe reduced their misconceptions.

In a time marked by increased polarization, advocates must employ the best of what we know from science to effectively refute politicized and false claims. Behavioral, cognitive, and social science provides a number of insights that can be used in campaigns and communication efforts. Across multiple disciplines, research suggests that a combination of playing offense by communicating scientific consensus, sharing stories, and inoculating audiences is one way to protect the facts and the issues we care about most in a post-truth world.
