This content originally appeared on the American Academy of Family Physicians (AAFP) website and excerpts are being republished with the AAFP’s permission.
This fall, Timothy Caulfield, PhD, a Canadian health law and science professor and best-selling author, addressed America’s family physicians about the rise in misinformation and the need to “move the needle” in slowing its spread.
An Infodemic
Misinformation about the COVID-19 pandemic is rampant, ranging from claims that it’s a bioweapon or that it’s linked to 5G wireless networks to speculation that it (along with Candida auris) made its way to Earth via a meteorite.
“We [are living] in an infodemic,” said Caulfield.
What’s truly incredible, though, is how many people buy into these conspiracies, said Caulfield, citing survey results from the Pew Research Center indicating that 3 in 10 Americans believe the virus was made in a lab.
What’s behind this thinking? Researchers at Carleton University seem to have tapped into a likely cause: social media consumption―or what Caulfield calls “availability bias.”
Social Media: A Land Mine
A growing body of evidence shows that “those who believe in conspiracy theories are more likely to be getting their information from social media.”
What’s particularly enlightening, said Caulfield, is to look at the amount of misinformation coming from celebrities, sports stars, politicians and other prominent people compared with the level of social media engagement that content receives. He calls it a “top-down, bottom-up phenomenon.”
According to research from the University of Oxford’s Reuters Institute for the Study of Journalism, prominent public figures disseminated only about 20% of the misinformation sampled, but that misinformation attracted nearly 70% of all social media engagements seen in the sample.
“So, it’s bottom-up in that all of us ― all of us ― we’re spreading that misinformation on social media,” Caulfield noted. “It’s important to recognize that because that gives us a sense of what we have to do to stop the spread of misinformation.”
The use of slick video techniques is a particularly effective way to spread misinformation. The first Plandemic movie (yes, there were actually two of them), for example, commanded a far greater viewing audience than did The Office reunion or Taylor Swift’s City of Lover Concert. Its high-quality documentary presentation and convincing testimonials lent it an inordinate amount of credibility, Caulfield said, and helped advance the many conspiracy theories the film touted.
Superspreaders (of Misinformation)
Another important source of misinformation, according to Caulfield? Superspreaders. “There are entities out there, there are individuals out there that have had a disproportionate impact on the public discourse, who have had a large impact on the spread of misinformation,” he said.
Not surprisingly, anti-vaccination activists have jumped on the bandwagon of naysayers when it comes to coronavirus. “It’s fascinating to see how the anti-vaxxers are positioning their message,” Caulfield observed. “Since the very early days, they tied their message to an ideological approach, perhaps even an intuitively appealing ideological idea. The idea of liberty, choice, freedom. We know research tells us that if you can do that, if you can link misinformation to an ideologically appealing concept, it’s more likely to spread.”
The bad news: This approach has been very successful, creating harmful effects in multiple areas. It’s causing physical harm, promoting stigma and discrimination, sparking property damage (yes, people are actually pulling down cell towers), wreaking havoc on science and health policy, and creating an overall chaotic and divisive information environment.
The good news: The same approach can be appropriated to disseminate accurate, reliable information. It basically comes down to overpowering bad science with good science, according to Caulfield. And that requires strenuously adhering to salient aspects of scientific discourse and discovery, from highlighting the use of rigorous methodology to faithfully relating how evidence is interpreted and translated into appropriate findings.
Fighting misinformation and promoting transparency need to happen across all fronts.
Debunking Works
“I think debunking works,” said Caulfield. But don’t harbor unrealistic expectations, he cautioned. “It’s really important to highlight, here, that what we’re trying to do is move the needle. No one strategy is going to fix everything.”
How to go about debunking scientific falsehoods? Don’t be dissuaded by concerns about the so-called backfire effect, the belief that pushing back against an idea only leads to further entrenchment among those who hold that view. The idea was espoused in a 2010 study that has since been contested, he emphasized, and more recent research has undermined its validity. So, although it can’t be completely discounted, it shouldn’t stand in the way of attempts to counter false information.
Such efforts are more likely to succeed when conducted by experts and trusted information sources before the misinformation becomes too firmly entrenched. Other keys to successful rebuttal are using compelling facts and highlighting the rhetorical techniques used to push misinformation and support denialism, Caulfield said.
“What I mean by that is saying, ‘Look, this is a conspiracy theory.’ ‘Look, they’re misrepresenting the risk involved.’ ‘Look, they’re relying on testimonials.’ So, highlight the good data, the good science, and point out the rhetorical devices that are used to push the misinformation.”
TIPS ON DEBUNKING MISINFORMATION
- Provide the science.
- Use clear, shareable content.
- Reference trustworthy and independent sources.
- If possible, reference expert consensus (as well as the fact that science evolves, when appropriate).
- Be nice, authentic, empathetic and humble.
- Consider using a narrative.
- Highlight gaps in logic and rhetorical tricks.
- Make facts the hook, not the misinformation.
- Remember: Your audience is the general public, not the hardcore denier.
Pause First
“We want to get people just to pause before they share,” noted Caulfield. Getting people to stop long enough to consider how truthful a headline is can disrupt the cycle of perpetuating misinformation, according to a study published in the Harvard Kennedy School Misinformation Review earlier this year. More recently, a study from researchers at the University of Regina in Saskatchewan found this concept held true for COVID-19 misinformation.
People typically aren’t setting out to spread misinformation deliberately. “You know, we all seem to think that people have this nefarious agenda ― no, that’s not the case. Most people want to be accurate,” he said. “So, if we can just get people to pause and embrace a culture of accuracy, we can have an impact, we can move that needle. And that’s what we really want to do.”
Pause first. Check. Share later.