By Jayne Williamson-Lee. Mentored and edited by Elizabeth Whittington.
Just as our bodies can build resistance to viral attacks, it’s possible our minds can cultivate defenses against disinformation.
Developing a vaccine for disinformation
To help combat disinformation, Google researcher Beth Goldberg and her colleagues have developed a “vaccine.” Exposing viewers to manipulative messages and promptly debunking them can help people dispel false content before they encounter it in the wild—much like how a vaccine trains our immune system to recognize a virus and prepare for an attack.
“In this case, the attack is not a virus but a manipulation, something intended to compel you to do something that you don’t actually want to do,” said Goldberg, a research program manager at Jigsaw, a unit within Google that explores threats to open societies. During a panel at the annual meeting of the American Association for the Advancement of Science in February, experts, including Goldberg, examined issues of trust and belief arising from the unruly spread of false and unverified information online and how to counter it.
The technique, known as psychological inoculation, is one of several defensive interventions against misinformation and disinformation, which are distinguished by intent. Misinformation is inaccurate information spread inadvertently, while disinformation is a much more insidious campaign to seed false or misleading information for a specific objective, whether financial or political.
In the study, participants’ resilience to manipulative messages improved after they watched 90-second animated video clips illustrating techniques used in political propaganda, such as fearmongering and emotionally charged language. Each participant watched a video warning of an impending attempt at manipulation, then saw weakened examples of misinformation (a “microdose”), followed by counterarguments refuting the manipulative message.
This inoculation sequence trained viewers to recognize propaganda techniques, raising their guard against misinformation and reducing their desire to share such messages compared with a control group. Interestingly, “the inoculation groups worked across different demographic groups, political ideologies, and across different analytical abilities,” Goldberg said.
A 30-second variation of the videos has since proved similarly effective. Goldberg is now releasing the videos more widely through an advertising and social media campaign to prevent people from believing and sharing falsehoods. Additional work with online platforms will target the source of misinformation.
Preventing exposure to and spread of misinformation
“The more you see misinformation, and the more any statement—whether it’s true or false—is repeated, it starts to feel real,” said Lindsay Juarez, Ph.D., director of Irrational Labs, a product design company that creates behavioral change for good. Her team explored ways to identify and reduce the spread of disinformation on social media platforms. “Our approach [was], how can we prevent that exposure?”
Juarez and her team partnered with the user-generated video platform TikTok to flag videos containing unverified information. While the platform already takes down videos depicting false information, it had no measures in place to address the spread of potentially false claims in real time.
Adding a banner reading “Caution: Video flagged for unverified content” to relevant videos reduced views by 5% and likes by 7%. A further intervention targeted users who tried to share these videos, asking them, “Are you sure you want to share this video?” This additional step reduced shares by 24%.
Social media users “want to spread accurate information—they want to be credible,” said Juarez. “And yet in the moment, when you’re scrolling through Twitter [or] putting something on Facebook, other desires come into play.”
Juarez’s team considered how to interrupt users’ desire for likes and shares, or at least make them pause before sharing potentially harmful misinformation.
“It's a small intervention that really does capitalize on refocusing people on accuracy,” said Juarez.
Jayne Williamson-Lee is a technology writer and graduate student at Johns Hopkins. Her work has appeared in Popular Science, Mashable, Psychology Today, and other outlets. Find her website at j-writer.com or follow her on Twitter @jaynew_l.
Image: A “vaccine” against disinformation, known as psychological inoculation, exposes viewers to manipulative messages and promptly debunks them before they encounter them in the wild. Illustration by Emiliano Morales.