Map helps chart your place in the disinformation universe

By Samantha Martinusen. Mentored and edited by Nicola Jones.

People are connected through language, music, the Internet, and social media. Online platforms are made up of communities of people who are interconnected, both within a given platform and across platforms. And now, there is a map that illustrates the interconnectedness of the online world. These connections are being mapped to help tackle the spread of fake news and disinformation: false information spread deliberately to influence public opinion.

“We are here to navigate misinformation. What do you need to navigate? A map,” said George Washington University physics professor Neil Johnson when presenting the work at the American Association for the Advancement of Science virtual annual meeting on 18 February.

Johnson said he hopes the work will help social media platforms to better target disinformation and stop its spread. His team plans to release an app that will allow individuals and companies to see where they lie in the misinformation universe.

The spread of disinformation is “terrifying stuff, but also kind of fascinating from a science perspective,” said Danny Rodgers, co-founder and chief technology officer of the Global Disinformation Index, a non-profit organization based in London, England, with the mission of defunding, disrupting, and down-ranking disinformation sites. Johnson’s work “could not be more impactful”, he said.

Johnson and colleagues developed the map by scanning over 25 million posts and active accounts on platforms including Facebook, VKontakte, Instagram, Gab, Twitter, and Telegram. They looked for key words relating to a particular thread of disinformation—such as, in the case of the COVID-19 pandemic, “Chinese Zombie Virus”—and identified groups of people, or communities, that were sharing and discussing the same topic across platforms. They then mapped how those communities are connected through reactions to the information, such as “liking” a post or re-posting it.

The distance between communities, or nodes on the map, represents the strength of these connections. The result is a spider-web of billions of people, grouped into like-minded communities, revealing the average paths that disinformation takes when traveling between them.
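
To give a rough sense of how such a map can be assembled, here is a minimal sketch using the open-source networkx library. The community names, platforms, and edge weights are invented placeholders for illustration, not the team’s actual data or code.

```python
# A minimal sketch, using the open-source networkx library, of the kind of
# cross-platform community graph described above. The community names,
# platforms, and edge weights are invented placeholders, not the team's data.
import networkx as nx

G = nx.Graph()

# Each node is a community (e.g., a Facebook group or Telegram channel),
# tagged with the platform it lives on.
for name, platform in [
    ("fb_group_a", "Facebook"),
    ("vk_group_b", "VKontakte"),
    ("tg_channel_c", "Telegram"),
    ("gab_group_d", "Gab"),
]:
    G.add_node(name, platform=platform)

# Edge weights count how often members of one community "liked" or re-posted
# content from another; heavier weights mean stronger connections.
G.add_edge("fb_group_a", "vk_group_b", weight=120)
G.add_edge("vk_group_b", "tg_channel_c", weight=45)
G.add_edge("fb_group_a", "gab_group_d", weight=10)

# A force-directed layout pulls strongly connected communities closer
# together, which is roughly how "distance" on such a map encodes strength.
positions = nx.spring_layout(G, weight="weight", seed=42)

# Paths through the graph (by hop count here) approximate the routes a piece
# of disinformation can take between communities on different platforms.
print(nx.shortest_path(G, "fb_group_a", "tg_channel_c"))
```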

Johnson has been working to understand the flow of information and hate speech for years. In 2019, his team published a paper in Nature showing that communities promoting positive online speech tend to cluster in a tight information network, while hate speech communities are not concentrated in any one area of the map. In 2021, they published another Nature paper showing that disinformation is not centralized on one social media platform. This dispersion of hate speech communities creates a complex web of interconnections.

The mapping effort serves two purposes, said Johnson: “it establishes a new science to study human turbulence and serves as a tool to establish interventions.” Johnson hopes that once these connections are laid out on a map, platforms can begin to work out how to cut off the flow.

Social media sites continue to experiment with different tactics to moderate their content, including flagging sensitive subjects, tagging posts that contain certain keywords with links to accredited sources, and tightening account verification. Johnson noted that many social media platforms currently intervene by removing the loudest voices spreading disinformation. But this, he said, is no more effective than lopping the top off an iceberg: “it doesn’t matter if you chop off the top, the stronger connections at the base will continue to rise and replace the removed voices.” Instead, he said, the more effective strategy is to target the connections between communities across different platforms.
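
In graph terms, the two approaches roughly correspond to targeting high-degree nodes versus targeting the bridge edges that link otherwise separate communities. The toy sketch below, again using networkx on an invented network, is meant only to illustrate that distinction, not the team’s actual method.

```python
# A toy illustration (not the team's actual method) of the two intervention
# targets described above: the "loudest" accounts correspond roughly to the
# highest-degree nodes, while cross-platform connections correspond to bridge
# edges linking otherwise separate communities. The graph is invented.
import networkx as nx

# Two dense communities (imagine one per platform) joined by a short path.
G = nx.barbell_graph(10, 2)

# Target 1: the loudest voices, i.e. the nodes with the most connections.
loudest = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:3]
print("highest-degree nodes:", [node for node, _ in loudest])

# Target 2: the bridges, i.e. edges whose removal cuts the flow between
# communities; these are the kinds of connections Johnson suggests targeting.
print("bridge edges:", list(nx.bridges(G)))
```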

Johnson’s team is currently developing an app, called Map of Activity Across Platforms (MAAP), that they plan to make publicly available within the next year or so. Users will be prompted to plug in their social media handles and preferred methods of media consumption. The app will then plot their location in the misinformation multiverse. Users will be able to explore their closest connections and see where disinformation typically comes from. Media platforms should be able to use the app to discover which community groups have strong disinformation-focused connections and so might be usefully targeted with intervention strategies.

“The problem with misinformation is how to deal with it at scale? How do we navigate?” said Johnson. He hopes his map is a start.

Samantha Martinusen is a student journalist at the University of Florida, where she is pursuing a Ph.D. in Chemical Engineering. Her SciCom interests lie in navigating the space between hard science and the public, with an emphasis on making STEM discoveries accessible to all. Samantha is an NASW student member. Follow her on Twitter @sammartinusen or send her an email at sammartinusen@gmail.com.
