Despite the overwhelming evidence that vaccines work, experts fear anti-vaxxers might soon outnumber those who believe in the science – with Facebook largely to blame.
Researchers tracking how the anti-vaccination movement has exploded in recent years have found that small but highly active groups on the world’s biggest social media site are spreading misinformation and unscientific messages about vaccines faster than health agencies and scientists can keep up.
“There is a new world war online surrounding trust in health expertise and science, particularly with misinformation about COVID-19, but also distrust in big pharmaceuticals and governments,” said data scientist Neil Johnson of George Washington University, who led the research.
“Nobody knew what the field of battle looked like, though, so we set out to find out.”
What they found was a network “of unprecedented intricacy that involves nearly 100 million individuals partitioned into highly dynamic, interconnected clusters across cities, countries, continents and languages”.
While anti-vaccination clusters are presently vastly outnumbered by people who believe in the science, they “manage to become highly entangled with undecided clusters in the main online network, whereas pro-vaccination clusters are more peripheral”.
The researchers found there are three times as many anti-vaccination groups on Facebook as pro-vaccination ones, and they tend to contain far more “content” – the authors notably avoid the word “information” to describe it – than pro-vaccination communities, which tend to focus on public health benefits.
In other words, on Facebook you’re generally more likely to come across anti-vaccination content than pro-vaccination.
“We thought we would see major public health entities and state-run health departments at the center of this online battle, but we found the opposite. They were fighting off to one side, in the wrong place,” said Prof Johnson.
The researchers developed a mathematical model to figure out where this would lead, and the result wasn’t promising.
“Our theoretical framework reproduces the recent explosive growth in anti-vaccination views, and predicts that these views will dominate in a decade,” the study, published in the journal Nature, found.
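The study’s equations aren’t reproduced in this article, but the kind of dynamics it describes can be illustrated with a toy competition model: two camps recruiting from a shared pool of undecided users, with the smaller camp recruiting faster because it is more entangled with the undecided. The sketch below (in Python) is purely illustrative; the starting sizes, recruitment rates and crossover point are invented placeholders, not figures from the Nature paper.

```python
def simulate(years=10, steps_per_year=52,
             pro=10e6, anti=4e6, undecided=80e6,
             k_pro=0.05, k_anti=0.35):
    """Toy competition model: each camp recruits from the undecided pool at a
    rate proportional to its own size and the undecided fraction.
    k_anti > k_pro stands in for the finding that anti-vaccination clusters
    are more entangled with undecided clusters. All numbers are invented."""
    dt = 1.0 / steps_per_year
    history = []
    for step in range(int(years * steps_per_year)):
        total = pro + anti + undecided
        d_pro = k_pro * pro * (undecided / total) * dt
        d_anti = k_anti * anti * (undecided / total) * dt
        # Never recruit more people than remain undecided.
        available = min(undecided, d_pro + d_anti)
        share = available / (d_pro + d_anti) if (d_pro + d_anti) > 0 else 0.0
        pro += d_pro * share
        anti += d_anti * share
        undecided -= available
        history.append((step * dt, pro, anti, undecided))
    return history

if __name__ == "__main__":
    history = simulate()
    crossover = next((t for t, p, a, _ in history if a > p), None)
    final_t, pro, anti, und = history[-1]
    print(f"Year {final_t:.1f}: pro={pro/1e6:.1f}M, "
          f"anti={anti/1e6:.1f}M, undecided={und/1e6:.1f}M")
    if crossover is not None:
        print(f"In this toy run, the anti camp overtakes pro at ~year {crossover:.1f}")
```

With a big enough recruitment-rate advantage, the initially smaller camp overtakes the larger one well within the decade, which is the qualitative pattern the researchers warn about; the real model, of course, works at the level of interacting Facebook clusters rather than two aggregate camps.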
Worryingly, more people sign up to anti-vaccination views during disease outbreaks.
“Anti-vaccination clusters show the highest growth during the measles outbreak of 2019, whereas pro-vaccination clusters show the lowest growth,” the study found.
Prof Johnson said their research could help public health agencies develop “an entirely new set of strategies to identify where the largest theaters of online activity are and engage and neutralise those communities peddling in misinformation so harmful to the public”.
Facebook has refused to ban false anti-vaccination content on its platforms, despite the World Health Organization listing the movement as one of the planet’s top 10 threats to health. The social network, which also owns Instagram and the messaging service WhatsApp, says it has “de-ranked” anti-vaxxer groups in its algorithms and added links to reliable sources at the top of search results.
“For example, if a group or page admin posts this vaccine misinformation, we will exclude the entire group or page from recommendations, reduce these groups’ and pages’ distribution in news feed and search, and reject ads with this misinformation,” spokesperson Monika Bickert said last year.
CEO Mark Zuckerberg told the US Congress later that year that Facebook wouldn’t stop people exercising “freedom of expression”.
“If someone wants to post anti-vaccination content, or if they want to join a group where people are discussing that content, we don’t prevent them from doing that.”
A University of Otago study in March found that half of all Facebook search results for information about vaccines returned ‘negative’ content, far more than Google (20 percent negative) and YouTube (15 percent).
Once a person clicks on one anti-vaccine article or Facebook page, algorithms make it likely they’ll see more, the Kiwi researchers said.
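Facebook’s ranking systems are proprietary, so the exact mechanism isn’t public. As a rough, hypothetical illustration of the feedback loop the Otago researchers describe, the Python sketch below shows a naive content-based recommender: after a user clicks one conspiracy-tagged item, similar items rise to the top of what they’re shown next. The article IDs, tags and scoring are all invented for illustration.

```python
from collections import Counter

# Illustrative only: a naive content-based recommender. Real platform ranking
# is proprietary and far more complex; this just shows how click history can
# feed back into what gets surfaced next.
ARTICLES = {
    "a1": {"vaccine", "measles", "public-health"},
    "a2": {"vaccine", "side-effects", "conspiracy"},
    "a3": {"vaccine", "conspiracy", "big-pharma"},
    "a4": {"vaccine", "schedule", "public-health"},
}

def recommend(clicked_ids, k=2):
    """Score each unseen article by its tag overlap with everything clicked so far."""
    profile = Counter()
    for aid in clicked_ids:
        profile.update(ARTICLES[aid])
    scores = {
        aid: sum(profile[tag] for tag in tags)
        for aid, tags in ARTICLES.items() if aid not in clicked_ids
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# One click on a conspiracy-tagged article pushes the other conspiracy item
# to the top of the recommendations.
print(recommend({"a2"}))  # ['a3', ...]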