Anti-vaccination content on Facebook extends beyond the thoroughly debunked myth that vaccines cause autism, a new study reports. Instead, a team of researchers found four flavors of anti-vaccine misinformation that may discourage parents from vaccinating their children.
The findings, published today in the journal Vaccine, suggest that there isn’t a “one-size-fits-all” approach to curbing the spread of vaccine hesitancy, according to the study’s senior author Brian Primack, a professor at the University of Pittsburgh. Instead, understanding why parents are reluctant to vaccinate their kids will be critical in fighting vaccine-preventable diseases like measles, which continues to spread through pockets of unvaccinated people across the country. The recent outbreaks have prompted a closer look at the proliferation of anti-vaccine misinformation on social media platforms, including Facebook.
Historically, anti-vaccination rhetoric has focused on misplaced fears about autism after a fraudulent study by disgraced researcher Andrew Wakefield claimed there was a link. The study has since been retracted, and studies continue to report no such link. Today’s study reports that misinformation about vaccines on Facebook appears to have multiplied beyond fears of autism to include four main themes: mistrust of science and government agencies; fear of safety risks; belief in conspiracy theories; and support of alternative disease treatments. The researchers also found that the same stories and videos from anti-vaccination groups tend to recirculate among people who oppose vaccines.
The findings make sense to vaccine expert Peter Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine, who was not involved in the research. “It began focusing on autism, but now it’s moving into other areas,” Hotez says. “It tends to confirm the depth and breadth of how Facebook is promoting the anti-vaccine movement.” Jeff Hancock, founding director of the Stanford Social Media Lab, who also wasn’t involved in the research, is less convinced that this particular study proves Facebook’s culpability. “I don’t think it tells us much about what role social media is playing,” he says. “I think it tells us more about these individuals, which is very valuable.”
The idea for the study started with a rush of anti-vaccination comments on a 90-second informational video about the human papillomavirus (or HPV) vaccine, which prevents certain kinds of cancer. A month after the Kids Plus Pediatrics clinic in Pittsburgh posted the video on Facebook, “distinctly anti-vaccination” comments, as the researchers called them, began flooding in. These included threatening or extremist comments along the lines of “you’ll burn in hell for killing babies,” or “you have been brainwashed.” After about eight days, the tide of anti-vax sentiment petered out.
A team of researchers led by Beth Hoffman, a research assistant at the University of Pittsburgh’s Center for Research on Media, Technology, and Health, wanted to understand who the people posting these comments were, and what was driving them. “Not everyone concerned about vaccines has the same concerns,” Hoffman says. Understanding the range of those concerns could help health professionals and scientists reach people who are hesitant or afraid of vaccines. Hoffman wanted to know: “How can we create pro-vaccine messages that resonate with these four different types of beliefs?”
After the University of Pittsburgh approved the team’s approach, Hoffman and her colleagues pored over the comments posted during that eight-day span. There were “distinctly anti-vaccination” comments from nearly 800 different profiles, so the team picked a random sample of 197 to study in depth. They scoured the profiles for information about age, gender pronouns, political affiliation, and whether the posters were parents. They looked through two years of publicly available posts and information. If a post included a video, Hoffman and her colleagues watched it. If it linked to a website, they checked it out.
The team found that these anti-vaccination profiles were scattered across nine different countries, and 36 states within the US. (The most common state was California, with Texas coming in second.) Of the 55 profiles with obvious political affiliations, 31 of them, or 56 percent, supported Donald Trump, and six of them, or 11 percent, supported Bernie Sanders. But despite the wide geographic range, and some differences in political views, the posters shared some key characteristics. Most, for example, appeared to be parents. Most also went by female gender pronouns — an interesting finding in light of the Daily Beast’s discovery that anti-vaccination advertisers on Facebook specifically targeted demographics likely to include mothers.
Most also posted similar content at least once over those two years, and the themes overlapped: 73 percent of the anti-vaccine posts purported to share “scientific” information about the dangers of vaccines; 71 percent warned of supposed conspiracies — like that the government and Big Pharma have covered up vaccines’ harms in a push for more profit; and 69 percent claimed vaccines were linked to poorer health. Of course, the truth is that vaccines are safe and effective. Providing free vaccines to kids whose families can’t afford them saves lives and billions of dollars. And vaccine-preventable diseases are far, far more dangerous than vaccines.
When the researchers ran a social network analysis, they found that the posters fell into four different groups, each with a particular theme. One emphasized alternative and homeopathic treatments for diseases — such as eating yogurt to cure HPV. (“There is no treatment for the virus itself,” the CDC says, but infections can be prevented with the vaccine.) Another worried about the safety and morals of vaccination. A third group valued civil liberties and mistrusted science. And a fourth group promoted conspiracy theories, like the claim that the polio virus isn’t real. (It is real, and it paralyzed thousands of people in the US every year before widespread use of the vaccine.)
The study did have some limitations, including that it analyzed only 197 profiles. It’s possible that looking at a bigger population might reveal different themes. The researchers also couldn’t say for certain whether these profiles represented real people, although Hoffman says that they didn’t see any evidence that they were bots or fraudulent accounts. Still, even if they were real, people might not present accurate portraits of themselves on social media — which could skew the results. And Hancock at the Stanford Social Media Lab points out that this study doesn’t get into the role of social media in spreading misinformation, but rather identifies characteristics shared by people who post it.
Hancock thinks these findings could help strategically disseminate public health interventions so they reach the same people targeted by anti-vaccination campaigns. “Fight fire with fire,” he says. Along those lines, the study authors suggested that boosting media literacy might help curb the spread of misinformation on social media, and encouraged health professionals to speak up about vaccines online.
That, however, can be a daunting prospect. Both health professionals and parents of kids who died of vaccine-preventable diseases have been the targets of virulent harassment campaigns on social media, according to recent reports by the LA Times and CNN. Hotez says there’s only so much the medical community can do, and urges social media platforms to get involved. In what Hotez calls a “cosmetic” fix, Facebook recently outlined plans to remove anti-vax pages and groups from its recommendations, but it wouldn’t remove the content altogether. “They have not done the real work that needs to be done to disarm the anti-vax social media empire,” Hotez says. “And until we figure out a way to dismantle some of those components, this will continue.”