
Zuckerberg on Vaccines, or The Shortcomings of Our Responses to Vaccine Misinformation

When Mark Zuckerberg walked into a hearing on Facebook’s proposed cryptocurrency last week, he probably was not expecting questions about his views on vaccines. Yet, the hearing quickly changed topic when Congressman Bill Posey (R-FL) asked Zuckerberg: “Are you 100% confident that vaccines pose no injury to any person on this planet?”

The question arose after a lengthy introduction by Congressman Posey that managed to portray anti-vaccination views as harmless and worthy of unfettered dissemination through social media, while conflating the federal program used to compensate claims of vaccine-related injuries with scientific proof of actual injury. The fact that unscientific, emotion-driven assertions continue to permeate the discourse of policymakers addressing the regulation of social media is troubling. At a time when vaccine-preventable diseases like measles are making a comeback, misinformation spread through social media can have devastating consequences. Reports in the U.S. and abroad indicate that inaccurate information about vaccines—either questioning vaccine safety and efficacy, or propagating discredited pseudo-scientific information—spreads much faster through social media than does credible information. Social media amplify inaccurate vaccine speech. And while most of the U.S. population is pro-vaccine, anti-vaccination advocates have a much more active presence in the social media universe, and thus anti-vaccine messages tend to ring louder and longer.

Recently, researchers uncovered an additional layer complicating the problem of vaccine misinformation in cyberspace. Malicious automated programs (popularly known as bots) have been used to spread vaccine misinformation on social media since at least 2014. The apparent goal of those who created these bots is to increase the amount of anti-vaccine content available online. Mimicking behavior observed in connection with political speech during recent campaigns in the United States, Russian trolls have also been active online, spreading both pro- and anti-vaccine messages. The goal here is to coat anti-vaccine arguments with a veneer of plausibility and legitimacy as a means to sow further discord among Americans.

This makes it even more important that U.S. policymakers not perpetuate counter-narratives that lack scientific support and run against the message conveyed by multiple institutional regulators with expertise in the subject. As a society, we collectively face a multi-layered problem with implications for public health. The way we go about addressing it cannot rest on false dichotomies. When Mr. Zuckerberg shared his view that scientific consensus indicated that “people should get their vaccines,” Congressman Posey followed up by asking: “[S]houldn’t somebody have the opportunity to express an opinion different from yours?” In its simplistic formulation, the Congressman’s second question is clickbait designed to elicit a quick “yes.” But when vaccine misinformation is the topic, the framing of the problem needs to be more sophisticated. Certain kinds of speech—including certain opinions—can inflict enormous damage.

Recall the case of Ethan Lindenberger, an Ohio teenager who grew up without receiving recommended childhood vaccines but decided to get vaccinated at age 18, against his mother’s wishes. The major source of the misinformation fueling his mother’s anti-vaccine stance? Facebook, or more precisely, content made widely available through Facebook.

Facebook rolled out a new vaccine misinformation policy in early 2019. It has started (1) to reduce the ranking of groups and pages conveying inaccurate vaccine information, (2) to exclude these groups from recommendations and predictions, and (3) to remove controversial targeting options related to vaccines. More recently, the company has proposed to reduce or eliminate funding streams for these groups and pages. In addition to imposing restrictions on misinformation, Facebook is also promoting vaccine-related content from public health organizations. These steps have drawn praise from entities like the World Health Organization.

However, as I have written elsewhere, these steps are not enough. Facebook may be erecting some barriers to locating vaccine misinformation, but the company continues to allow misinformation to populate its pages and groups. While Zuckerberg reassured the members of Congress present at the hearing that Facebook will not “go out of its way” to recommend pages containing vaccine misinformation, he also reiterated that the company will not hide or remove such misinformation. If a user searches for it, the content remains there to be found. Nor will Facebook prevent users from joining groups promoting vaccine misinformation.

The gaps in Facebook’s current approach are made starker in an environment in which, as mentioned above, social media users espousing anti-vaccination views are disproportionately more vocal and more driven than pro-vaccination ones. For example, I rarely stumbled upon vaccine misinformation on social media until I started researching topics related to vaccines. Now, even after the measures taken by Facebook and other companies, I have no trouble finding it as I monitor social media for anti-vaccine content, simply by typing words at the core of the anti-vaccination vocabulary.

This is not to say that what Facebook has committed to doing is irrelevant. It is certainly a start, albeit one that needs improvement. But it illustrates more broadly the shortcomings of industry self-regulation, even in the face of mounting public criticism and, worse still, a public health crisis in the making. For many diseases, herd immunity—protection against a disease across a community through the presence of a high percentage of immunized individuals—requires that more than 90% of a community be vaccinated or otherwise immune. As vaccination rates fall below these thresholds in the United States, we have already witnessed the first wave of outbreaks of previously eliminated, highly contagious diseases like measles.
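To put the “more than 90%” figure in context: a standard epidemiological approximation (not drawn from the original post) estimates the herd-immunity threshold for a disease from its basic reproduction number R₀, the average number of people one infected person infects in a fully susceptible population. Measles’ R₀ is commonly cited in the range of 12 to 18, which yields a threshold of roughly 92 to 94 percent:

$$
H = 1 - \frac{1}{R_0}, \qquad R_0 \approx 12\text{–}18 \;\Longrightarrow\; H \approx 1 - \tfrac{1}{12} \approx 0.92 \;\text{ to }\; 1 - \tfrac{1}{18} \approx 0.94
$$

This is why even a modest dip in measles vaccination rates, of just a few percentage points, can reopen the door to outbreaks.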

This is why, in addition to the limitations inherent in industry responses to misinformation problems, we cannot afford congressional policy and lawmaking based on false dichotomies and spurious inquiries into private views on vaccination. This is not a case of contrasting opinions freely circulating in the online marketplace of ideas. It is about understanding how to regulate the availability and dissemination of scientifically unsupported information capable of producing detrimental effects on public health.

What Mark Zuckerberg thinks about vaccines is irrelevant. That he was asked—in the context of a congressional hearing on social media regulation—is alarming. But easing the pressure on social media, and on our elected representatives, to address digital vaccine misinformation more aggressively could be the greatest mistake of all.