On February 14, 2019, Representative Adam Schiff (D-CA) sent a letter to the CEOs of Google and Facebook asking for more action against the spread of misinformation about vaccines. In it, he wrote:
“There is strong evidence to suggest that at least part of the source of this trend is the degree to which medically inaccurate information about vaccines surface on the websites where many Americans get their information, among them YouTube and Google search. As I have discussed with you in other contexts, and as you have acknowledged, the algorithms which power these services are not designed to distinguish quality information from misinformation or misleading information, and the consequences of that are particularly troubling for public health issues. If a concerned parent consistently sees information in their YouTube recommendations that casts doubt on the safety or efficacy of vaccines, it could cause them to disregard the advice of their children’s physicians and public health experts and decline to follow the recommended vaccination schedule. Repetition of information, even if false, can often be mistaken for accuracy, and exposure to anti-vaccine content via social media may negatively shape user attitudes towards vaccination.
Additionally, even parents and guardians who seek out accurate information about vaccines could unwittingly reach pages and videos with misinformation. A report by the Guardian found that on both Facebook and YouTube, suggested searches related to vaccines often led users to pages or groups providing medically and scientifically inaccurate information.”
In response, both Google and Facebook promised to do more to curtail the spread of anti-vaccine misinformation on their platforms. However, the first social media platform to act was neither a Google nor a Facebook property. Pinterest, a social media platform for sharing images and the information that comes with them, announced that it would no longer allow searches for vaccine-related terms. (Pinterest works by letting members post images and other media to their boards, which others can then view and share.)
Soon after Pinterest’s announcement, YouTube (owned by Google) announced that anti-vaccine channels and videos on its platform would no longer be able to run ads or otherwise receive money from viewers. YouTube works by letting members post videos on a variety of topics, with restrictions on videos that contain violence, nudity or other adult content, or attacks against individuals and groups. Admittedly, some videos slip through its filters, but Google insists that it works diligently to identify and remove videos that violate its terms. With this new policy, anti-vaccine individuals and groups are cut off from a major revenue stream.
Facebook, for its part, has announced that it will tweak its recommendations to make anti-vaccine ads and groups less visible without banning them outright. Facebook ads can be bought by almost anyone, and Facebook allows those ads to be targeted to specific groups of people. A small business in Philadelphia, for example, does not need to advertise to the world; it may only need to reach potential customers in Philadelphia. By the same token, an anti-vaccine group can target its ads to parents seeking information on vaccines.
Of course, there are many other ways for parents to encounter unreliable anti-vaccine information on the internet. Other social media platforms, such as Twitter and Instagram (owned by Facebook), have made no announcements regarding anti-vaccine misinformation. And there are video-hosting sites competing with YouTube that will host, and even monetize, videos that would violate YouTube’s terms of service. Still, growing public pressure from those concerned about infectious diseases seems to be having an effect. It remains to be seen whether other online platforms will stop being vehicles for the spread of misinformation that can harm children.