Facebook Bans Debunked Claims About COVID-19 Vaccines

Photo: A customer walks past a sign indicating that a COVID-19 vaccine is not yet available at Walgreens, Dec. 2, 2020, in Long Beach, Calif. (Ashley Landis / AP)

Facebook is banning claims about COVID-19 vaccines that have been debunked by public health experts, as governments prepare to roll out the first vaccinations against the virus.

The ban covers posts that make false claims about the vaccines' safety, efficacy, ingredients, and side effects.

“For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list,” Facebook’s head of health, Kang-Xing Jin, said in a blog post. “We will also remove conspiracy theories about COVID-19 vaccines that we know today are false: like specific populations are being used without their consent to test the vaccine’s safety.”

The new ban is an expansion of Facebook’s rules against misinformation about the coronavirus that could lead to imminent physical harm. The company said it removed 12 million such posts from Facebook and Instagram between March and October.

The policy on COVID-19 vaccines is a departure from Facebook's general handling of vaccine misinformation. The company has made false claims about other vaccines less visible on its platform but has stopped short of removing them. In October, it banned anti-vaccination ads.

Facebook said it was extending the policy because COVID-19 vaccines will soon be rolled out around the world. The U.K. became the first country to approve a vaccine this week, with the first doses expected to be available next week. Regulators in the U.S. are expected to approve vaccines before the end of the year.

On Monday, Facebook CEO Mark Zuckerberg said the company would show users “authoritative information” about vaccines. It's adding details about how vaccines are tested and approved to its coronavirus information center, a hub on its site that promotes credible sources.

YouTube, which is owned by Google, and TikTok have also said they will remove false claims about COVID-19 vaccines.

Despite efforts by Facebook and other platforms to curb the spread of hoaxes and conspiracy theories, misinformation about the pandemic has spread widely on social media this year.

Editor’s note: Facebook, Google and TikTok are among NPR’s financial supporters.

Copyright 2020 NPR. To see more, visit https://www.npr.org.