According to the YouTube Official Blog, the platform is banning ‘all anti-vaccine content on its platform, including misinformation about approved vaccines for common illnesses in addition to COVID-19’. This reportedly includes ‘several video channels associated with high-profile anti-vaccine activists including Joseph Mercola and Robert F. Kennedy Jr., who experts believe are partially responsible for the spread of anti-vaxx misinformation’.
Matt Halprin, YouTube’s vice president of global trust and safety, told The Post that YouTube didn’t act sooner because it was focused on misinformation specifically about coronavirus vaccines.
“Developing robust policies takes time. We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.” – Matt Halprin, YouTube’s vice president of global trust and safety (source: The Washington Post)
Writing in The Washington Post, Gerrit De Vynck reports: “As part of a new set of policies aimed at cutting down on anti-vaccine content on the Google-owned site, YouTube will ban any videos that claim that commonly used vaccines approved by health authorities are ineffective or dangerous. The company previously blocked videos that made those claims about coronavirus vaccines, but not ones for other vaccines like those for measles or chickenpox.”
According to The Post, Mercola, an alternative medicine entrepreneur, and Kennedy, a lawyer and the son of Sen. Robert F. Kennedy, have both been faces of the anti-vaccine misinformation movement for years.
Even though Facebook, Twitter and YouTube all reportedly banned coronavirus misinformation early in the pandemic, false claims continue to run rampant across all three platforms, writes The Post.
Lisa Fazio, an associate professor at Vanderbilt University who studies misinformation, told The Post: “YouTube is the vector for a lot of this misinformation. If you see misinformation on Facebook or other places, a lot of the time it’s YouTube videos. Our conversation often doesn’t include YouTube when it should.”
Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley, told The Post: “You create this breeding ground and when you deplatform it doesn’t go away, they just migrate. This is not one that should have been complicated. We had 18 months to think about these issues, we knew the vaccine was coming, why was this not the policy from the very beginning?”
“These conspiracy theories don’t just go away when they stop being on YouTube,” Farid told The Post. “You’ve created the community, you’ve created the poison, and then they just move onto some other platform.”
“Like always, the devil’s in the details. How well will they actually do at pulling down these videos? But I do think it’s a step in the right direction,” Fazio told The Post.
COVID-19 vaccination watch
According to the US Centers for Disease Control and Prevention (CDC) COVID-19 vaccination tracker page (as of October 2, 9 a.m.), 478,362,045 doses have been distributed and 394,690,283 doses administered. According to MDH COVID-19 Response vaccine data (as of September 29), a total of 6,458,044 doses of COVID-19 (Pfizer and Moderna) vaccines have been administered in Minnesota. According to the latest MDH tally (as of October 1), confirmed COVID-19 cases in Minnesota stand at 714,790 (out of 12,562,842 people tested), with 8,170 deaths.