YouTube said it broadened its medical misinformation policies to prohibit content that includes false claims and conspiracy theories about any approved vaccine, not just those for COVID-19. In addition, the video giant said it kicked high-profile anti-vaxxers off YouTube, including Robert F. Kennedy Jr. and osteopathic physician Joseph Mercola.
Specifically, YouTube is banning content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines,” according to a blog post.
YouTube’s guidelines already prohibited certain types of medical misinformation (such as claims that drinking turpentine can cure diseases). The platform also developed new policies around COVID-19 and medical misinformation, and says that since last year it has removed more than 130,000 videos for violating its COVID-19 vaccine misinformation policies.
Why did it take YouTube until now to block all anti-vaccine content? “Developing robust policies takes time,” YouTube VP of global trust and safety Matt Halprin said in an interview with the Washington Post. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”
As with YouTube’s COVID guidelines, the video platform said it consulted with local and international health organizations and experts in developing the new vaccine-related policies.
YouTube added that there are “important exceptions” to its new guidelines. For example, it will continue to allow videos about vaccine policies, new vaccine trials and historical vaccine successes or failures, as well as personal testimonials relating to vaccines — as long as the video doesn’t violate other YouTube Community Guidelines and the channel does not show “a pattern of promoting vaccine hesitancy.”