(Reuters) - Alphabet Inc’s YouTube said on Wednesday it would remove videos containing misinformation about COVID-19 vaccines, expanding its existing rules against falsehoods and conspiracy theories about the pandemic.
The video platform said it would now ban any content with claims about COVID-19 vaccines that contradict consensus from local health authorities or the World Health Organization.
YouTube said in an email that this would include removing claims that the vaccine will kill people or cause infertility, or that microchips will be implanted in people who receive the vaccine.
(The story has been refiled to change references to blog post to email)
A YouTube spokesman told Reuters that general discussions in videos about “broad concerns” over the vaccine would remain on the platform.
YouTube says it already removes content that disputes the existence or transmission of COVID-19, promotes medically unsubstantiated methods of treatment, discourages people from seeking medical care or explicitly disputes health authorities’ guidance on self-isolation or social distancing.
Conspiracy theories and misinformation about the new coronavirus vaccines have proliferated on social media during the pandemic, including through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms.
Although drugmakers and researchers are working on various treatments, vaccines are at the heart of the long-term fight to stop the new coronavirus, which has killed more than a million people, infected more than 38 million and crippled the global economy.
In its email, YouTube said it had removed over 200,000 videos related to dangerous or misleading COVID-19 information since early February.
Andy Pattison, manager of digital solutions at the World Health Organization, told Reuters that the WHO meets weekly with the policy team at YouTube to discuss content trends and potentially problematic videos. Pattison said the WHO was encouraged by YouTube’s announcement on coronavirus vaccine misinformation.
The company also said it was limiting the spread of COVID-19-related misinformation on the site, including certain borderline videos about COVID-19 vaccines. A spokesman declined to provide examples of such borderline content.
YouTube said it would be announcing more steps in the coming weeks to emphasize authoritative information about COVID-19 vaccines on the site.
Reporting by Elizabeth Culliford; Editing by Chizu Nomiyama and Steve Orlofsky