Wednesday 29 September 2021

YouTube BANS prominent anti-vaxxers, including Robert F. Kennedy Jr and Joseph Mercola, and will remove videos peddling misinformation about Covid, MMR and chickenpox vaccines and ALL other jabs

 YouTube has vowed to remove videos that contain misinformation about all vaccines, including jabs for Covid-19, chicken pox and MMR.

This move expands its policies around health misinformation which had been strengthened during the coronavirus pandemic.

As well as removing specific videos that violate the new policy, YouTube will terminate the channels of high-profile users who spread misinformation about vaccines, and has already acted to remove Robert F. Kennedy Jr and Joseph Mercola.

Kennedy is one of the most high-profile proponents of the debunked theory that vaccines cause autism, while Mercola, an alternative medicine entrepreneur, has been broadly critical of vaccines and has promoted alternative therapies.

The Google-owned video platform said its ban on Covid-19 vaccine misinformation, introduced last year, has resulted in 130,000 videos being removed so far.

However, the firm said its policies needed to go further to clamp down on broader false claims about other vaccines appearing online.

The most prominent spreader of anti-vaccine content was Joseph Mercola (pictured), an alternative medicine entrepreneur who has over 3.6m followers across Facebook, Instagram and Twitter
Robert F Kennedy Jr (pictured in August 2020, speaking during a protest against coronavirus-related measures), a prolific anti-vaccine campaigner and son of former US attorney general Robert Kennedy

Anti-vaccine content creator Joseph Mercola (pictured left), an alternative medicine entrepreneur, and Robert F Kennedy Jr (right) are among those set to be banned by YouTube

Under the new rules, any content which falsely alleges that any approved vaccine is dangerous and causes chronic health problems will be removed, as will videos that include misinformation about the content of vaccines.

Social media and internet platforms have been repeatedly urged to do more to tackle the spread of online misinformation.

Millions of posts have been blocked or taken down and a number of new rules and prompts to official health information have been introduced across most platforms.

However, critics have suggested not enough has been done to slow the spread of harmful content since the start of the pandemic.

YouTube said it was taking its latest action in response to seeing vaccine misinformation begin to branch out into other false claims.

'We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,' the firm said in a blog post.

'We're now at a point where it's more important than ever to expand the work we started with COVID-19 to other vaccines.'

It will also remove videos with claims that vaccines do not reduce transmission or contraction of disease.

YouTube also said it will take down videos containing misinformation about the substances in vaccines.

'This would include content that falsely says that approved vaccines cause autism, cancer or infertility,' YouTube wrote in the blog post shared by Google. 

YouTube is playing catch-up with other social media firms: Facebook banned misinformation about all vaccines seven months ago.

However, despite that misinformation ban, the pages of Mercola and Kennedy remain active on both Twitter and Facebook.

As well as removing misinformation, YouTube is working with official sources, including the National Academy of Medicine and the Cleveland Clinic, to bring more authoritative health videos to the platform.

YouTube's head of health care, Garth Graham, told the Washington Post that the goal was to get scientific information front and centre, in the hope of catching viewers' attention before they get caught in a web of misinformation and anti-vaxx videos.

'There is information, not from us, but information from other researchers on health misinformation that has shown the earlier you can get information in front of someone before they form opinions, the better,' Graham said.  

In an announcement today, the Google-owned site said content that promotes misinformation about Covid-19 vaccines will be removed (file photo)

As of Wednesday, popular anti-vaccine accounts, including those run by Mercola, were kicked off YouTube.

A press email for Mercola's website said in a statement: 'We are united across the world, we will not live in fear, we will stand together and restore our freedoms.'

Kennedy did not immediately respond to requests for comment. 

Videos suggesting that substances in vaccines can track those who receive them will also be removed.

'Our policies not only cover specific routine immunisations like for measles or Hepatitis B, but also apply to general statements about vaccines,' the firm says.

YouTube added that there would be 'important exceptions' to the new guidelines, including content about 'vaccine policies, new vaccine trials and historical vaccine successes or failures'.

Personal testimonies relating to vaccines, which the company said were important parts of public discussion around the scientific process, are also allowed.

'Today's policy update is an important step to address vaccine and health misinformation on our platform,' the company said.

It added: 'We'll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community.'

Content flagged as violating the YouTube policy will be removed, and the uploader will receive an email explaining why.

Conspiracy theories and misinformation about the coronavirus vaccines proliferated on social media during the pandemic

A first violation will result in a warning, with YouTube unlikely to take any further action beyond removing the video from public view.

Users who continue to upload banned content will have their channels terminated, usually after three strikes within 90 days.

Such misinformation spread through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms, including TikTok and Twitter.

Research by the Center for Countering Digital Hate (CCDH) earlier this year found that 65 per cent of anti-vaccine misinformation on social media could be attributed to just a dozen people, including Kennedy and Mercola.

Some posts perpetuated conspiracy theories about Microsoft co-founder Bill Gates, or made unfounded claims that vaccines harm specific groups or cause autism.
