While Facebook has had mixed results to date in enforcing its guidelines on the spread of misinformation, the fact that the social media platform is taking a strong stance is promising. Having dealt with the Presidential elections and COVID-19 last year, it is now turning its attention to vaccine misinformation, issuing revised guidelines with the help of the World Health Organization (WHO).
“Today, following consultations with leading health organizations, including the World Health Organization (WHO), we are expanding the list of false claims we will remove to include additional debunked claims about the coronavirus and vaccines,” explains an updated blog post from Facebook.
“This includes claims such as:
- COVID-19 is man-made or manufactured
- Vaccines are not effective at preventing the disease they are meant to protect against
- It’s safer to get the disease than to get the vaccine
- Vaccines are toxic, dangerous or cause autism,” it adds.
The company also notes that users, accounts or pages that repeatedly spread vaccine misinformation will be removed from Facebook and its associated platforms.
“We will begin enforcing this policy immediately, with a particular focus on Pages, groups and accounts that violate these rules, and we’ll continue to expand our enforcement over the coming weeks. Groups, Pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether,” it highlights.
Along with limiting or removing posts that spread vaccine misinformation, Facebook will also help share more scientific information from trusted healthcare organisations. To that end, it says it will donate $120 million in ad credits to public health agencies, NGOs and the United Nations to assist with messaging for COVID-19 relief and education efforts.
Whether these steps are indeed enough to curb the spread of vaccine misinformation remains to be seen, but at the very least, Facebook is doing its part.