Facebook took down 1.3 billion fake accounts between October and December 2020, the company announced in a blog post. It said the move was just one of the ways it tackled misinformation.
“Tackling misinformation actually requires addressing several challenges including fake accounts, deceptive behavior, and misleading and harmful content,” read the blog post.
Facebook says it removed more than 12 million pieces of content that had been flagged by global health bodies as Covid-19 misinformation. The company said it was working with independent fact-checkers to review content.
“Misinformation can also be posted by people, even in good faith. To address this challenge, we’ve built a global network of more than 80 independent fact-checkers, who review content in more than 60 languages. When they rate something as false, we reduce its distribution so fewer people see it and add a warning label with more information for anyone who sees it.”
“We know that when a warning screen is placed on a post, 95% of the time people don’t click to view it. We also notify the person who posted it and we reduce the distribution of Pages, Groups, and domains that repeatedly share misinformation. For the most serious kinds of misinformation, such as false claims about COVID-19 and vaccines and content that is intended to suppress voting, we will remove the content,” the blog post read.
There was an influx of Covid-19 misinformation on social media at the peak of the pandemic, with Facebook and Twitter among the platforms recording the highest volumes. Facebook said it had about 35,000 people working on tackling these challenges.

“Over the past several years, we have invested in protecting our community and we now have over 35,000 people working on these challenges,” the post added.