Facebook has released its Community Standards Enforcement Report for October through December 2020. The report tracks the tech giant’s progress and commitment to making the Facebook and Instagram platforms safe and inclusive.
The quarterly report shares metrics on Facebook’s performance in preventing and taking action on content that goes against Community Standards while protecting its community’s safety, privacy, dignity and authenticity.
The latest report shows positive strides in reducing the prevalence of violating content, while providing transparency and accountability around content moderation across Facebook’s products. It includes metrics across 12 policies on Facebook and 10 policies on Instagram.
During the fourth quarter of 2020, the report shows that Facebook took action on:
- 6.3 million pieces of bullying and harassment content, up from 3.5 million in Q3, due in part to updates in its technology to detect comments;
- 6.4 million pieces of organized hate content, up from 4 million in Q3;
- 26.9 million pieces of hate speech content, up from 22.1 million in Q3, due in part to technology updates for Arabic, Spanish and Portuguese;
- 2.5 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity.
During the same quarter, on Instagram, the company took action on:
- 5 million pieces of bullying and harassment content, up from 2.6 million in Q3, due in part to updates in its technology to detect comments;
- 308,000 pieces of organized hate content, up from 224,000 in Q3;
- 6.6 million pieces of hate speech content, up from 6.5 million in Q3;
- 3.4 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity.
Kojo Boakye, Facebook’s Director of Public Policy for Africa, said that the company’s goal was to get better and more efficient at enforcing its community standards.
“We do this by increasing our use of Artificial Intelligence (AI), by prioritizing the content that could cause the most immediate, widespread, and real-world harm, and by coordinating and collaborating with outside experts,” he said.
Facebook plans to share additional metrics on Instagram and to add new policy categories on Facebook. Efforts are also under way to have the metrics in these reports externally audited, and to make the data more interactive so people can understand it better.