Facebook’s Rule Enforcement Report Lists 2.1 Million Posts as Bullying, 8.7 Million as Child Nudity in the Past Six Months
In its second Community Standards Enforcement Report, Facebook added a new category of figures tallying how many posts on the site were identified as bullying or child nudity. The short answer: a lot.
In the six months between April and September 2018, 2.1 million posts met the criteria for bullying and harassment, while 8.7 million pieces of content violated the social media site’s rules against child nudity and the sexual exploitation of children.
Facebook said that it discovered 99% of the child nudity and child exploitation posts before anyone reported them. The company said that it removes nonsexualized content as well, such as photos of children in the bath, to avoid the potential for abuse.
Since its last report in May, hate speech on the site has more than doubled, though the company said it has made progress on more proactively identifying hate speech.
“Overall, we know we have a lot more work to do when it comes to preventing abuse on Facebook,” the report reads. “Machine learning and artificial intelligence will continue to help us detect and remove bad content. Measuring our progress is also crucial because it keeps our teams focused on the challenge and accountable to our work.”
The report was released Thursday, a day after a scathing New York Times story revealed damning details about the company’s knowledge of Russian meddling in the 2016 elections and reported that Facebook had hired an opposition research firm to repair relationships, place negative articles about Apple, and attack billionaire George Soros. CEO Mark Zuckerberg claims he didn’t know about the relationship.
In response to criticism of its practices, the company announced a move to be more transparent, releasing a comprehensive list of its rules, called “community standards,” and issuing enforcement reports.