Monika Bickert has one of the toughest jobs at Facebook: overseeing content moderation on the site, which involves crafting and enforcing Facebook’s user policies.
Deciding what counts as hate speech on the platform isn’t easy, even for Bickert, a former federal criminal prosecutor who now serves as Facebook’s head of global policy management. And it’s a job that, when it fails, can have disastrous consequences given the platform’s global reach.
Despite its extreme challenges, Bickert still believes Facebook offers more good than bad to the world and the platform’s billions of users.
“Yes, you have these abusive actors, but we get so much good from this product,” Bickert said at Fortune’s Most Powerful Women Next Gen Summit in Laguna Niguel, Calif., on Wednesday. “It’s true for women. If we look at #MeToo, and Women’s March, and International Women’s Day being the biggest moment on Facebook in 2017. Those are things that social media gives us.”
Social media has helped movements like #MeToo grow, but its role as a vector for abuse and misinformation has grabbed more headlines recently.
Fortune senior writer Michal Lev-Ram asked Bickert: Would some of Facebook’s content problems have been avoided if CEO Mark Zuckerberg had more life experience when he founded the company, and could better understand humanity’s potential for ugliness?
“I can’t speak for Mark, but even when I joined the company seven years ago … even then we had teams focused on exactly these things,” Bickert said. “It’s not that we didn’t recognize that in a very large community of really well-intentioned people, you’re going to have a very small sliver that are engaged in abusive behaviors. We knew that. But it is quite a different thing to understand how to identify that content and deal with it.”
Correction, Dec. 14, 2018: An earlier version of this article misstated Bickert’s title. We regret the error.