In the wake of the London Bridge terrorist attack on June 3, Prime Minister Theresa May lashed out at “the big companies that provide Internet-based services” for giving extremist ideology “the safe space it needs to breed.”
Internet giants like Google, Facebook, and Twitter rushed to defend their policies against May’s criticism.
“We want Facebook to be a hostile environment for terrorists,” Simon Milner, director of policy at Facebook, said earlier this month. “Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it—and if we become aware of an emergency involving imminent harm to someone’s safety, we notify law enforcement.”
At the Fortune Most Powerful Women International Summit in London on Tuesday, Nicola Mendelsohn, who serves as Facebook’s vice president of Europe, the Middle East and Africa, repeated that defense—nearly word for word—when she was pressed on her response to May’s critique.
Mendelsohn said the company takes the issue of extremism “hugely seriously.”
“[W]e want to be a hostile environment for terrorists,” she said, noting that the social network uses a combination of people and technology tools to flag and remove terrorist content. She also mentioned that Facebook plans to hire another 3,000 employees to bolster this approach.
In addition to May’s finger-pointing, Facebook has faced criticism for violent videos that have lived on its site. In April, a man in Thailand allegedly streamed a live video of his child’s murder, and a man in Cleveland uploaded a video of himself killing someone. Both videos remained accessible on Facebook for hours.
In a 5,500-word manifesto on building a global community posted in February, CEO Mark Zuckerberg said that the sheer volume of content posted to Facebook makes it difficult to police. “We review over one hundred million pieces of content every month, and even if our reviewers get 99% of the calls right, that’s still millions of errors over time,” he wrote.
In London on Tuesday, Fortune senior writer Michal Lev-Ram quizzed Mendelsohn about the incredible scope of the task, asking if the biggest challenge was technological—building the tools to reliably flag offensive photos and videos—or philosophical as Facebook determines its role in monitoring and reporting suspicious content.
Mendelsohn sidestepped an either-or answer, saying that Facebook’s philosophy is clear: extremism has no place in its community, and it carries out that mission with a team of 4,500 people as well as automated tools.
After Theresa May lobbed her criticism at Internet companies, MP John Mann from the rival Labour party took it a bit further, repeating in a tweet his call “for the Internet companies who terrorists have again used to communicate to be held legally liable for content.”
When Lev-Ram asked Mendelsohn whether she agreed with that proposal, the Facebook VP demurred again, reiterating that the company works closely with law enforcement and that it turns content indicating an intent to harm over to the authorities.