YouTube is taking more steps to remove extremist videos.
Google’s YouTube has announced additional steps to curb the number of videos from terrorist-affiliated groups on its service.
On Tuesday, the Alphabet-owned video portal said it would use machine learning technology to step up its efforts to detect and remove terrorist videos. It will also impose “tougher standards” for determining when other videos are too controversial, even if they do not violate YouTube’s specific community guidelines. Last month, YouTube said the service had already begun redirecting searches for violent and extremist content, sending users to anti-terrorism videos instead.
These moves come after British Prime Minister Theresa May described online communities and social media platforms as “safe spaces” for terrorists and other extremist groups while calling on tech companies to do more to combat extremism online. Social media giants like Facebook and Twitter have also rolled out their own initiatives aimed at cutting down on extremism on their respective services.
Of course, YouTube has also suffered a direct financial hit from the proliferation of terrorism-affiliated videos online. Earlier this year, it saw an exodus of advertisers after companies boycotted YouTube because their ads appeared next to violent and offensive content. Over the past several months, the company has been working to recover from those boycotts, publicly apologizing to advertisers multiple times while promising to work harder to root out extremist content.
Now, YouTube says its improved machine learning technology is making those efforts that much easier and more effective. “Our machine learning systems are faster and more effective than ever before,” YouTube wrote in a blog post on Tuesday. “Over 75 percent of the videos we’ve removed for violent extremism over the past month were taken down before receiving a single human flag.”
The company was quick to point out that the sheer scale of its service—more than 400 hours of content are uploaded every minute, the company said—makes finding and removing all instances of violent extremist content extremely difficult. But the company said it has used the improved machine learning technology to remove double the number of videos flagged for violent extremism over the past month, while also doubling the rate at which those videos were removed.