Trolls have long enjoyed the upper hand on the Internet: Their hate and stupidity can turn social media into a toxic cesspool, and drive people away from online discussions altogether. So kudos to the New York Times for striking back.
On Tuesday, the Grey Lady launched a new system called Moderator, which will rely on machine learning developed by Jigsaw, a subsidiary of Google parent company Alphabet (googl), to dramatically expand its online comment forums—while also containing the trolls.
The rollout of the technology, which Jigsaw calls Perspective, means Times readers will be able to offer comments on all the top stories during East Coast business hours, and on all opinion pieces from Monday to Friday.
Until now, readers could comment on only about 10% of Times articles because its 14 human moderators had to parse around 12,000 comments every day. That figure will jump to 25% now that the moderators can use artificial intelligence tools to help screen the submissions. The Times (nyt) says the next goal is for 80% of all stories to come with comments.
This is a big deal because, while comments sections offer rich debate and useful information (more useful, sometimes, than the original article!), many online publishers have given up on them altogether. That's because it's just too much time and trouble to pay moderators to keep the trolls and trouble-makers out of the discussion.
"It's become too easy for trolls to dominate conversations online. People are either leaving the conversation entirely or comments sections are being shut down. The power of machine learning offers us an opportunity to tip the scales and reverse this trend," Jigsaw's CEO Jared Cohen said in a statement. "This is why we built Perspective, technology that puts the power of machine learning into the hands of publishers and platforms to host better discussions online."
Jigsaw, which operates as an ideas and policy shop loosely tied to Google, has been working on using artificial intelligence to combat trolls for years. As Fortune reported in February, the tool works by training software to recognize comments that are toxic or irrelevant, and then filtering them out in order to show smarter stuff.
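The score-and-filter approach Jigsaw describes can be sketched in a few lines. This is purely illustrative: the `toxicity_score` function below is a hypothetical keyword-based stand-in, whereas Perspective's actual system relies on a trained machine-learning model, not a word list.

```python
# Illustrative sketch of score-and-filter comment moderation.
# toxicity_score is a hypothetical stand-in for a trained ML classifier;
# it is NOT how Perspective actually works.

def toxicity_score(comment: str) -> float:
    """Return a toxicity score in [0, 1] for a comment."""
    flagged = {"idiot", "stupid", "trash"}  # toy word list for illustration
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in flagged)
    return min(1.0, hits / len(words) * 5)

def filter_comments(comments, threshold=0.5):
    """Approve low-scoring comments; queue high-scoring ones for human review."""
    approved, for_review = [], []
    for c in comments:
        (for_review if toxicity_score(c) >= threshold else approved).append(c)
    return approved, for_review

approved, for_review = filter_comments([
    "Great reporting on this issue.",
    "You idiot, this is trash!",
])
```

In a real deployment the threshold would be tuned so that borderline comments go to the human moderators rather than being rejected outright, which is roughly how the Times pairs the tool with its existing moderation desk.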
News organizations, including the Times and the Economist, have been experimenting with Jigsaw's tools for months, but the Times is the first to deploy them on a broad scale.
During the trials, the tool sometimes proved imperfect—missing some modern slurs or mistaking crude idioms like "life's a bitch" for toxic speech—but Jigsaw researchers say perfection is not the point, and that the tool is built to improve with time. And, as the expansion of the Times comments section shows, the Jigsaw tools are helping to take back parts of the Internet.