
YouTube Is Hiring Hundreds of People to Help It Fix Its Child-Exploitation Problem

December 5, 2017, 5:48 PM UTC

YouTube plans to hire more people to scour its website for violent videos after reports about disturbing footage on its app for children surfaced last month, prompting a slew of advertisers to abandon the site.

The company on Monday announced its new initiatives, which include beefing up its surveillance teams to have more than 10,000 people in 2018 and moving faster to shut down inappropriate content.

“We are taking these actions because it’s the right thing to do,” YouTube CEO Susan Wojcicki wrote in a blog post. It’s unclear how many people are already part of YouTube’s review team.

In November, the New York Times reported on a number of offensive and sometimes brutal videos found on the family-friendly YouTube Kids app. In one video, some of the characters of the popular animated children’s series PAW Patrol are violently killed off. Another video depicts a beloved Disney character, Frozen’s Elsa, being urinated on while she’s in the bathtub.

Several advertisers, including Mars Inc., Adidas and Diageo, said they would pull their campaigns off YouTube in the aftermath, fearing the videos would attract pedophiles, according to the Wall Street Journal.

YouTube addressed the issue this week, saying it has reviewed nearly 2 million videos for violent extremist content and removed more than 150,000 of those videos since June — largely with the help of its machine-learning technology that can flag problematic videos.

“Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours and we continue to accelerate that speed,” Wojcicki said.

More than 1 billion people use YouTube, according to its website. Wojcicki said human reviewers are still necessary in the company’s attempt to fix its child exploitation problem.

“Human judgment is critical to making contextualized decisions on content,” she said. “Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”