By Tom Huddleston Jr.
March 14, 2018

YouTube CEO Susan Wojcicki says the online video giant is placing a limit on the amount of time its human moderators can spend watching disturbing videos each day.

YouTube has vowed to rely more on human beings to review its videos for inappropriate or offensive material, but the Google-owned video service is also one of the tech companies grappling with the psychological toll that viewing a high volume of disturbing content can take on its human moderators. That’s why YouTube is starting to limit its part-time moderators to no more than four hours per day of watching such videos, Wojcicki announced in an interview with Wired at South by Southwest in Austin on Tuesday.

Earlier this year, an article in The New Yorker examined the growing ranks of moderators employed at tech companies like Google, Facebook, and Twitter, where human employees are increasingly asked to scan online posts, pictures, and videos that have the potential to offend and disturb.

“This is a real issue and I myself have spent a lot of time looking at this content over the past year. It is really hard,” Wojcicki said in the interview, according to The Verge. In addition to placing a cap on the amount of time those moderators can spend watching potentially upsetting material, the company is also providing those contractors with what the YouTube CEO described as “wellness benefits.” (Because the people hired as moderators are typically contractors, they are not provided the same health benefits that go to full-time Google employees.)


In December, Google promised to hire 10,000 moderators to ensure that videos on YouTube, particularly popular videos that are eligible for advertising dollars, are scanned for potentially offensive content by a person rather than just an algorithm. The move followed a rough year for YouTube, which saw advertisers pull away from the service after complaints about ads appearing next to offensive videos, ranging from terrorist and violent extremist content to disturbing videos exploiting children. YouTube also said in December that it had removed over 150,000 videos featuring violent extremism since June 2017, though the company noted that 98% of such videos are still flagged first by its machine-learning algorithms.

YouTube’s moderators have faced criticism in recent months, both for failing to remove extremist videos and conspiracy-theory content and for reportedly pulling several right-wing videos and channels in error last month. In her interview with Wired on Tuesday, Wojcicki also said the company is taking action against conspiracy theorists who use the site to spread false and misleading information. YouTube, she said, will now place links to Wikipedia pages next to conspiracy theory videos in order to debunk any misinformation spread by the videos’ creators.
