Facebook Content Moderators Take Home Minimum Wage, Anxiety, and Trauma, Report Says

February 25, 2019, 11:30 PM UTC

Facebook found itself on the defensive yet again after an explosive report from The Verge that detailed the emotional trauma that the social network’s content moderators face as they sift through an endless stream of disturbing posts.

Facebook pays its content moderators, many hired through third-party contractors such as Cognizant, $15 an hour, or about $28,000 a year. While the rate matches the minimum wage in many states, the job exposes moderators to content that causes anxiety and trauma symptoms that endure long after they leave their jobs, The Verge’s Casey Newton reported, based on a dozen interviews with former moderators.

“Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions,” Newton wrote. “It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break.”

Beyond the trauma of watching disturbing content, the job also leaves some moderators radicalized by extreme views. “Employees have begun to embrace the fringe viewpoints of the videos and memes that they are supposed to moderate,” Newton wrote. “One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust.”

Within hours, Facebook responded with an anodyne blog post. “We know there are a lot of questions, misunderstandings and accusations around Facebook’s content review practices—including how we as a company care for and compensate the people behind this important work,” wrote Justin Osofsky, Facebook’s vice president of global operations. “We are committed to working with our partners to demand a high level of support for their employees; that’s our responsibility and we take it seriously.”

For many on social media, that defense rang hollow, especially as it came in the wake of a number of Facebook scandals, including evidence of lax data-sharing policies that enabled meddling in the 2016 U.S. presidential election; an initiative that “duped children” into paying money to play games on its platform; and paying $20 to people, including teenagers, to install an app “that lets the company suck in all of a user’s phone and web activity.”