A large trove of leaked internal Facebook documents pulls back the curtain on how the social network’s employees moderate postings on the site.
When it comes to violent images and videos, moderators can take no action, mark content as ‘disturbing,’ or remove it entirely. Broadly, Facebook’s approach is to allow violent imagery because of its importance for public awareness. That includes video of violent human deaths, which is flagged as ‘disturbing’ but not, as a rule, removed. The exception is any imagery of violence shared with “sadism and celebration,” which, according to the guidelines, should be removed.
The documents, obtained by The Guardian, run to thousands of pages and include worked examples. Their publication follows a series of recent incidents, including live-streamed murders and fake news, that have sparked debate about exactly what Facebook should allow on its platform.
The documents show a delicate balancing act behind Facebook’s efforts to manage a deluge of postings from around the globe. The policies include the company’s approach to videos and images of graphic violence, cruelty to animals, and non-sexual child abuse. They also outline how Facebook’s moderation team is expected to screen threats of violence.
The portion of Facebook’s guidelines most likely to provoke public backlash is the one dealing with threats of violence. The training materials state that “people commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways,” and that moderators’ primary aim should be to “disrupt potential real world harm.”
The materials include examples of extremely disturbing speech that Facebook says it would not remove under that rubric, such as “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” and “Little girl needs to keep to herself before daddy breaks her face.”
We have contacted Facebook for comment about the leaked material and will update this article if we receive a response.