Facebook Clarifies Content Policy After Diamond Reynolds' Live Shooting Video

In the wake of a world-shaking video, Facebook says "context" is crucial to its filtering policy.

By David Z. Morris
July 9, 2016

The furor sparked this week by a police shooting streamed on Facebook Live has highlighted the platform's massive potential to affect public life. In a statement released Friday, Facebook clarified its content filtering policy for Live, particularly as it relates to violent imagery and events.

When it comes to such content, the company says, “context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.”

Facebook's filtering system relies in part on users reporting inappropriate content, which is then reviewed by the company's trained screeners, who it says are "on-call 24 hours a day, seven days a week."


As TechCrunch explored in more detail in a conversation with Facebook, this policy means graphic content can remain on Facebook Live, though it appears with a warning, does not play automatically in users' feeds, and is generally not accessible to users under 18.

Facebook's content-screening policy has major implications for the platform's role as a news source and a site of public discourse. Its openness to violent content under the right circumstances appears consistent with CEO Mark Zuckerberg's expressions of support for the Black Lives Matter movement, which has been fueled in part by the viral spread of disturbing videos of police shootings.

Facebook's clarification comes in response to questions surrounding the temporary removal of the now-notorious video streamed by Diamond Reynolds immediately after the shooting of her boyfriend, Philando Castile, by police in Falcon Heights, Minn. The video was unavailable for about an hour as it went viral, with some sources claiming police had taken it down. Facebook maintains the video's disappearance was due to an unspecified technical problem.


Facebook has, however, acknowledged mistakenly removing controversial or disturbing content in the past, including memes critical of Brock Turner, the former Stanford student convicted of felony sexual assault, and a post opposing homophobia after the Orlando nightclub massacre. In Friday's statement, Facebook said it will "continue to make improvements" to its screening practices and policies.
