If Facebook wants our trust, it needs to explain when and why it censors things

Mark Zuckerberg, CEO of Facebook. Photograph by David Paul Morris — Bloomberg/Getty Images

Almost every major digital platform, including Google and Twitter, publishes a so-called “transparency report” listing the number of requests the company has received from government authorities and legal entities to take down information. Facebook does this too, except that its version isn’t nearly as forthcoming about what exactly it has been asked to remove, and by whom.

For example, according to the Electronic Frontier Foundation, the giant social network doesn’t say anything at all about certain kinds of content takedowns—a lack of transparency that is especially disturbing now that Facebook wants to be a platform for journalism and news distribution.

In a recent post on its blog, the EFF explains that this lack of disclosure is why Facebook failed the group’s recent censorship test, which it applies to all the major platforms. In Facebook’s case, its “government requests” report explained how it “restricted access” to content because of Brazilian court orders, Israeli speech laws, and demands by the governments of Turkey and India, among other things.


All of this is well and good, but if you click through to the U.S. section of the report, there is no category called “content restrictions.” Is that because Facebook didn’t remove or restrict access to any content in the United States? Not at all. In fact, it does so regularly, the EFF says. It just doesn’t include any details of those restrictions in what is supposed to be a tell-all transparency report.

One large missing category, the foundation notes, is content created by prison inmates. The social network has a process by which prison authorities can have content removed simply by filling out a form, and in many cases it has done so without any proof that the content or behavior was either illegal or breached Facebook’s terms of use (Facebook recently changed the process so that such proof is required). And yet, none of this is mentioned in the report:

“We know for a fact that Facebook processed 74 requests for the California Department of Corrections and Rehabilitation alone in 2014. Between California and the state of South Carolina, we also know Facebook processed more than 700 takedown requests over the last four years. We could file public records requests in all 50 states to learn more, but since Facebook’s system allowed prisons to file these requests without creating a paper trail, only Facebook knows how many requests it has complied with nationwide. We believe it may reach into the thousands.”

This isn’t just about prisoners posting on Facebook, as the EFF points out. The fact that Facebook isn’t reporting these takedown requests “raises larger questions about what other kinds of censorship Facebook has been hiding.” Google provides all kinds of information about specific requests from law enforcement agencies, including demands that the search engine remove specific news stories from its search results, or take down YouTube videos that are seen as defamatory.


If Google has been receiving such requests, the EFF says, “we believe it is highly likely that Facebook has received them as well.” And yet, there is no information of any kind in the company’s so-called transparency report. So we are left to wonder: what else is Facebook not telling us about whose takedown requests it is complying with, and when?

Photojournalist Jim MacMillan recently posted images of a traffic accident he saw in Boston to both Facebook and the photo-sharing service Instagram (which is owned by Facebook), only to find later that they had been removed without explanation. Facebook officials said they had no record of any such content being removed, but there is no way to verify that claim. As the EFF notes, the lack of transparency in parts of the company’s report makes it difficult to trust that the social network is being totally forthcoming about its behavior.

If Facebook wants to be a media platform that users trust, then it’s going to have to do a better job of explaining exactly when and why it chooses to censor things, and for whom.