Censored by Facebook or Twitter? This is Where You Can Report it

November 19, 2015, 12:14 PM UTC
Photo by Chris Ratcliffe/Bloomberg via Getty Images

Social platforms like Twitter, Facebook, Instagram and YouTube make it easy to post and distribute your photos, movies and status updates — but since they are all controlled by corporations, your content can just as swiftly disappear. In some cases, you may never know why it was removed, or how to get it back.

To try to combat that problem, or at least make it more visible, the Electronic Frontier Foundation (EFF) has launched a new site called Onlinecensorship.org, in partnership with a company called Visualizing Impact, which is funded by a grant from the Knight Foundation.

Jillian York, the EFF’s director for international freedom of expression, co-founded the site with Visualizing Impact CEO Ramzi Jaber, whose company does social design and data visualization. York said in an interview before the site’s launch that the site is an extension of her work supporting free speech around the world, and has been in the works since the 2011 Arab Spring rebellion in Egypt.

The EFF director said she met Jaber at a conference in Palestine, and they talked about how their mutual friends had been hit with a number of content takedowns involving Facebook and YouTube, and how it was difficult to track those kinds of events.

In one case, a “Freedom for Palestine” project put together by music and human-rights collective OneWorld disappeared from the band Coldplay’s Facebook (FB) page, even though it had received nearly 7,000 largely supportive comments from users. It later became clear that Facebook had taken down the post because several users reported it as “abusive”.

York and Jaber’s project got off the ground after the two won the Knight News Challenge last year. Although it began as a way to focus on Facebook and YouTube takedowns, York said that it has expanded to include Twitter (TWTR), as well as other platforms such as Instagram, Flickr and Google+ (GOOG). The team is also thinking about including Vine, which has become a popular place for sharing short video clips.

“When we started we were focused on Facebook and YouTube because that’s where the bulk of takedowns were happening, and Twitter was still the free-speech wing of the free-speech party,” York said. But Twitter’s approach to taking down content has changed, and so it is being included in the site as well.

“It kind of came to the fore with the James Foley video [of a U.S. journalist being beheaded by ISIS],” said York. “If you were a verified user on Twitter then they allowed your content to stay up, but if you were a regular user they didn’t. So in a way they were effectively choosing who was going to be seen as the media.” That kind of choice makes it important to track when content is removed and when it isn’t, she said.

The EFF director said that Onlinecensorship.org is intended to be complementary to another site formerly known as Chilling Effects, but now called Lumen. That site tracks takedowns as a result of copyright infringement allegations, but Onlinecensorship is more concerned with content that gets removed for other reasons.

In some cases, the posts and videos that get removed may be seen as offensive or disturbing, or may get reported by other users as violating some law—reports that are often used by various groups as a way of silencing commentary they don’t like. In other cases, they may be content that the platform sees as harassment or hate speech.

“It’s a lot more complicated with hate speech and harassment,” York admitted. “We’re not taking a position on the content necessarily, we’re just observing and trying to force companies to be more transparent about what they do and why. They don’t want to be transparent, which is why we thought we need to go to the users.”

In some cases, content gets taken down for non-malicious reasons, York said. “Sometimes it’s done with good intentions. A lot of problems come down to companies relying on users to report each other, and relying on low-level, poorly paid workers to make those decisions quickly. And some of it is just laziness.”

The site will also have information about how users of the various social networks that it covers can go about protesting or challenging content takedowns, York said. That information also isn’t made easily available to users by many platforms.

You can follow Mathew Ingram on Twitter at @mathewi, and read all of his posts here or via his RSS feed. And please subscribe to Data Sheet, Fortune’s daily newsletter on the business of technology.
