The conundrum of content filtering isn't getting any easier.

By David Z. Morris
May 13, 2017

Facebook on Thursday deactivated the group page of Women on Web, a Dutch organization that helps women obtain abortion pills by prescription in regions where abortions are not easily accessible. A message from Facebook, shared by an allied organization, cited “promotion or encouragement of drug use” as the reason for the action.

The shutdown immediately attracted media attention, with the Guardian pointing out that Facebook had previously blocked abortion-related content from the group’s founder. Organizers said they expected Facebook to “undo this action soon enough, as access to information is a human right.”

Facebook appears to have shifted gears quickly. Within a day, the Women on Web page had been reinstated.

The incident highlights the ongoing challenges Facebook faces as it negotiates its role in controlling content shared by users. The company is eager to shield those users from content they might find shocking, offensive, or even traumatizing. But users can also be alienated if Facebook’s filtering efforts are perceived as heavy-handed or politicized. In a similar instance, Facebook took heat last September for blocking, and then allowing, a Pulitzer Prize-winning photo from the Vietnam War that featured child nudity—arguably, a no-win situation.

Facebook CEO Mark Zuckerberg argued as recently as last year that Facebook was not a media company, downplaying its responsibility to monitor what’s posted to the platform. In August 2016, Facebook seemed to be acting from that position when it fired its Trending News editorial team and replaced them with an algorithm.

But Zuckerberg’s stance came under heavy scrutiny during the U.S. presidential election, with many observers claiming Facebook had become a vector for misleading information. Zuckerberg seemed to reverse himself soon after, announcing a series of initiatives to more carefully control content, particularly news. Facebook has also recently added thousands of screeners to monitor content on the Facebook Live video service.

While much of its content may be generated by users, the world’s largest social network is looking more and more like a carefully curated media outlet. That could also make decisions about politically contentious topics like abortion increasingly fraught.
