It's getting harder to believe Facebook when it says it's not a media company. The social network just said that even if an image or a story posted on the site breaches its community standards, it will leave the post up if it is deemed to be "newsworthy."
But how will Facebook determine whether something is newsworthy and therefore deserves not to be deleted by the site's censors? That remains unclear.
The site's responsibilities as a media outlet were highlighted Friday by a report that some staffers wanted to delete posts by the Trump campaign because they believed the posts qualified as hate speech. But CEO Mark Zuckerberg overruled them.
According to the Wall Street Journal, the decision to allow Trump's posts to remain resulted in complaints that the founder and CEO was bending the site’s rules for the Republican candidate. Some employees who work reviewing content on the site reportedly threatened to quit.
When it comes to the newsworthiness exception, Joel Kaplan and Justin Osofsky, vice presidents of global public policy and of global operations and media partnerships, respectively, said in a blog post that the site came to its decision based on user and partner feedback.
Although the post doesn't mention it specifically, much of this feedback likely came as a result of a recent incident in which Facebook deleted posts containing an iconic Vietnam War image of 9-year-old Kim Phuc running down the road naked after her village was bombed.
Not only did Facebook delete the original image after a Norwegian newspaper editor uploaded it as part of a series on war photography, but the site deleted the editor's post about the deletion as well. It then blocked his account, and even deleted a post by Norway's prime minister, who protested Facebook's censorship of the image.
The social network eventually apologized for the deletions, and said that staffers were compelled to remove the image because it depicted a naked child and therefore violated the site's community standards.
Now, Kaplan and Osofsky say that Facebook will leave up certain images and posts. "We’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards," they write. "We will work with our community and partners to explore exactly how to do this."
The problem, as Facebook sees it, is that some images and content that seem innocuous in one country or culture may be seen as offensive or even illegal in another.
Twitter has also struggled with this problem, and its solution, introduced in 2014, is what it calls "country withheld content." In the case of tweets containing pro-Nazi references, for example, Twitter blocks anyone within Germany from seeing them because of local laws against that kind of content.
This may be the kind of thing that Kaplan and Osofsky are talking about when they say that Facebook is looking at "new tools and approaches to enforcement."
The larger issue, however, is that Facebook is increasingly having to make these kinds of calls about what content is permitted and what isn't, and in fact it has been making those editorial decisions for years—all while denying that it is a media company.
For example, it routinely removes breast-feeding photos or even articles and images having to do with breast cancer, as it apparently did this week with a breast-cancer video advertisement.
Activists and political dissidents are also familiar with having their posts and even their accounts disappear from Facebook without warning. Investigative journalist Eliot Higgins has talked about how Facebook's deletion of pages about violence in Syria has prevented journalists like him from collecting important information about the war there.
In addition to those kinds of decisions, Facebook has also been criticized for hosting so many hoaxes and fake news stories, many of which are produced by a shadowy group of political sites.
Human editors used to remove such fakes from the Trending Topics section of the site, but Facebook got rid of its editors following a controversy over alleged bias in their decision making. Trending Topics is now run by algorithm, just like the main Facebook news feed.
Many believe Facebook can no longer argue that because an algorithm makes all of its decisions about what news to include or exclude, it bears no responsibility for those choices and no duty to explain its news judgment. It is a media outlet with 1.5 billion users, and it needs to start acting like one.