Facebook routinely denies that it is a media entity, or that it should be expected to act like one, and yet the giant social network continues to make decisions that have a significant and tangible impact on the news its users see about the world, and on the practice of journalism in general.
In the latest example of this behavior, Facebook recently suspended the accounts of several Palestinian journalists without providing any warning or any explanation.
According to Al Jazeera, four editors from the Shehab News Agency — which has more than 6.3 million likes on Facebook — and three executives from the equally popular Quds News Network reported that they were suddenly unable to access their personal accounts.
After the journalists complained to Facebook, their accounts were reinstated, and the company apologized for what it said was a mistake. "Our team processes millions of reports each week, and we sometimes get things wrong," a spokesman said, suggesting that the accounts were flagged by users.
A number of Palestinian groups, however, believe that the suspension of the journalists' accounts was more than just a mistake. They argue that it is more likely a knock-on effect of a recent agreement that Facebook struck with the Israeli government.
"The concern is that Facebook is adopting Israeli policy and terminology when it comes to defining what incitement is," Nadim Nashif, co-founder of the Arab Centre for the Advancement of Social Media, told Al Jazeera.
Earlier this month, Israeli authorities said they were working with the company to cut down on the use of the social network to incite violence. Israel's justice minister said the government had submitted 158 requests to have content removed since June, and the company had granted 95% of the requests.
Facebook is also the target of a $1-billion lawsuit launched this summer by an Israeli legal advocacy group, which claims the company is violating the U.S. Anti-Terrorism Act by providing services that assist in "recruiting, radicalising and instructing terrorists."
Whatever the reason, the issue of censorship on Facebook — and especially censorship of journalists and news outlets — is a serious concern, but one which the company has so far refused to confront head on.
Just a few weeks ago, the social network repeatedly deleted posts and suspended the account of a Norwegian newspaper editor after he published a Pulitzer Prize-winning photo of a young Vietnamese girl running away from her bombed village in 1972, as part of a series on compelling war photography.
After both the newspaper editor and the prime minister of Norway complained, Facebook again admitted its mistake and reinstated the content. But the issue still highlights how pervasive the company's control over news and journalism has become.
Removing posts and suspending the accounts of journalists sets off obvious warning bells, but the social network also practices a more subtle form of censorship every day: its news-feed algorithm highlights certain kinds of content and buries others.
Facebook says it isn't a news entity, perhaps in part because it doesn't want to shoulder the responsibilities of behaving like one, but the reality is that such decisions have a huge impact on the way that its 1.5 billion users see the world.
Algorithmic filtering aside, the suspension of Palestinian journalists' accounts and pages without warning raises the question of whether Facebook is filtering out certain types of content as part of an agreement with a foreign government, without making that clear to users.
Facebook isn't the only social platform that is struggling with these issues. Twitter has been asked by the Turkish government to block the accounts of a number of Turkish journalists, and it may ultimately be forced by court order to do so.
Social networks like Facebook and Twitter have become such a crucial part of how we communicate now that these kinds of issues are no longer just commercial decisions, but choices that will alter how hundreds of millions of people understand the world around them.
That imposes a significant responsibility on those companies, whether they choose to admit it or not.