For many of us, Facebook has intertwined itself into our lives to such an extent that it often seems to fade into the background—it’s always there, like the atmosphere, and so we stop thinking about it. But now and then, the giant social network does something that forces us to consider its outsized influence on the way we see the world, and the larger implications of that influence.
Sometimes it’s when Facebook removes a breastfeeding photo, or censors a page about the war in Syria. Other times, it’s something that seems a little more trivial—such as a recent report by tech news site Gizmodo that human editors in charge of the “Trending Topics” section of the site routinely prevented conservative topics and news items from making it into the rankings.
As Peter Kafka at tech news site Recode notes, this isn’t quite the same as Facebook itself deciding that conservative topics and websites weren’t worthy of being included—individual editors apparently made these decisions themselves. But that hardly removes the problem, which is that Facebook is making editorial choices about what news to include and which sites get preferential treatment.
On Monday afternoon, the company released a statement about the Gizmodo article, saying it takes allegations of bias “very seriously,” and that there are guidelines in place to “ensure consistency and neutrality.” Those guidelines forbid the suppression of specific political perspectives, Facebook says, and don’t permit “the prioritization of one viewpoint over another.”
The fact that Facebook made such a statement is noteworthy since it typically tries to ignore these kinds of issues. But the big problem isn’t that a couple of human editors fiddled with the Trending Topics. It’s that human beings are making editorial decisions all the time via the social network’s news-feed algorithm, and the impact of those decisions can be hugely far-reaching—and yet the process through which those decisions are made is completely opaque.
Some point out that human editors at traditional media outlets do this kind of thing all the time: they make choices about which stories to include and which sources to link to. The New York Times motto, “All the news that’s fit to print,” implies that someone is deciding what “fit” means. And newspaper editors are arguably just as prone to bias as the editors working at Facebook, if not more so, with motives that are often just as opaque.
There are a couple of significant differences between the editors at traditional media outlets and the ones who run the news-feed algorithm at Facebook, however. One is scale: with a unique daily audience of more than a billion people, Facebook is orders of magnitude larger than any other media outlet in the world, and its influence as a news source far exceeds that of any single publication (yes, even the New York Times). More than 60% of millennials say they get their political news from Facebook.
The second important point is that we’ve grown accustomed to the idea that media outlets such as the New York Times and Washington Post and CNN have at least some sort of notional commitment to journalistic principles, including fairness, balance, accuracy, and the greater good of society. What commitment does the Facebook algorithm have to any of these principles?
The last time someone asked the social network that kind of question, the guy in charge of its media partnerships went to great lengths not to answer. Facebook routinely says that it doesn’t see itself as a media entity, and doesn’t see its algorithmic choices as being of any concern to anyone outside the company—even when those choices help influence the way people think and behave, like whether they decide to vote and how they see political issues.
It’s not surprising that Facebook wants to avoid seeing itself as a media outlet. After all, it is a private corporation controlled by its shareholders—primarily Mark Zuckerberg—and what it does with its algorithms is its own business. Not only would admitting that it is a powerful media entity open the door to all kinds of awkward questions, but the media companies it is trying to partner with through services like Instant Articles might also come to see it as a competitor (which, of course, it is in many ways).
At some point, however, Facebook is going to have to grapple with these kinds of issues, or at least acknowledge that they exist and that people have a right to be concerned about them.