The recent attacks in Paris, and the outpouring of sympathy over them, have sparked an ongoing debate about why there hasn’t been as much attention paid to similar events in places like Beirut. In particular, people seem concerned that “the media” hasn’t done as much reporting on the latter as it has on Paris, for a variety of reasons having to do with racism, cultural bias, and so on.

Is this really true though? The answer is: Yes and no. And the question of whose fault it is—ours as news consumers, the media as information gatekeepers, or algorithm-driven news platforms like Facebook—is even more complicated than that.

Let’s take the simplest and most obvious aspects of these questions first. The simplistic complaint that followed the Paris attacks, that “no one is writing about Beirut,” is clearly untrue. As several media outlets and journalists have pointed out, both incidents were reported in a number of mainstream publications, including the New York Times, and in many cases those stories ran on the front page.

Whether there was the same volume of stories about Beirut as there was about Paris—and whether they got as much promotion or prominence—is a separate question. Some media watchers say there wasn’t as much coverage, or it wasn’t as comprehensive.

It’s understandable why that might be the case: not just because Paris is a Western city and is therefore seen (even subconsciously) as more worthy of coverage in the West, but also because Beirut is seen, rightly or wrongly, as a place where violent attacks are not uncommon. The attacks in Paris, by contrast, represented the worst violence that city had seen since the Second World War ended in 1945.

In any case, the complaint that there wasn’t enough coverage of Beirut goes a lot deeper than just that. What many of the people who made this complaint seem to be saying isn’t “Why was there no news coverage of this,” but rather “Why didn’t I see any news stories about this?”

There are a number of possible answers to that question. One is that those people either didn’t seek out the publications that reported on those incidents, or didn’t notice the stories they published, for whatever reason. And that’s where Facebook and our disjointed, atomized experience of the news come in.

In the old days before the web, if someone didn’t see a news story like the Beirut bombings in their local paper or hear about it on a TV news report, it would have been fair to ask why no one was reporting it. But now the media universe that any given person is exposed to is orders of magnitude larger, with hundreds more sources and a lot more noise, and those factors make it even more difficult to follow every story.

To help filter the noise, many people turn to platforms like Twitter, Facebook, and Snapchat to show them what is important. A young millennial once said, “If the news is important, it will find me.” But what if it is important and it still doesn’t find you? Whose fault is that?

In the current news environment, Facebook has to bear at least some of that blame. It is a massive platform for news, to the extent that a majority of younger users say they get their news from the social network. And the fact that its algorithm chooses what to show us and what not to show us means that it is exercising a very obvious editorial function, much like newspaper editors used to.

When this debate came up in the aftermath of the shooting of Michael Brown in Ferguson, Mo., last year, there was a lot of criticism of Facebook for showing people “ice-bucket challenge” videos instead of news about Ferguson. Sociologist Zeynep Tufekci wrote about the risks we take when we turn our news consumption over to a single platform that uses a poorly understood algorithm to determine what we see.

Facebook, meanwhile, maintains that your feed is determined by your own behavior, since the algorithm learns from what you share, what you like, and so on. By that logic, if you don’t see Beirut in your news feed, then it’s your fault for choosing the wrong things; Facebook is just showing you what you said you wanted.
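To make that feedback loop concrete, here is a deliberately simplified sketch of engagement-based ranking. This is not Facebook’s actual system, which is proprietary and vastly more complex; the topics, engagement counts, and function names below are all hypothetical. It only illustrates the basic dynamic: a ranker that scores stories by your past likes and shares will keep surfacing more of the same.

```python
# A toy illustration of engagement-based feed ranking (hypothetical;
# not Facebook's real algorithm). It shows how optimizing for past
# engagement can bury topics a user has never engaged with.

from dataclasses import dataclass


@dataclass
class Story:
    headline: str
    topic: str


def rank_feed(stories, engagement_by_topic):
    """Order stories by how often the user engaged with each topic before.

    engagement_by_topic: hypothetical per-user counts of past likes/shares.
    Topics the user never engaged with score zero and sink to the bottom,
    regardless of their news value.
    """
    return sorted(
        stories,
        key=lambda s: engagement_by_topic.get(s.topic, 0),
        reverse=True,
    )


# A user whose history skews heavily toward light content:
history = {"viral-videos": 42, "sports": 17, "world-news": 1}

feed = rank_feed(
    [
        Story("Ice-bucket challenge compilation", "viral-videos"),
        Story("Local team wins championship", "sports"),
        Story("Deadly bombings strike Beirut", "world-news"),
    ],
    history,
)

for story in feed:
    print(story.headline)
# The Beirut story prints last: the ranker faithfully reflects the
# user's history, so important news it was never "trained" on is buried.
```

Run the sketch and the Beirut story lands at the bottom of the feed, even though no one decided to suppress it. That is the crux of the problem: a ranker that only reflects past behavior can make important stories invisible by default.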

But there’s more to it than just that. According to surveys, large numbers of Facebook users don’t even know that their feed is being filtered at all. Clearly, some social-media literacy is required before we can even begin solving this problem.

And Facebook bears some responsibility for that as well. When you are such a huge source of news, and when you are trying to become an even bigger one by partnering with news companies through projects like Instant Articles, it’s incumbent on you to handle that responsibility as ethically as possible.

That means showing people news that is socially important or valuable, even if it might not get clicks. And it means being a lot more transparent about why you remove certain kinds of content, especially when governments or police forces ask you to do so, not just in other countries but in the U.S. as well. If Facebook wants to be a platform for news distribution, then it needs to start acting like one.

