Questions are mounting about Facebook's role in this year's election cycle, especially given the proliferation of sensationalistic and even outright fake news stories. Now CEO Mark Zuckerberg has responded.
“I think the idea that fake news on Facebook—of which it’s a very small amount of the content—influenced the election in any way is a pretty crazy idea,” he said on Thursday at the Techonomy conference in Half Moon Bay, Calif., just two days after Donald Trump was elected president, according to media reports.
“There have been hoaxes on the Internet, there were hoaxes before,” he said. “We do our best to make it so that people can report that, and as I said before, we can show people the most meaningful content we can.”
Trump's victory in the U.S. presidential election stunned many, including polling analysts and the media. Since then, many questions about the role of social media—and mass media at large—have been swirling.
Facebook (fb), in particular, has grown into a powerful tool for sharing information and news articles—or what look like news articles. In a recent report, BuzzFeed detailed how a slew of teens in Macedonia have been running entire operations pushing fake but highly dramatic news onto the social network, which uses algorithms to decide what each user sees in their News Feed.
Thus, the argument is that services like Facebook enable the viral spread of these faux news articles, and some are asking whether that spread had a significant effect on the election's outcome. Read Fortune's Mathew Ingram on why Facebook is partly to blame.
“Part of what I think is going on here is people are trying to understand the results of the election,” Zuckerberg said. “I think there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news.”
Of course, Facebook is sticking to its guns. The social network giant has vehemently denied that it's a media company, insisting that it's merely an open technology platform enabling people to connect with each other and share information.
Facebook also argues that its users are exposed to a variety of viewpoints through the service, despite criticism that it serves as an echo chamber in which each user's views are reinforced over and over. But even Zuckerberg has acknowledged that people don’t tend to click on articles promoting viewpoints different from their own, so fewer of those articles show up in their News Feeds—which only supports the idea of an online echo chamber.
That's where Zuckerberg is both right and wrong. Fake news articles did play a role, but mostly by reaffirming or confirming what people already wanted to believe—probably not by swaying most users' pre-existing beliefs to the point that they entirely changed their minds.
With that said, Zuckerberg's comments shouldn't be surprising at all. Even if the company has come to believe its News Feed has had some effect, why would it possibly want to acknowledge that in public?
Doing so would only amplify calls for Facebook to start regulating the information on its network, and the company has no interest in getting into that messy game. Even when it hired young employees to flag popular news items and write short descriptions of the stories, it was quickly confronted with accusations of bias against right-wing media, and it dismantled the entire operation, replacing it with algorithms.
Editorial judgment is difficult and carries risks. Just ask any member of the media.