Facebook and the News: Trends, Filter Bubbles and Algorithmic Bias

May 12, 2016, 10:15 PM UTC
Photo illustration: social media apps including WhatsApp, LinkedIn, Twitter, Facebook, Instagram, Snapchat, and Periscope displayed on an Apple iPhone 6. Photo by Chris Ratcliffe/Bloomberg via Getty Images

The question of how much influence Facebook has over the news that more than a billion people see daily continued to stir up controversy on Thursday. In the latest developments, The Guardian published leaked internal guidelines for Facebook’s Trending Topics feature, and the social-networking giant quickly published its own version, along with another internal document describing how it decides which news to include and which to leave out.

The controversy started earlier this week with a piece by tech news site Gizmodo that looked at how several journalists who worked on the Trending Topics feature were treated by the social network, and what they were expected to do.

The original story focused on how these editorial contractors believed they were simply training Facebook’s news-filtering algorithms, and didn’t feel that the social network cared about journalism much, except as raw material for its engagement engines. Then a second piece appeared that featured comments from an anonymous editor about how staff routinely kept certain sites and topics out of the Trending feed.

This led to criticism about the appearance of bias against right-wing sources, since many of the sites that were excluded were conservative-leaning. Facebook drew sharp criticism not just from the head of the Republican National Committee, but also from the chairman of the Senate Commerce Committee, who asked CEO Mark Zuckerberg to make his staff available for questions about how editorial decisions were being made at the social network.

In a blog post, Facebook vice president Justin Osofsky said the Gizmodo story was inaccurate, and that the site doesn’t “allow or advise our reviewers to discriminate against sources of any political origin, period.” All the system does, he said, is “help surface the most important popular stories, regardless of where they fall on the ideological spectrum.”

As a number of observers have pointed out (including Fortune), the question isn’t so much whether Facebook filters out certain kinds of news—something that newspapers and other media entities do every day without much scrutiny. The real point is that Facebook is orders of magnitude larger and more influential than any traditional media outlet, and yet the way it chooses the news its billion users see is still fundamentally opaque.

Whenever questions about its status as a media outlet come up, Facebook typically argues that it isn’t a media outlet at all; it’s merely a social service that uses algorithms to show people content they might enjoy or want to see. Some of that content happens to be news, but Facebook’s argument is that it’s all done by algorithm, so there’s no real editorial activity.


What that argument ignores, of course, is that algorithms are programmed by human beings, and in programming them, countless journalistic decisions get made, including how to rank different news sources and what kinds of news to exclude.
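To make the point concrete, here is a minimal, entirely hypothetical sketch of a trending-topics ranker. Nothing here reflects Facebook’s actual code; the source names, weights, and blocklist are invented. The point is that even a purely “algorithmic” ranking carries human editorial judgment baked directly into its parameters.

```python
# Hypothetical sketch: a toy trending-topics ranker. None of these
# names, weights, or rules come from Facebook; they illustrate how
# "algorithmic" ranking still encodes human editorial choices.

# Editorial decision #1: which sources count, and how much.
SOURCE_WEIGHTS = {
    "example-wire-service.com": 1.5,   # boosted by a human choice
    "example-blog.net": 0.5,           # down-weighted by a human choice
}

# Editorial decision #2: which topics are excluded outright.
BLOCKED_TOPICS = {"hoaxes", "satire"}

def trending_score(story):
    """Score a story dict with 'shares', 'source', and 'topics' keys."""
    if BLOCKED_TOPICS & set(story["topics"]):
        return 0.0  # excluded entirely -- a classic editorial call
    weight = SOURCE_WEIGHTS.get(story["source"], 1.0)
    return story["shares"] * weight

stories = [
    {"source": "example-wire-service.com", "shares": 900, "topics": ["politics"]},
    {"source": "example-blog.net", "shares": 1200, "topics": ["politics"]},
]
for story in sorted(stories, key=trending_score, reverse=True):
    print(story["source"], trending_score(story))
```

In this toy example, the blog’s story has more raw shares, but the hand-picked weights push the wire service ahead of it, which is exactly the kind of decision a newspaper editor would otherwise make.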

The reality is that Facebook routinely removes or censors content in the main news feed, whether it’s breast-feeding photos or pictures of the war in Syria, and it decides to down-rank or hide other kinds of content on a daily basis. Those are fundamentally editorial decisions that have an impact on the way that a billion users think about the world.

Although Facebook points out that Trending Topics is completely separate from the main news feed, the recent controversy has highlighted how much human beings are a part of everything Facebook does. And that is raising questions the social network hasn’t had to face before about how it makes the decisions it does.

On Thursday, the British newspaper The Guardian published what it said were leaked internal editorial guidelines for how to handle Trending Topics, including rules for when to remove certain terms and when to include them.

Within hours of the Guardian story appearing, Facebook published a lengthy post on its site describing the purpose behind the Trending Topics section, and its views about how items in that section were curated, first by an algorithm and then by human editors. The move appeared to be an attempt to get out in front of some of the criticism of potential bias, and Facebook also published a 28-page internal document about how Trending Topics functions.


In an interview with The Verge tech news site about Instant Articles (a feature that takes content from news partners and makes it mobile-friendly by customizing it for the Facebook platform), news-feed product manager Will Cathcart talked about Facebook’s approach to curating news. He insisted that all the social network wants to do is “give users what they want.” The definition of that, he said, is left up to the algorithm.

In effect, Cathcart said that with more than a billion users, Facebook can’t possibly make across-the-board decisions about what is newsworthy or what is crucial information for users to know and what isn’t. So everything is personalized, via the algorithm, in order to give users the impression that they are “informed,” as he described it.

Cathcart didn’t talk about any of the potential downsides of this approach, such as the “filter bubble” effect that can keep users from seeing potentially important topics because they don’t match what the algorithm has inferred the user is already interested in. Nor did he talk about whether Facebook bears any kind of editorial or journalistic responsibility because of its size and market power.
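The mechanics of that effect are easy to sketch. The loop below is an illustration, not Facebook’s method, and every name and number in it is invented: a feed that ranks purely by inferred interest shows the user what they already like, each click reinforces the inferred profile, and topics outside the bubble are never surfaced at all.

```python
# Hypothetical sketch of a filter-bubble feedback loop: personalization
# that learns only from engagement narrows what the user ever sees.
from collections import Counter

interests = Counter({"sports": 5, "politics": 1})  # inferred profile
stories = ["sports", "sports", "politics", "world-news"]

def feed(stories, interests, k=2):
    # Rank stories by inferred interest; unseen topics score zero.
    return sorted(stories, key=lambda t: interests[t], reverse=True)[:k]

for round_num in range(3):
    shown = feed(stories, interests)
    for topic in shown:
        interests[topic] += 1   # clicks reinforce the inferred profile
    print(f"round {round_num}: shown {shown}")

# "world-news" never ranks high enough to be shown, so the profile
# never learns the user might care about it -- the bubble persists.
```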

In an update posted to his Facebook page late on Thursday, CEO Mark Zuckerberg said that Trending Topics was designed to highlight “the most newsworthy and popular” conversations on Facebook, and that the site’s guidelines did not permit the suppression of political perspectives. He said that Facebook was conducting a full investigation of the Gizmodo report and added that “if we find anything against our principles, you have my commitment that we will take additional steps to address it.”

Zuckerberg also said that Facebook “stands for giving everyone a voice. We believe the world is better when people from different backgrounds and with different ideas all have the power to share their thoughts and experiences.” He said he would be inviting “leading conservatives” and people from all across the political spectrum to talk with him and share their points of view about the issues raised by the Gizmodo story.

One upside of the Trending Topics controversy is that Facebook (FB) has become a little more open and transparent about how the feature works, and what principles guide those choices. But in many ways, the trending section is a sideshow. All of the same kinds of questions apply to how the main news feed works, and so far there hasn’t been much openness about that at all, nor any real admission that the company has any ethical or moral responsibility related to how it shapes the world-view of its billion-plus users.
