Neal Blair, of Augusta, Ga., wears a hoodie that reads "Black Lives Matter" as he stands on the lawn of the Capitol building during a rally to mark the 20th anniversary of the Million Man March, on Capitol Hill, on Saturday, Oct. 10, 2015, in Washington. Black men from around the nation returned to the capital calling for changes in policing and in black communities.
Photograph by Evan Vucci — AP

A new report looks at how Facebook chooses our news.

By Jeff John Roberts
June 9, 2016

Here’s more fuel for the ongoing controversy over how Facebook shapes the news media: A report claims that a tool that helps news sites promote their stories on Facebook (FB) does not offer a “Black Lives Matter” tag, even as it lets sites pick from hundreds of thousands of lesser-known phrases.

According to The Intercept, the issue turns on Facebook’s “audience optimization” feature, a recently introduced service that lets online publishers tag stories with key phrases and direct them to a relevant audience. (An advocacy site might, for instance, use a “Fight for $15” tag to increase the chance the story will appear in the Facebook feed of people who care about labor issues.)

But despite the considerable media attention paid to the “Black Lives Matter” movement, which focuses on police behavior toward black people, sites cannot include this tag. This limitation means that news stories about “Black Lives Matter” may not enjoy the same reach on Facebook as stories that can use a tag to optimize their audience.

In response to the report, Facebook told The Intercept that the lack of a “Black Lives Matter” tag did not reflect a policy choice by the company, but was instead a result of the software it uses to generate the suggested audience tags. The whims of the algorithm, in other words, meant that an obscure phrase like “Water motorsports at the 1908 Summer Olympics” became a tag, but “Black Lives Matter” did not.

Facebook did not immediately respond to a request for further context.

This issue with the tagging feature underscores what’s at stake as algorithms exert a growing influence over how and what we read.

While Facebook has sought to portray its algorithms as somehow neutral, many people have pointed out that an algorithm also represents an editorial decision: the instructions that coders pour into it are just as subject to human values and bias as any other choice.

As Fortune’s Mathew Ingram explained during last month’s controversy over the social network’s alleged suppression of conservative news stories, “Facebook Must Own Up to Being an Increasingly Powerful News Outlet.”
