Facebook is about to have yet another very uncomfortable conversation about race.

A new story from the investigative journalism organization ProPublica has revealed that the company not only allows advertisers to target users by specific attributes, but also lets them exclude users on the basis of race, a practice that is at best alarming and at worst illegal. “Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers,” their report begins.

Imagine indeed. From the story:

“The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls “Ethnic Affinities.” Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.”

ProPublica drove the point home by successfully purchasing an ad through Facebook’s ad portal that targeted people looking for housing while excluding black, Asian and Hispanic users. Not only is this potentially a legal problem, it feels like a digital continuation of redlining and the other practices that have historically shut people of color out of homeownership, financing, and employment.

Facebook’s business model depends on advertising. To that end, they give advertisers the ability to target users with an exceptional degree of specificity. Not only do they collect an extraordinary amount of data about people based on their activity on the platform, they also enhance their databases with information about offline behaviors purchased from data brokers. Ultimately, people can be categorized in nearly 50,000 different ways. The company told ProPublica that they’re vigilant about preventing discrimination or abuse on their ad platform and that exclusion is an important tool for advertisers testing the effectiveness of ad copy, for example. But this episode will be hard to explain away.

If nothing else, this report should raise – yet again – important questions about how the algorithms that increasingly define our lives are shaping the world in unseen ways. With all their data and predictive analytics at their fingertips, how could Facebook not have seen this coming?


I’ve reached out to Facebook for comment and will update this story with their response.