Facebook is about to have yet another, very uncomfortable conversation about race.

A new story from the investigative journalism outfit ProPublica has revealed that the company not only allows advertisers to target users by specific attributes, it also lets them exclude users on the basis of race. At best, the practice is alarming; at worst, it is illegal. “Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers,” the report begins.

Imagine indeed. From the story:

“The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls ‘Ethnic Affinities.’ Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.”

ProPublica drove the point home by successfully purchasing an ad through Facebook’s ad portal that targeted people looking for housing while excluding black, Asian and Hispanic users. Not only is this a potential legal problem, it also feels like a digital continuation of redlining and similar practices that have historically shut people of color out of homeownership, financing, and employment.

Sign up for raceAhead, Fortune’s daily newsletter on race and culture here.

Facebook’s business model depends on advertising, so the company gives advertisers the ability to target users with an exceptional degree of specificity. Not only does it collect an extraordinary amount of data about people’s activity on the platform, it also enhances its databases with information about offline behavior purchased from data brokers.

Ultimately, people can be categorized in nearly 50,000 different ways. The company told ProPublica that it is vigilant about preventing discrimination or abuse on its ad platform and that exclusion is an important way for advertisers to test the effectiveness of ad copy, for example. But this episode will be hard to explain away.

If nothing else, this report should raise, yet again, important questions about how the algorithms that increasingly define our lives are shaping the world in unseen ways. With all that data and predictive analytics at its fingertips, how could Facebook not have seen this coming?

UPDATE: Facebook’s public relations firm, The Outcast Agency, sent a statement defending Facebook’s advertising practice, saying it is a way to ensure that ads can be tailored to a particular audience and that the company takes enforcement action when it finds a problematic advertisement.

The firm also provided a screenshot of the ad ProPublica placed in the story, which promoted a ProPublica event, ironically one about illegal real estate practices. But the ad could just as easily have been for an event subtly hoping to attract only white potential renters or homebuyers. Did the ad approval process work in this instance? I’m not sure they’ve made their case. Stay tuned for more reporting on this.

Screenshot of Facebook ad. Courtesy of Facebook

The Facebook statement:

“We are committed to providing people with quality ad experiences, which includes helping people see messages that are both relevant to the cultural communities they are interested in and have content that reflects or represents their communities — not just generic content that’s targeted to mass audiences. We believe that multicultural advertising should be a tool for empowerment. We take a strong stand against advertisers misusing our platform: our policies prohibit using our targeting options to discriminate, and they require compliance with the law. We take prompt enforcement action when we determine that ads violate our policies.

“When World Cup 2014 became a big focus throughout the US Hispanic community, a business developed a campaign to reach people who had shown interest in that community in order to create a positive association between its brand and the world’s most popular sport. This meant more relevant ads to those audiences about the World Cup.

“All major brands have strategies to speak to different audiences with culturally relevant creative. Just for purposes of illustration, a car company will run creative for one of their vehicles, but will have one creative execution targeting the Hispanic affinity cluster in Spanish. They may create a different creative for the African American affinity cluster featuring black actors and stressing another insight that is specific to that group. All major brands do this because they know that audiences respond better to creative that speaks to them specifically. This is the case across all industries.”

On exclusion targeting:
“Marketers use this type of targeting to assess whether ads resonate more with certain audiences vs. others. For example, some audiences might click on Spanish-language ads for a World Cup sponsorship vs. other audiences might click more on the same ads in English, so the sponsor might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry. We expressly prohibit discrimination and take prompt enforcement action when we determine that ads violate our policies.”

Ellen McGirt is a senior editor at Fortune.