5 key things Facebook must improve, according to civil rights audit
Facebook failed to protect users from discrimination and hate, according to an audit the social network commissioned to review its track record on civil rights.
“Many in the civil rights community have become disheartened, frustrated, and angry after years of engagement where they implored the company to do more to advance equality and fight discrimination, while also safeguarding free expression,” the report said.
The audit, made public on Wednesday, was conducted by Laura W. Murphy, president of Laura Murphy & Associates and former director of the ACLU’s legislative office, and Megan Cacace, partner at civil rights law firm Relman Colfax. It’s the third such annual audit of Facebook, the first of which was in 2018.
In a blog post, Facebook chief operating officer Sheryl Sandberg responded to the audit by saying, “We have made real progress over the years, but this work is never finished, and we know what a big responsibility Facebook has to get better at finding and removing hateful content.”
The report comes as hundreds of companies, including well-known brands like Verizon, Starbucks, and Levi Strauss, participate in a one-month boycott of advertising on Facebook. The campaign, called #StopHateForProfit, aims to pressure the company to do more to eliminate hate on its service.
In response to pressure, Facebook has prohibited posts intended to reduce voter turnout and promised to add 30% more people of color, including 30% more Black people, to its leadership ranks. It also plans to invest $100 million in Black-owned small businesses.
But the audit also revealed where Facebook falls short. Here are five examples:
Voter suppression and fact-checks
The audit’s authors critiqued the company’s “troubling” enforcement of its rules related to voter suppression and its decisions against fact-checking politicians. The report pointed to a series of posts by President Trump, who described state-issued mail-in ballots as illegal and “gave false information about how to obtain a ballot.” The auditors said they “vehemently expressed their views” to Facebook, but were not given the opportunity to speak with company leaders until after Facebook had already decided to take no action on the posts.
“Facebook’s recent decisions on posts by President Trump indicate a tremendous setback for all of the policies that attempt to ban voter suppression on Facebook,” the report said.
The auditors also raised a red flag about Facebook’s policy of not fact-checking posts by politicians and of leaving their posts untouched even when they violate the service’s rules. “These political speech exemptions constitute significant steps backward,” the report said.
Civil rights must be prioritized
The auditors criticized Facebook for a lack of civil rights expertise, especially in functions related to elections, hate speech, and ads. They recommended that Facebook not only add a civil rights leader to its ranks but also create a team to improve its internal processes. This would give civil rights experts the ability to pause new products and policies when they find problems.
Resources are needed to address hate and organized hate
Facebook should do more to crack down on hate against minority groups including Muslims and Jews, the auditors said. They recommended that the company collect more data about which groups are targeted so it can create better policies that target specific types of violations. The auditors also want Facebook to gather more information about how long it takes to remove prohibited content and whether the response time varies by the group targeted.
White nationalist bans don’t go far enough
Last year, Facebook prohibited white nationalism and white separatism on its service. But the auditors said that policy only applies to explicit references and should go further.
Last year, the auditors had praised the then-new white nationalism ban but suggested it was “too narrow” because it only prohibited posts that use the specific phrases “white nationalism” and “white separatism.” Since then, the auditors complained, “Facebook has not made that policy change.”
Instead, Facebook has taken a different approach, including assigning a team to work exclusively on combating dangerous individuals and organizations. As part of that effort, the company banned more than 250 white supremacist organizations, the report said. Still, civil rights groups are concerned that Facebook’s success in detecting and removing hateful content is inadequate.
Algorithmic bias and discrimination must be addressed
The auditors voiced concerns that Facebook’s algorithms, which control what users see in their News Feed, may inadvertently fuel extreme and polarizing content or promote bias. The report outlines a need for concrete actions to address these issues.
Facebook should require that all internal teams building algorithms follow a set of best practices, and that existing algorithms and machine-learning models be tested against that guide, the report said. The auditors also recommended that Facebook train new employees to reduce bias and discrimination in artificial intelligence.
“Facebook needs to approach these issues with a greater sense of urgency,” the report said.