
Facebook’s Oversight Board finds that the social network isn’t great at making tough decisions

January 29, 2021, 12:55 AM UTC


Facebook’s Oversight Board, the body that acts as a Supreme Court for how the social network polices its service, revealed an unsurprising reality on Thursday: Facebook often fails to properly enforce its rules.

The board released its first five decisions since its members started meeting last year. Four of the five rulings overturned decisions Facebook’s moderating team previously made. 

The four overturned cases may seem statistically small compared with the hundreds of thousands of posts published on Facebook every day. But even if they represent a tiny portion of posts, the implications are large.

“Even a small percentage of incorrect decisions would still represent hundreds and hundreds of posts a day,” said Nadine Strossen, former head of the American Civil Liberties Union. “So a small error percentage is very severe.”

The first five cases covered user posts from around the world that Facebook removed for allegedly violating its policies on hate speech, adult nudity and sexual activity, dangerous individuals and organizations, and violence and incitement. Ultimately, the board upheld only one of Facebook’s previous decisions: the removal of a post for using a slur against Azerbaijanis. In the four other cases, the board said Facebook overreached when it chose to remove the posts and called for them to be reinstated, which Facebook has now done.

“Today’s decisions (and future board decisions) are binding on Facebook,” Monika Bickert, Facebook’s vice president of content policy, said in a blog post. “We will restore or remove content based on their determination.”

The board’s decisions come as Facebook struggles to balance free speech against the potential to create real-world harm. They also precede one of the biggest cases the board will have to rule on in the coming weeks: whether to uphold Facebook’s decision to ban former President Donald Trump, who was booted following the U.S. Capitol riot on Jan. 6.

Conservatives, who have long blasted Facebook for unfairly censoring their views, will likely see the board’s decisions as a win. The board, made up of 20 international human rights experts and civic leaders, ruled in favor of free speech in most cases.

Strossen said the rulings are unsurprising given that international human rights are “remarkably speech protective.” And Nathaniel Persily, a law professor at Stanford University, took to Twitter to explain what the decisions tell the public about the board. 

“The fact that the Oversight Board overturned four of the five takedowns gives us a sense that it is likely to take more libertarian and speech-protective positions than Facebook,” he tweeted on Thursday. “This is not where most public opinion is, of course.”

In one case, for example, the board said Facebook should reinstate a post from a user in Myanmar who shared a picture of a dead Muslim child and suggested that there is something psychologically wrong with Muslims. The board concluded the terms used were not violent or derogatory and therefore did not violate Facebook’s hate speech rules. The ruling spurred backlash from Muslim advocates and Facebook critics.

“Facebook’s Oversight Board bent over backwards to excuse hate in Myanmar—a country where Facebook has been complicit in a genocide against Muslims,” Eric Naing, spokesman for the national civil rights organization Muslim Advocates, said in a statement.

In another case, the board overturned Facebook’s decision to remove a post that criticized the French government for not authorizing hydroxychloroquine as a coronavirus treatment, describing it as a “harmless drug” that was curing people elsewhere. The board said it did not find that the post would cause imminent harm, despite the fact that the drug is ineffective as a treatment for the coronavirus.

Persily called this decision the most “significant” of all of them. “It also has potential implications for the Trump takedown because it combines disinformation with the threat of offline harm,” he said.

Along with its rulings, the board also suggested several policy changes, including that users be allowed to appeal decisions made by Facebook’s artificial intelligence systems to human moderators. It also recommended that Facebook clarify its rules on health misinformation.

In its blog post, Facebook said: “The board rightfully raises concerns that we can be more transparent about our COVID-19 misinformation policies. We agree that these policies could be clearer and intend to publish updated COVID-19 misinformation policies soon.”

Facebook has 30 days to respond to the nonbinding policy recommendations.

“It will be very interesting to see how seriously Facebook takes those,” Strossen said.