Despite Crackdown On Election Interference, Facebook Must Do More, Experts Say
Following revelations about Russians using its site to manipulate the 2016 U.S. presidential election, Facebook has steadily tightened its policing of political messages on its service. Now, as the presidential campaign heats up, the company is trying again to repair its battered reputation.
On Sunday, Facebook detailed several new policies aimed at preventing misuse of its service in future elections, including banning ads that are intended to suppress voter turnout, encourage people not to vote, or intimidate would-be voters. This includes any ads with specific and inaccurate information about how, where, or when to vote, or threats of violence against voters.
Ads meant to discourage voting were widespread in 2016, according to Fatemeh Khatibloo, principal analyst on privacy and data issues at Forrester Research. She has collected examples, including ads falsely stating that voting locations had changed. She says many of those were “very geographically or demographically targeted,” a technique that, according to Facebook’s latest report, Russians often aimed at minority voters.
Facebook said it is also working to prohibit other forms of voter interference, without providing details. But it acknowledged that further refinements may be challenging. “My concern at this point is, they’re honing in on a very narrow set of [voter suppression tactics] because they know they can do that algorithmically,” says Khatibloo.
Being able to screen content using text-reading algorithms, rather than human screeners, may cut expenses. But while machines can easily identify specific words or phrases, they have more difficulty determining the intent or meaning of sentences, much less paragraphs.
That means messages intended to suppress voting could still sneak through. In 2016, for instance, one message spread by Russia-backed actors and intended to suppress African-American voter turnout simply read “Everybody SUCKS, We’re Screwed 2016.” It’s unclear whether such a message, blunt though it is, would be blocked under any of the new or proposed policies. Political messages are “social, they’re anthropological, they have ethical dimensions,” says Khatibloo. “Technology and A.I. are not going to solve those problems.”
Personal messages posted by individual users are another obvious gap in Facebook’s new policy, which applies only to paid ads. “The echo chamber is not going to flag content that my friend shared,” says Khatibloo, “because probably I’m in the same echo chamber as my friend.”
That content could involve anything from misleading voting information to the sort of inaccurate reporting and conspiracy theories that were also widespread on Facebook during the 2016 campaign. Facebook and other social media platforms have also become more aggressive about banning high-profile disseminators of misinformation.
Other new policies announced on Sunday may also have significant blind spots. For instance, Facebook will now ban race-based screening in housing ads, but Khatibloo says that it’s easy for advertisers to use other user data to substitute for race. Fans of Essence magazine, she says, could be blocked from viewing a housing ad as a rough stand-in for screening African-Americans.
Overall, Khatibloo compares the specific steps Facebook has taken to fight voter suppression and hate speech to putting “lipstick on a pig.” However, she gives the social media giant more credit for the internal structures it is implementing, including creating a civil rights task force that connects executives from across the company with experts on the topic to tackle ongoing issues.
“Firms that have said, we’re going to put a stake in the ground around some elements of corporate social responsibility,” she says, “[and] bring people from different parts of the organization together, things do get better.”
Khatibloo also praises Facebook’s commitment to transparency as it evaluates policies and considers new ones. “I’m really happy that they have opened this up and released the report,” she says. “It doesn’t feel like they’re whitewashing any of it, or watering any of it down.”
Facebook has made other substantive improvements to its handling of political advertising, according to Travis Ridout, co-director of the Wesleyan Media Project at Wesleyan University, which monitors election advertising. Ridout has closely followed Facebook’s reform efforts, and says one key advance is a searchable database of political ads that journalists and researchers can use to see ads they wouldn’t normally be shown as users. This makes it much easier for those observers to spot troubling ads.
“There are a lot of kinks still being worked out,” says Ridout, citing the system’s limited search options and scant data on ad targeting. “But it’s a lot better than it was during the 2016 campaign, when we had nothing.”
Ridout is under no illusions about Facebook’s ultimate motives, though. “The primary [motive] might be to avoid bad press,” he says, “but bad press might also lead to government regulation, which they also want to avoid.”
Currently, Facebook and other social media platforms face few restrictions on the kinds of election ads they can sell in the U.S., or on how they’re targeted to users, and Ridout says they’d like to keep it that way. Campaigns and super PACs, for their part, have to report their ad spending to the Federal Election Commission, but not additional details such as the targets of their ads.
Facebook’s proactive efforts may not be enough to forestall tighter governmental oversight, domestically or overseas. Europe’s GDPR regulations, which place controls on how Internet companies handle user data, include restrictions on how data is used to target ads. California’s Consumer Privacy Act, which goes into effect in 2020, creates similar restrictions, and because of the state’s size, will become a de facto national standard. Ridout says there’s “not a chance” such rules will wind up uniform globally, and that the varied rules could limit the profitability of Facebook’s political ads overall.
“In some ways you kind of feel sorry for the company,” says Ridout. “They started out just wanting to make money selling ads, and they ended up having to answer big questions about politics and government. And in some ways it doesn’t seem like they were prepared for that.”