By Jennifer Grygiel
October 4, 2017

Is Facebook having a Frankenstein moment when it comes to news aggregation? Has the company created something it can't fully control? Its executives might tell you that it has, but this is not the case.

The reality is that Facebook needs to hire humans to edit and review the content it promotes as news—and it needs to hire a lot of them.

Facebook argues that it simply has too much content to moderate and that new algorithms and artificial intelligence are what we need to stop the spread of false stories. Clearly, they are not. Shortly after the massacre in Las Vegas, a story from 4chan, a message board popular with the alt-right, that blamed an innocent man for the shooting was displayed in Google's Top Stories module and on Facebook's Trending box and Safety Check page.

The companies defended themselves by blaming their algorithms and claiming that the fake news was featured only briefly. We have to stop accepting these insufficient excuses. One man's reputation and safety were put in jeopardy because Facebook and Google neglected to adequately monitor the content being promoted on their news platforms. And while these tech giants deflect blame, they continue to generate massive advertising revenue from their news services.

Facebook and Google could signal their seriousness about tackling fake news by creating an executive position responsible for preventing it. Facebook's ad tools were recently shown to let advertisers target users through hateful categories such as "Jew haters." In response, Facebook Chief Operating Officer Sheryl Sandberg said she could not imagine that Facebook would ever have been "used this way." But imagining possible harm is the exact job responsibility of a chief risk officer, a common position in the highly regulated finance industry. It's time for Facebook, Google, and other major tech firms to start employing chief risk officers, given the impact these companies have on society.

Facebook Co-Founder and CEO Mark Zuckerberg says he is sorry for how Facebook has been used for nefarious purposes and that he will try harder to make sure the platform doesn't continue to hurt people. But there are no legal checks to ensure he makes good on these promises. Earlier this year, after a man uploaded a video to Facebook of himself murdering another person, Zuckerberg said the company would do "all we can" to prevent such instances from recurring. These apologies humanize the company, but it's important to remember that Facebook is a corporation, and corporations do not just get to apologize and promise to do better. The public, and our elected officials, need to hold them accountable.

As Facebook and Google lead us into a new world of digital communication and information, we need to make sure that we as a people participate in this process. Our elected officials must step up and force social media companies to effectively govern and manage their potentially dangerous platforms.

Jennifer Grygiel is an assistant professor of communications (social media) at the S.I. Newhouse School at Syracuse University.
