Photo: Cell phones livestreaming on Facebook and Snapchat at the Tig fashion show during Sao Paulo Fashion Week N43 Fall/Winter 2017 on March 16, 2017. Mauricio Santana—Getty Images
Commentary

Hiring 3,000 More Workers Won’t Fix Facebook’s Violent Video Crisis

May 09, 2017

Last week, Facebook announced that it is hiring 3,000 new people to monitor and remove inappropriate posts like graphic and violent videos. This makes sense, given the many issues Facebook (FB) has faced since making video content its overwhelming priority over the past few years. The company has seen an influx of violent videos featuring murders, suicides, and rapes posted on the site, causing both internal struggles over how to address this disturbing trend and external struggles with negative press and public backlash.

Last week’s hiring announcement builds on CEO Mark Zuckerberg’s earlier admissions that the company must be more responsive when inappropriate content is shared on the site. Hiring 3,000 new employees isn’t a bad start, but these future hires will not eliminate Facebook’s video problems. And it isn’t clear yet if anything will.

Setting aside the broader public issue of eliminating societal violence, at the root of Facebook’s problem is its rush to roll out video services as quickly as possible. By many accounts, Facebook was much slower to integrate video into its platform than social media competitors like YouTube, Periscope, and Snapchat (SNAP). Given that the company started at a competitive disadvantage, tools like Facebook Live were seemingly rushed to market without serious consideration of the negative impact such offerings could have.

In one exclusive interview days before Facebook Live launched, Zuckerberg touted the product as “a great medium” for “raw and visceral content” created by users, without discussing its potential dangers at all. Another person familiar with Facebook Live’s development told The Wall Street Journal that the company “didn’t grasp the gravity of the medium” during its hurried, two-month rollout. Given this history, it isn’t surprising that Facebook has appeared unprepared to address the problem of violent videos and other inappropriate content.

In this vein, Facebook’s public commitment to hire more people, coupled with its earlier promises to improve technology and review its reporting processes, seems somewhat hollow without more transparency about how these measures will be implemented. There are also questions about whether the measures will work at all, even if implemented perfectly.

Hiring more people will provide more “eyeballs on the ground,” but Facebook’s current user base comprises nearly 2 billion people (roughly 25% of all human beings on the planet), so 3,000 new hires, added to the 4,500 people Facebook currently employs on its community operations team, can only scratch the surface of the billions of videos viewed each day on the platform. While human judgment is certainly helpful in determining which videos violate Facebook’s standards, the platform’s sheer scale makes it impossible for these new hires to solve the problem on their own. Even Zuckerberg himself acknowledged the impossibility of the task last week, telling investors, “No matter how many people we have on the team, we’ll never be able to look at everything.”
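
For a rough sense of that mismatch, the back-of-the-envelope sketch below compares review capacity with daily video volume. The headcounts and the “billions of videos viewed each day” figure come from the reporting above; the per-reviewer throughput (one video per minute over an eight-hour shift) and the two-billion-view floor are hypothetical assumptions used only for illustration.

```python
# Back-of-the-envelope comparison of human review capacity vs. daily video volume.
# Headcounts are from Facebook's public statements; the throughput and daily-view
# figures below are illustrative assumptions, not reported numbers.

reviewers = 4_500 + 3_000               # current community operations staff plus new hires
videos_per_reviewer_per_day = 60 * 8    # assume one video per minute over an 8-hour shift
daily_review_capacity = reviewers * videos_per_reviewer_per_day

daily_video_views = 2_000_000_000       # low-end reading of "billions of videos viewed each day"

coverage = daily_review_capacity / daily_video_views
print(f"Videos reviewable per day: {daily_review_capacity:,}")   # 3,600,000
print(f"Share of daily views covered: {coverage:.2%}")           # roughly 0.18%
```

Even under these generous assumptions, human reviewers could cover well under 1% of a single day’s video views.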

Compounding Facebook’s violent video problem is the process by which videos get reported to these human reviewers. Facebook currently relies heavily on users to flag inappropriate content. On one hand, the number of reports can be enormous. On the other hand, some violent videos have gone hours without being reported at all. In addition, which videos get reported depends heavily on users’ definitions of “inappropriate”; some flagged videos may not violate community standards at all.

This, of course, is why Facebook has emphasized its use of improved technology like computer algorithms and artificial intelligence. Such technology could at some point be capable of identifying inappropriate content, as predefined by Facebook, and removing it from the site. It has not reached that point yet, however, as Facebook learned in August of last year when it fired its entire Trending staff after allegations that some employees engaged in political bias when selecting trending items. Instead of relying on humans, Facebook began using mostly algorithms to select trending items; within days, fake news stories began trending instead. It is clear that the technology is not yet reliable enough to eliminate the need for human engagement in the process.
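
One common way to combine the two, sketched below purely for illustration, is to let an algorithm act automatically only on near-certain violations and route ambiguous cases to human reviewers. The scoring function, thresholds, and queue here are hypothetical assumptions and do not describe Facebook’s actual systems.

```python
# Minimal sketch of a hypothetical human-in-the-loop moderation pipeline.
# The scoring function, thresholds, and data layout are illustrative assumptions.

REMOVE_THRESHOLD = 0.95   # auto-remove only near-certain violations
REVIEW_THRESHOLD = 0.50   # ambiguous cases are deferred to human judgment

def score_video(video: dict) -> float:
    """Placeholder for a trained classifier; here it simply reads a
    precomputed score attached to the video record."""
    return video.get("model_score", 0.0)

def triage(video: dict, human_review_queue: list) -> str:
    score = score_video(video)
    if score >= REMOVE_THRESHOLD:
        return "removed"
    if score >= REVIEW_THRESHOLD:
        human_review_queue.append(video)   # humans make the final call
        return "queued_for_review"
    return "no_action"

queue = []
for video in [{"id": 1, "model_score": 0.97},
              {"id": 2, "model_score": 0.70},
              {"id": 3, "model_score": 0.10}]:
    print(video["id"], triage(video, queue))
print("Awaiting human review:", [v["id"] for v in queue])
```

The point of the structure is the same one Facebook’s experience suggests: automation can shrink the pile, but the ambiguous calls still land with people.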

Finally, it may be that self-policing is simply inadequate to the task of addressing some inappropriate content, like violent and graphic videos. Unlike other forms of communication such as television and radio, which are regulated by the federal government, Facebook and other social media platforms have been left largely to decide for themselves what communication is appropriate to share publicly. It may be time to rethink the approach of allowing companies like Facebook to decide how to regulate themselves and the content they provide.

Today’s limitations of both human capacity and technological advancement put Facebook in a position where no solution will be perfect. Even so, it has both corporate and ethical obligations to pursue acceptable solutions to its violent video problem. Perhaps the only solution is one that incorporates human judgment, technology, and government intervention.

Shontavia Johnson serves as the Kern Family Chair in Intellectual Property Law and directs the Intellectual Property Law Center at Drake University Law School. She curates content related to law, innovation, and policy.
