Facebook Must ‘Pick Sides’ When Confronting Moral Issues, Departing Security Chief Says

Facebook’s outgoing chief security officer, Alex Stamos, bluntly told employees that the company had to pick sides when clear moral or humanitarian issues were at stake. His words stand in sharp contrast to CEO Mark Zuckerberg’s recent support for organizations like Infowars, which Facebook acknowledges shares “conspiracy theories or false news.”

Stamos made that statement in a March 23 memo that BuzzFeed obtained and published in full today. The memo followed a March 19 story in the New York Times reporting on Stamos’s apparent efforts within the company to push for more disclosure of attempts by foreign governments to interfere in elections in the U.S. and elsewhere via Facebook.

While the memo opens with pushback against some of the Times reporting, it doesn’t dispute his planned August 2018 departure. Its most significant part, however, is his brief, frank list of the changes required at Facebook to face threats properly.

“It would be really simple to believe that the outcomes of arguments between a handful of people got us to this point, but the truth is that we need to all own this,” Stamos wrote. “The problems the company is facing today are due to tens of thousands of small decisions made over the last decade within an incentive structure that was not predicated on our 2018 threat profile. While it has been disconcerting to hear anger and sadness in the voices of our colleagues this week, I also take heart in how widespread our desire has become to align ourselves in the new landscape. I saw this shift in many executives last year, as they clearly recognized the emerging imperatives to prioritize security, safety, integrity and trust over all else, but no number of all-hands or corporate goals was going to be able to turn this huge ship without a bottom-up change in culture.”

This included his exhortation “to be willing to pick sides when there are clear moral or humanitarian issues,” which stands apart from the company’s frequent attempts to project an artificial sense of neutrality. Facebook has attempted to overlay limited fact-checking on posts that may contain false or distorted information, and has discussed suppressing the reach of less-credible and non-credible news that doesn’t cross into being actively harmful.

Yet on Monday the Conservative Review posted a video to Facebook that appeared to show a host on its CRTV video network interviewing New York Democratic congressional candidate Alexandria Ocasio-Cortez, but that in fact spliced excerpts from a PBS Firing Line interview together with the CRTV host’s separately recorded questions. On Facebook the video was neither labeled parody nor had its nature explained, yet it collected more than a million views within 24 hours.

Stamos said Facebook needed to reward teams for not shipping new features when that was the wiser decision, to factor in attempts by others to attack the site when making decisions, and to reduce the amount of information it collects about people and how long it retains that information. He also said the company needs to listen to its employees and to outsiders who “tell us a feature is creepy or point out a negative impact we are having in the world.”