After five days of silence during one of Facebook’s biggest privacy crises, CEO Mark Zuckerberg has finally spoken publicly, vowing to do better.
“The good news is that the most important actions to prevent this from happening again today we have already taken years ago,” he said in a Facebook post on Friday. “But we also made mistakes, there’s more to do, and we need to step up and do it.”
In recent days, a series of reports have detailed how Cambridge Analytica, a British data mining firm, gained access to personal data on 50 million Facebook users and relied on it as part of its work for Donald Trump’s presidential campaign in 2016. A researcher who had access to the data after creating an app for taking personality quizzes in 2013 had handed Cambridge Analytica a copy of that data, in violation of Facebook’s policies.
The app itself had only 300,000 users. But it also collected information about all of its users’ friends.
News about Cambridge Analytica’s access to the information, which Facebook users had never consented to, set off a chorus of criticism about Facebook’s privacy safeguards. Some users have responded by saying they would delete their accounts, a potential business catastrophe for Facebook, while government regulators and lawmakers promised to investigate and possibly toughen privacy laws.
Until now, Facebook had tried to weather the PR storm without putting its top executives in the crossfire, a silence that generated questions about their leadership and whether the social network was really taking responsibility. Instead, Facebook’s PR team and lower-ranking leaders led the company’s defense by drawing technical distinctions between what happened and more common breaches by hackers, and by pointing out that Facebook curtailed the amount of user data it shared with app makers in 2014.
In his post on Friday, Zuckerberg tried to get out in front of the growing problem by calling the situation a “breach of trust” between the researcher, Aleksandr Kogan; Cambridge Analytica, which has since suspended its CEO; and Facebook. But he also acknowledged that it was “a breach of trust between Facebook and the people who share their data with us and expect us to protect it” while adding, “we need to fix that.”
Zuckerberg’s plan is to investigate all apps that had access to large amounts of information before Facebook changed its data sharing policies in 2014 and to conduct an audit of any app that shows suspicious activity. “We will ban any developer from our platform that does not agree to a thorough audit,” he promised. He also pledged to ban developers that Facebook finds misused personally identifiable information and to notify users who were affected.
Additionally, Zuckerberg said Facebook would further restrict the data it shares with developers. That includes cutting off developer access to data for users who have not used their apps in three months and limiting the amount of data shared with apps when users sign in to only names, profile photos, and email addresses.
Facebook users can also expect to see a new tool at the top of their News Feeds that shows the apps they’ve used and “an easy way” to block those apps from accessing personal data. Facebook already gives users the ability to control some of their data sharing in their privacy settings, but many users don’t bother to adjust the settings or don’t realize they can.
Although Zuckerberg talked a lot about protecting user information, he did not mention why Facebook overlooked the potential problems inherent in giving developers such broad access to user data in the first place. Instead, he said the policy was merely intended to make Facebook “more social.”
“We will learn from this experience to secure our platform further and make our community safer for everyone going forward,” Zuckerberg said.
Whether that’s enough for users and regulators is an open question.