In a company memo last week, Apple CEO Tim Cook thanked employees for their support in the company’s ongoing standoff with the FBI, echoing the company’s stance against giving backdoor access to an iPhone used by one of the San Bernardino terrorists.
Though what happened in California late last year was an atrocity, I think Cook is right to hold his ground in the encryption debate. Despite the circumstances, Apple simply cannot sacrifice the security of its devices and the integrity of its customers' data. If there is even a single mechanism, whether through firmware or changes to the security architecture, for the government to access encrypted information, that same "backdoor" will inevitably be used for nefarious purposes and carry serious long-term ramifications.
The government needs to shift its intelligence thinking and accept that traditional signals intelligence (SIGINT), such as intercepting texts and calls or siphoning information from a suspect's devices, will simply be harder to perform. There was a time when terrorist groups and others communicated through physical mediums the intelligence community was blind to, yet we still managed to uncover their activities through other means.
Why not return to something similar for these kinds of situations? We need to look at other ways of gaining the information we need, whether through intelligence gathered from human sources (HUMINT), such as clandestine espionage and interrogations, or through similar methods of monitoring national security threats and collecting sensitive information.
Unfortunately, this never-ending battle over encryption is just one facet of a much larger problem Washington has on its hands: its disconnect from Silicon Valley and other tech hubs around the U.S. The government is not attuned to the culture and standards that drive the adoption and development of new technology, or, in this particular situation, to how citizens' digital information needs to be protected.
Consider President Obama’s recently proposed Cybersecurity National Action Plan. In all, it is the most forward-thinking and comprehensive plan to date (albeit with the hefty price tag of a 35% increase in the security budget, bringing it to more than $19 billion). Despite the plan's lofty and well-meaning ideals, I foresee issues with its implementation, especially over how private companies and talent, particularly in Silicon Valley, will be brought in to help the government enact necessary changes.
Even the talent and companies that the government does attract will eventually be put off by its cumbersome environment. Washington has become a bureaucratic behemoth that fails to see how the pursuit of its own interests harms everyone else's security, from the enterprise to the individual. This will continue to be the case unless Washington realizes it must make a real change and end the analog mode of thought that permeates the public sector.
Take it from me, someone who spent six years in the public sector, most recently as a senior analyst at the U.S. National Security Agency, and has now spent three years running Synack, a cybersecurity startup in Silicon Valley. Both the court order for Apple to comply and the broad strokes of Obama’s plan miss the crux of what it would take to effectively support strong security for America. It’s time for Washington to see eye to eye with the tech community and understand that in order to innovate at the pace criminals do, it must think and act like a startup, from how it collaborates with separate (and private) entities to how it broaches the subject of encryption and user privacy.
Perhaps Washington should start by not using the All Writs Act of 1789, a law that predates the iPhone by more than two centuries, to demand aid in breaking an iPhone's encryption. Or maybe officials need to step out of their government shoes and think as private citizens do in order to understand consumer dynamics. They need to understand that requiring backdoors in encrypted devices is shortsighted. Even the former director of the NSA has said that “the downsides of a front or back door outweigh the very real public safety concerns.”
Some say what the FBI is asking for is essentially the same as asking companies like Google or AT&T for emails or phone records. This, again, is the old way of thinking. The two situations might have been comparable if Apple could gain access to the iCloud backup data, which is impossible in this case because county authorities changed the Apple ID account password at the FBI's request, leaving the physical phone as the only viable option.
Now, the FBI is asking for a tool that will help it access this particular iPhone 5C by bypassing some of its security controls. Even if such a tool is created and then kept internal at Apple or seemingly destroyed, its existence will prompt hackers to target Apple to gain access to the code, or someone inside could leak it. We have to remember that what's made in the digital world is hardly easy to destroy.
Private companies should continue protecting consumer data, especially as technology permeates every aspect of our lives and data becomes an extension of who we are. Consumers have placed their trust in these companies and expect that trust to be reciprocated. Just because more information is flooding the digital domain does not mean it should be handed over to the government or other requesting bodies.
Success will require that the government embrace a new mode of thought, built for and by those at the helm of the digital revolution. It’s not enough to convene a committee, promote security awareness, or force cooperation through court orders. There needs to be a foundational shift in how the government works, starting by embracing a digital work style, taking lessons from successes in the private sector, and harnessing the best talent and technology to protect against today’s relentless cyber threats. Only then will Washington understand why Tim Cook, Apple, and their supporters feel the way they do.
Jay Kaplan is CEO of Synack, a cybersecurity firm based in Silicon Valley.