Earlier this week, the FBI stalled its impending court date with Apple after revealing that it might have found a way to unlock the San Bernardino shooter’s iPhone without Apple’s help. Just 24 hours before the hearing, the FBI filed notice that it would use alternate means to access the data on Syed Rizwan Farook’s iPhone. Since the filing, it has been reported that Cellebrite, a provider of mobile forensic software, will be the third party aiding the FBI. But with no detail about what Cellebrite will actually do, everyone has been left to wonder what those methods may be. More importantly, many are concerned about what this means for personal privacy.
Could this be viewed as the government essentially hiring someone to hack into personal data? What kind of precedent does this set, and should companies other than Apple be worried? The short answer is that this is not a new issue. Governments around the world, including our own, were hacking into devices long before this case. It’s inevitable, especially in high-profile criminal or terrorist cases like this one that pose risks to national security.
As such, vendors need to prioritize user privacy — either create the most secure products possible to safeguard their customers or risk losing the trust of consumers. For their own part, consumers should continue trusting both in the products and the laws meant to protect them. Privacy should not be a problem as long as consumers stay out of trouble.
Worries about the possible implications of the government hacking its way into this particular iPhone are unfounded. While the ethics of the act can be debated, the fact of the matter is that it’s been happening for years and will continue to happen as more of our lives move into the digital world.
In many cases, it’s the only way to stay ahead of threats that jeopardize national security. A single user’s privacy does not outweigh the safety of thousands of individuals. The counter-argument lies more in the principle — just one exception can lead to uncontrolled chain reactions of abuse of knowledge and power. Regardless, governments will continue to do what they need to do for national security and businesses will continue to be modeled around this paradigm.
The infamous Hacking Team, for instance, is a prime example of a company that makes its money selling malware to government and law enforcement entities across the world. They are essentially a malware-as-a-service vendor. Exploit brokers also do something similar, buying exploits from researchers or finding them on the Dark Web so they can then provide them to their “customers.” One such company, Zerodium, offered a $1 million bounty late last year for an exploit that could bypass Apple’s iOS security controls.
What is unique about this case is that the FBI took Apple to court instead of finding its own way to break into the phone. Perhaps the FBI assumed that it would be easy to pull off. If you need to get into a house, it makes sense to ask for the keys first before breaking down the door. Fortunately, Apple stood its ground and the Department of Justice (DOJ) blinked first.
An important distinction to make in this case is that it’s not a debate over encryption, intercepting communications, or spying on the masses. It’s about the FBI being able to pull information from a device it has in its physical possession. Whatever method it uses likely won’t be available for mass use, but even the possibility of a workaround strongly encourages groups to focus on zero-day discovery and reverse engineering efforts.
As such, it’s possible that the FBI obtained such an exploit from Cellebrite, one that can be leveraged with the device in hand. If so, it would likely need time to confirm the exploit works and then pay for it, a likely reason behind the FBI’s request for a two-week grace period. Another compelling theory comes from computer scientist Jonathan Zdziarski. The idea has been circulating for some time: the DOJ may have found a way to copy a key memory chip in the iPhone, allowing investigators to swap in a fresh copy and essentially brute force their way in without worrying about losing data to the auto-wipe feature. This method requires valuable time, expert resources, and, most importantly, possession of the device. Consumers at home shouldn’t worry (yet).
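To see why copying the chip defeats the auto-wipe, consider a deliberately simplified sketch. This is a toy simulation, not Cellebrite’s actual technique or Apple’s real security architecture: it models a phone whose flash memory stores the failed-attempt counter, and an attacker who snapshots that flash and re-images it before the wipe threshold is reached, making an exhaustive search of all 10,000 four-digit passcodes possible. The class and function names are invented for illustration.

```python
import copy

WIPE_THRESHOLD = 10  # failed attempts before the simulated device erases itself


class SimulatedPhone:
    """Toy model of a locked phone whose flash holds both the PIN check
    state and the failed-attempt counter (a hypothetical simplification)."""

    def __init__(self, pin):
        self.flash = {"pin": pin, "failed_attempts": 0, "wiped": False}

    def try_pin(self, guess):
        """Return True on the correct PIN; otherwise count the failure
        and wipe the device once the threshold is hit."""
        if self.flash["wiped"]:
            return False
        if guess == self.flash["pin"]:
            return True
        self.flash["failed_attempts"] += 1
        if self.flash["failed_attempts"] >= WIPE_THRESHOLD:
            self.flash["wiped"] = True  # auto-wipe triggered
        return False


def mirror_attack(phone):
    """Brute-force every 4-digit PIN, restoring a snapshot of the flash
    before the failure counter can ever reach the wipe threshold."""
    snapshot = copy.deepcopy(phone.flash)  # clone the chip's contents once
    for pin in range(10000):
        guess = f"{pin:04d}"
        if phone.try_pin(guess):
            return guess
        # Re-image the flash just before the counter would trip the wipe.
        if phone.flash["failed_attempts"] >= WIPE_THRESHOLD - 1:
            phone.flash = copy.deepcopy(snapshot)
    return None
```

The point of the sketch is the restore step: because the attacker controls a copy of the storage, the ten-attempt limit never accumulates, and the search space of a short numeric passcode is trivially small. It also shows why the method demands physical possession of the device, which is the distinction the case turns on.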
More importantly, there are laws and regulations in place meant to protect against a “Big Brother” outcome, especially after the Edward Snowden incident. For instance, wiretapping can’t be done domestically, interception of communications can only be performed with a warrant, and devices can’t be seized without a warrant or reasonable cause. Whether these actions unlawfully happen in the background is not relevant to this particular case or its implications.
What we can learn from this is that vendors need to design and build devices with the utmost degree of security possible. Security must be a priority—not an afterthought or added feature—so that if the situation arises, the government or other entities can’t easily probe them. In the same vein, consumers need to be able to trust the laws and vendors, while also understanding that any data on any electronic device is potentially at risk.
In all, Apple won (for now)—but expect the DOJ to bring this issue to the forefront again.
Jay Kaplan is CEO of Synack, a cybersecurity firm based in Silicon Valley. Kaplan and his firm are not investors in the companies referenced in this article.