Photograph by Thomas Trutschel/Photothek via Getty Images

Michael Chertoff sits down with Fortune to talk Apple, the FBI, and data.

By Jonathan Vanian
March 18, 2016

Earlier this month, a group of former top government officials came out to say they sided with Apple in its refusal to help the FBI crack into an iPhone used by one of the San Bernardino shooters.

Former Department of Homeland Security Secretary Michael Chertoff was among those siding with the world’s most profitable company. He argued that developing software to weaken the iPhone’s existing protections would be like “creating a bacterial biological weapon” that could also be used by criminals.

Chertoff, now the executive chairman and co-founder of the security consulting firm The Chertoff Group, sat down with Fortune to discuss Apple’s legal tussle, how companies collect data about their users, and the nuances of storing that information across the globe.

Here is an edited version of the interview:

Fortune: When has the federal government compelled a company to create technology to aid an investigation?

Chertoff: There is an old case involving New York Telephone, in which basically the court said, “Look, if there’s a lawful wiretap order, you have to be willing to connect the wiretap device to the phone line to enable the output.” You do have to lend assistance.

The question is how far does that go? You could take the logic to the point of saying you have to build an entire business and invest $10 million to construct the tool, which is obviously absurd. I think the relatively narrow issue in the Apple case is to what extent this concept of helping the government execute a lawful warrant can go so far as to really, in a sense, enlist you in preparing tools that you otherwise would not prepare. This, frankly, can be at odds with your own security interests.

Once the tool is created, you now have the risk that it might get out. That actually creates a broader problem for the company.

Does it make any difference that telecoms have a history of working with the government?

I think some of it is. Yes, the telecoms did have a tradition. More than that, I think the complexity of the way data is managed now is much greater than in the days of the original telecoms and wiretaps. That was a pretty simple concept: A is talking to B over a wire, we’re going to stick something in the wire, and we’re going to get a record of what’s being said.

Now the way data moves and what’s done with it is so much more complex that it both opens the door to enormously more power on the part of whoever collects that data and creates complications, because it starts to pose security risks to the system.

As you create back doors or tools, in effect you’re creating a kind of malware. Once you have it, are you confident you can protect it? If it gets in the wrong hands, is that going to be a problem? I think that this kind of relatively simple pre-digital model of what it means to assist technologically is just vastly oversimplified in light of the current technological context.



What is a possible solution to the Apple and DOJ fight?

Chairman [Michael] McCaul of the House Homeland Security Committee and Sen. Mark Warner have put together a bill to create a commission on this. The idea would be to sit down and look more holistically at the benefits of encryption: where encryption creates issues for law enforcement, but also where it helps law enforcement. Not only does it allow law enforcement to encrypt its own data, but as you empower companies to defend themselves against crime, you’re actually making law enforcement’s job easier.

Let me ask you this question. Would law enforcement be better off in a world where nobody had any locks on the door? It would be easier to serve search warrants, but there could be a lot more burglaries. Maybe sometimes you want to reduce the number of burglaries even if it’s harder to execute search warrants.

I think these are the kinds of things that ought to be discussed more broadly, because I think the larger issue is that if you require companies to architect, design, and build systems, networks, and devices so that there is always a tool or capability to penetrate them and bypass the encryption, you are not only creating vulnerabilities, but you are also eliminating a certain type of innovation.

If you come, for example, to a form of encryption where you have keys that you use once, and the key disappears, by definition you can’t have a duplicate. The only way around that is an algorithm that is essentially predictable, and of course that reduces the value of the tool.
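The use-once key Chertoff describes is the idea behind a one-time pad. As an illustration only (this sketch is not from the interview, and the function names are invented for the example), here is a minimal version using Python’s standard library: the key is random, as long as the message, and once it is discarded the ciphertext is unrecoverable.

```python
import secrets

def encrypt_once(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a random key exactly as long as the message (a one-time pad).
    key = secrets.token_bytes(len(plaintext))
    # XOR each plaintext byte with the corresponding key byte.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so applying the same key recovers the message.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ciphertext, key = encrypt_once(b"meet at noon")
assert decrypt(ciphertext, key) == b"meet at noon"
# If `key` is deleted after use, no duplicate exists anywhere,
# which is the property that makes a mandated "spare key" impossible.
```

The caveat in the interview maps directly onto this sketch: the scheme only works if the key bytes are genuinely unpredictable; a predictable key generator would give an eavesdropper a way back in.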

Frankly, the discussion would be helped by being able to sit down with people who are very technically experienced in the area of encryption to get a real sense of what’s possible and what’s not possible.

Explain some current regulatory issues over data that you are seeing.

As data is globalized and as the amount of data that is in private hands exponentially increases, there’s been a paradigm shift in the relationship between government intelligence collection and the private sector.

If you go back 20 years, the government was a much better intelligence collector than the private sector. Its focus was on collecting data against foreign leaders or military tactical intelligence. In the 20 years since the rise of the Internet—and also the rise of asymmetric warfare, terrorism, and transnational crime—the focus of intelligence has moved away from [targeting] the leader to the population. Who in the population risks being a terrorist? How do we collect information about uprisings or social upheavals?

At the same time, the private sector now vacuums up a huge amount of that data—more than the government could or would want to. That’s created an issue where increasingly the government’s going to look in the private sector and say, “Okay, we want to have what you have.”

By necessity, essentially?

Exactly. Here’s a subpoena or here’s a warrant. Even if you assume the government’s being absolutely faithful to the law and they are getting everything properly authorized, it still creates two dilemmas.

One is what do you do when the data you want is in another part of the world and the country where the servers are has a different view of how you deal with the data? How do you avoid putting your companies in the middle? What do you do when you have companies whose business model is, “We don’t want to collect data; we simply want to enable individuals to transfer data among themselves or hold their own data”?

Are you referring to a cloud provider?

Cloud provider. The one that does not want to actually have access to the data itself like an Apple. They are basically saying: “We’re not going to use your data for marketing purposes or whatever. What we want to do is encrypt it in such a way that the sender and recipient will be able to read it but we’re going to step out of this. We’re simply going to facilitate the storage and transfer, but we’re not going to have any visibility into the data. We’re not going to monetize it. We’re not going to use it. If the government wants your data let them go to you.”

Both of those are reflections of the dramatic change in both what intelligence and law enforcement wants and how much of that is in private hands. The private sector is struggling with, “We want to obey the law, but which law? To what extent are we obliged to alter our business model to conform to what makes it easier for law enforcement or intelligence?”

Is this why we are seeing data-related legal cases around the world blowing up in the public eye?

You’ve seen that a Facebook executive gets arrested in Brazil. Brazil says: “We want the data and we don’t care if it’s in the U.S. We don’t care about U.S. law.” You have Microsoft versus the U.S. in Ireland [over whether the U.S. government can search emails stored abroad].

Now you have the Apple issue, which involves not a question of who controls the data, but whether you can make the company actually open up access to the data when their entire structure has been not to do that.

What is the worst-case scenario we could see if there’s a potential data war between countries over who owns the data?

The worst-case scenario is everybody says, “All the data related to my citizens has to be in my country.” So instead of having servers located where the engineering and the climate are most conducive, you wind up having servers for every country within the borders of that country. Then you wind up without a real Internet; you have many internets.

We already see some of that happening with reports like Google putting some servers in Russia.

Right. Some of these companies are trying to mitigate that by, for example, if they encrypt the data they keep the key in the United States. Sometimes you can do a little bit of that because you reduce the latency of the servers to your customer, but there’s no question it can be the thin edge of the wedge until you’re basically fragmenting the Internet. That would, again, really reduce a lot of the value the Internet has brought.

