British Spies Tried to End Tech’s Encryption Debate. But Their ‘Ghost Proposal’ Only Rekindled It

Late last year, the British counterpart to the U.S. National Security Agency—Government Communications Headquarters, or GCHQ—made an attempt to break the logjam that is the encryption debate. That attempt has just crashed and burned.

In an open letter published Thursday, Apple, Microsoft, Google, and Facebook's WhatsApp, along with a host of cryptography experts and human rights groups, rejected the agency's proposal to modify services such as WhatsApp so that spies could be secretly added to people's conversations and listen in.

Why the rejection? The usual reasons: the proposal could weaken the security of people’s communications; it would certainly violate users’ trust; and it’s ripe for abuse. Ironically, these are all pitfalls the proposal’s authors said they were trying to avoid.

Good principles

In their November article on the Lawfare national security website, GCHQ cybersecurity technical director Ian Levy and codebreaking chief Crispin Robinson set out six principles for solving the quarter-century-old “Crypto Wars” conundrum of how to let law enforcement and intelligence agencies investigate miscreants without making regular phone and Internet users more vulnerable to other miscreants, and without trampling all over people’s rights.

Those principles included:

  • Maintaining privacy and security protections
  • Allowing investigative methods to evolve with technology
  • Accepting that proportionate, human-rights-respecting spying solutions will never be 100% effective
  • Ensuring that governments can’t use methods meant for targeted investigations to just read everyone’s data
  • Maintaining people’s trust in the communications services they use, by “not asking the provider to do something fundamentally different to things they already do to run their business”
  • Making sure the spying mechanism is transparent and open to scrutiny by experts

“We welcome Levy and Robinson’s invitation for an open discussion, and we support the six principles outlined in the piece,” Thursday’s open letter read. “However, we write to express our shared concerns that this particular proposal poses serious threats to cybersecurity and fundamental human rights including privacy and free expression.”

Ghost in the machine

Here’s how Levy and Robinson describe their big idea, which others have since dubbed the “ghost proposal”:

It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved—they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication.
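
To make that concrete, here is a minimal sketch, in Python with the PyNaCl library, of the kind of server-mediated fan-out the proposal relies on. The model and the names (`directory_lookup`, the disguised "second device") are hypothetical simplifications, not how WhatsApp or iMessage actually work; the point is only that a client that trusts the provider's roster will happily encrypt to one extra key.

```python
# Minimal sketch of server-mediated group messaging, assuming a simplified
# fan-out model (requires: pip install pynacl). Everything here is
# illustrative; real messengers use far more elaborate protocols.
from nacl.public import PrivateKey, SealedBox

bob = PrivateKey.generate()
ghost = PrivateKey.generate()  # the eavesdropper's keypair

def directory_lookup(inject_ghost: bool = False) -> dict:
    """The provider runs the identity directory; clients can't verify it."""
    roster = {"bob": bob.public_key}
    if inject_ghost:
        # The provider quietly appends the eavesdropper's key,
        # e.g. disguised as one of Bob's devices.
        roster["bob-second-device"] = ghost.public_key
    return roster

def send(plaintext: bytes, roster: dict) -> dict:
    # One ciphertext per recipient key. The sender has no way to tell
    # a legitimate device from a "ghost" -- the roster is simply trusted.
    return {name: SealedBox(key).encrypt(plaintext)
            for name, key in roster.items()}

ciphertexts = send(b"meet at noon", directory_lookup(inject_ghost=True))
print(SealedBox(bob).decrypt(ciphertexts["bob"]))      # Bob reads it...
print(SealedBox(ghost).decrypt(ciphertexts["bob-second-device"]))  # ...and so does the ghost.
```

Notice that the encryption itself never breaks, which is exactly the proposal's pitch, and exactly why the letter's signatories focus on the identity system rather than the cryptography.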

As the public letter pointed out, this violates Levy and Robinson’s own trust principle. “The moment users find out that a software update to their formerly secure end-to-end encrypted messaging application can now allow secret participants to surveil their conversations, they will lose trust in that service,” it read.

That’s not the only problem with the “ghost proposal.”

The idea would force companies like Facebook and Apple to fundamentally change how their services authenticate the identities of people in a conversation, because secure messengers are built to notify participants whenever someone joins; letting a ghost in means suppressing those notifications. That makes the apps' security mechanisms more complicated, and the more complicated code becomes, the higher the risk that it contains flaws that undermine security.
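
For a sense of where that new complexity would live, here is a hypothetical sketch of the client-side check a ghost would have to slip past. The function names are invented, and real messengers implement this with cryptographic "safety numbers" rather than a plain set comparison, but the shape of the problem is the same: the proposal requires a new branch in exactly the code path that exists to warn users.

```python
# Hypothetical sketch of a client-side membership check. Secure messengers
# warn users when a conversation's roster changes; the ghost proposal would,
# in effect, require a branch that suppresses that warning on demand.

def alert_user(message: str) -> None:
    print(f"[WARNING] {message}")

def check_roster(known: set, server_roster: set,
                 hidden: frozenset = frozenset()) -> set:
    """Compare the locally known member list against the server's view."""
    added = server_roster - known
    removed = known - server_roster
    # The proposal's effect: skip members the provider flags as hidden.
    # Every branch like this is new attack surface in security-critical code.
    visible = added - hidden
    if visible or removed:
        alert_user(f"membership changed: +{visible} -{removed}")
    return server_roster

members = check_roster({"alice", "bob"},
                       {"alice", "bob", "ghost"},
                       hidden=frozenset({"ghost"}))
# No warning is printed, yet "ghost" now receives every message.
```

Whether the suppression lives in the client or the server, someone has to ship and maintain that branch, which is the open letter's complexity argument in miniature.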

Then there’s the risk of abuse, either by people working in law enforcement or intelligence agencies, or by repressive regimes that want to use the new mechanisms to spy on and suppress dissidents.

And on top of all that, there’s the problem of this being proposed by British spies. The U.K.’s Investigatory Powers Act means communications providers can be banned from even disclosing that they’ve been asked to change their systems. So much for the transparency principle.

Going in circles

Again, none of these problems are new.

Look back to the 1990s, when the Clinton administration tried to force the installation of the Clipper chip, a key-escrow "spy chip," into phones; the scheme's technical basis turned out to be insecure.

Just a few years ago, when Apple and the FBI were facing off over a dead terrorist's encrypted iPhone, Apple's core argument was that changing its software to meet investigators' needs would make hundreds of millions of users less secure. (CEO Tim Cook memorably described what the agency wanted as the "software equivalent of cancer.")

And, as the public letter’s authors noted, when President Barack Obama assembled a task force to consider options for spying on encrypted conversations, the group rejected mandating software changes because of the trust issue. Eroding trust in this way doesn’t just lead people to stop using a certain app or to watch what they say—they may also reject app updates that include patches for serious security flaws, thus making them more vulnerable to crooks.

GCHQ’s Levy responded to the criticism in the public letter by saying: “We welcome this response to our request for thoughts… The hypothetical proposal was always intended as a starting point for discussion. We will continue to engage with interested parties and look forward to having an open discussion to reach the best solutions possible.”

It's great that this discussion is happening, and there's no question that both sides would dearly love to put the debate behind them. But a resolution remains nowhere in sight.