What Everybody Misunderstands About Privacy Pioneer David Chaum’s Controversial Crypto Plan

January 14, 2016, 6:30 PM UTC

David Chaum, a pioneer behind technologies that anonymize Internet users, unveiled a contentious plan last Wednesday. He outlined a proposed social media system—PrivaTegrity, he’s branded it—capable of supporting a host of features, including status update feeds, private chats, payments, and pseudonymous interactions. But that’s not all. Chaum also claimed that his system may overcome a seemingly insurmountable impasse. The technology, he told Wired‘s Andy Greenberg, could “break…this standoff called the encryption wars.”

The so-called crypto wars refer to the heated political debates that have lately resurfaced between governments and tech companies, such as Apple (AAPL) and Google (GOOG), over the use of strong, end-to-end encryption in their commercial products. Tech giants assert that this practice protects their customers’ data from spies and hackers. Law enforcement agencies decry it for impeding investigators’ ability to gather useful information during criminal and terror probes. Thus the standoff.

Cryptography and privacy enthusiasts reacted to Chaum’s statements with shock, disbelief, and—in some cases—outrage, for reasons detailed below.

Fortune spoke to Chaum amid the ensuing fracas in order to gather additional information about his proposal. He said he believes his plan was misconstrued. Asked whether he stands by his claim that PrivaTegrity may “break” or “end” the crypto wars, he responded: “I don’t want to retract it, but the sense that I meant it was misunderstood.” He continued: “Because to me, it’s more the privacy war—not the end-to-end encryption war.”

“Today, social media is absolutely insecure and is openly being manipulated and pressured by government,” he added, citing a recent summit between United States security honchos and Silicon Valley tech companies in support of the allegation. “It’s irresponsible and misleading to focus on the specific security aspects of this system without contrasting them with the only known alternative.” To wit: Facebook (FB) and its ilk, he offered as an example, in an attempt to better convey his purpose.


An Architect Re-emerges

Chaum’s suggestions are not to be taken lightly. He’s an inventor of key concepts undergirding identity-cloaking software such as the Tor web browser, a bulwark of the crypto community. Indeed, privacy advocates everywhere have Chaum to thank for helping devise the tools that allow everyday Internet users to escape surveillance and that enable political dissidents—those careful enough, at least—to express and spread their opinions online. That’s why it came as a surprise when Chaum’s peers read that he now seemed to be proposing what many view as something verboten: a “backdoor,” or an intentional vulnerability, that snoops can exploit to unscramble private user data.

This is the point at which Chaum indisputably lost the narrative. He did not mean to imply, he insists, that his system comes with an inbuilt backdoor. “It’s B.S.,” he told Fortune. “There unconditionally is no hidden weakening within PrivaTegrity. My whole career has been about reducing the likelihood of these backdoors,” he said, “and all about creating structures that aren’t subject to clandestine manipulation.” See, for instance, Chaum’s early involvement in evaluating cryptographic standards. Also, his prescient, albeit ill-fated, development of the world’s first digital money, eCash, as well as an electronic voting system designed to thwart election fraud.

So what did Chaum most recently create? Chaum’s technical paper introduces the concept on which his PrivaTegrity system rests—an idea he’s dubbed “cMix.” The development builds on a landmark routing protocol he devised in the early 1980s, which gave rise to “mix networks”—and, by extension, the foundations of online anonymity. (Mix networks obscure communications by scrambling, shuffling, and randomizing messages as they leapfrog between computer servers.) The newer version is three orders of magnitude faster, he claims, and considerably lighter weight than the old framework—so much so that it achieves smartphone compatibility and unprecedentedly low latency. No mean feat.
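The shuffling idea behind a mix network can be illustrated with a toy sketch. This is an illustration of the general concept only—not cMix itself—and the `mix_server` and `cascade` names are invented for the example; a real mix node would also strip a layer of encryption and re-randomize each ciphertext so messages can’t be matched by content:

```python
import random

def mix_server(batch, rng):
    """One mix node: shuffle the batch so that the order messages
    arrive in no longer reveals the order they leave in."""
    shuffled = list(batch)
    rng.shuffle(shuffled)
    return shuffled

def cascade(messages, num_servers=3, seed=None):
    """Pass a batch of messages through a chain of mix servers.
    An observer watching any single hop cannot link inputs to outputs."""
    rng = random.Random(seed)
    batch = list(messages)
    for _ in range(num_servers):
        batch = mix_server(batch, rng)
    return batch

msgs = ["msg-from-alice", "msg-from-bob", "msg-from-carol"]
out = cascade(msgs, num_servers=3, seed=7)
print(out)  # same messages, but in an unlinkable order
```

The anonymity comes from the batch: each message hides among all the others that traversed the cascade at the same time.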

Close readers of Chaum’s academic text will note that he and his co-authors never mention a backdoor at all. (Shrewd, given the entirely warranted knee-jerk revulsion many a technologist has at the phrase.) The terminology was rather introduced in the Wired piece as a metaphor. “I agreed to allow the term ‘backdoor’ to be used in the article to refer to access in general, not as deliberate weakening of a system,” Chaum told Fortune. “This probably was my big mistake.”

Correcting the Record

In order to appreciate the leap forward that Chaum says his system takes, one must first grasp the architecture of most present-day info and communications businesses. These companies possess doorways of their own; each has control over its own infrastructure, ergo each has the ability to rig its own keyservers (the machines that set up and manage the cryptography) to surreptitiously swap in unintended contacts, redirect messages, and spy on users.

Even companies that purport to uphold strong, end-to-end encryption can do this. It is within the technical power of a company like, say, Apple, to secretly add another device to someone’s iCloud account, and thereby insert a man-in-the-middle between a user and a correspondent on iMessage, as the cybersecurity expert Nicholas Weaver noted in an excellent post on Lawfare last year. (Apple has denied ever doing so, or ever having any intention to do so, as far back as 2013.)

How concerned you are about this possibility likely depends on your level of paranoia.

Count Chaum among the unsettled. After learning of the supposedly cozy relationship between U.S. tech companies and the National Security Agency when Edward Snowden leaked a massive trove of NSA documents in 2013, Chaum got to work formulating an alternative. In his scheme, such a company would instead have to enlist third party contractors (think data centers) operating independent infrastructure across multiple jurisdictions, all of whom would have to work in unison to have any chance at undermining the integrity of the system. This dispersion of control serves to limit, ideally, the potential for abuse. Imagine the two-man rule for nuclear missile launches, but with added complexity and greater division of power. It takes unanimous agreement and simultaneous key-turns to meddle with or unmask a user.

In practice, each server would contain a sliver of identifying information, Chaum explains. One might ask for a cell phone number, another for one’s favorite color, yet another for a password entry or other means of authentication, such as a response to a text message or email message. By cryptographically divvying up these identifying tidbits, the system becomes less risky and more trustworthy, he claims, compared to one that consolidates such information under a sole roof. “If one server were hacked, it would not constitute the kind of threat to identity that we have currently, where when you break into one of these systems, you can do real damage to someone in terms of identity theft,” he says.
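Chaum doesn’t spell out the exact sharing scheme in conversation, so as an assumption for illustration, a simple XOR-based secret split captures the principle: a datum divided among servers stays hidden from any incomplete subset, since every partial combination looks like random noise:

```python
import secrets

def split_secret(data: bytes, n: int) -> list:
    """Split `data` into n XOR shares; any n-1 shares reveal nothing."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = data
    for share in shares:
        # XOR the secret with every random share to form the final share.
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine(shares: list) -> bytes:
    """XOR all shares back together to recover the secret."""
    out = bytes(len(shares[0]))
    for share in shares:
        out = bytes(a ^ b for a, b in zip(out, share))
    return out

phone = b"+1-555-0100"              # hypothetical identifying tidbit
shares = split_secret(phone, n=10)  # one share per contracting server
assert combine(shares) == phone     # all ten together recover it
assert combine(shares[:9]) != phone # any nine see only random bytes
```

A hacked single server would leak one meaningless share, not the phone number—which is the reduced-risk property Chaum describes.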

Technically speaking, yes, the architecture of PrivaTegrity does include a point of access—a kind of doorway, depending upon whom you ask—split between 10 contracting companies across just as many countries. Each of those companies would have to work together in order to interfere with or glean actionable intelligence about the network’s users. (The seemingly arbitrary choice of 10 springs from a calculation intended to optimize the setup, according to Chaum. Coincidentally, there are also about 10 countries whose data protection laws he says are stringent enough for him to tolerate.) Now consider the alternative: Single companies and single governments bearing full say and sway. Chaum insists that his vision is “only an improvement” on the present paradigm.

“Current social media systems all have a front door through which those who operate them, possibly under the influence of government, can do whatever they want,” he says, “including inserting a man in the middle—even if clients think they’re doing end-to-end encryption.” Indeed, soldiers of the crypto wars have spent so much time fending off law enforcement’s assaults on endpoint encryption that they may have left open their flanks.


A Firestorm Erupts

Upon encountering Chaum’s proposal, many onlookers expressed skepticism about the plan. Nearly everyone balked at Wired’s mention of a backdoor, for starters. Opponents objected on the grounds that any encryption scheme that incorporates a doorway (no matter how convoluted, check-and-balanced, or theoretically secured) is necessarily susceptible to adversaries of all sorts. Take your pick: Hackers, criminals, nation states, spy agencies—anyone with the resources, patience, and know-how, really. In critics’ view, such schemes are, a priori, unworkable.

(Never mind that a majority of communications systems today possess less secure doorways of their own, Chaum counters.)

Noted cryptographer Matt Blaze, who effectively ended the first crypto war in the 1990s when he discovered a crippling flaw in a proposed encryption system that deliberately included a backdoor vulnerability (see the Clipper Chip), urged caution in assessing the claims.

Another encryption expert, Matthew Green, an assistant professor of computer science at Johns Hopkins University, praised Chaum’s past accomplishments, while echoing Blaze’s circumspection.

Several hardline commentators took their criticisms further, censuring Chaum for playing into what they deem to be the hands of law enforcement officials, simply by, apparently, engaging with the logic of that side’s premises.

Tech companies and governments have clashed bitterly over this politicized feud in recent months. In the wake of recent terror attacks, Senator Dianne Feinstein (D-Calif.) vowed to introduce legislation that would “pierce” encryption, once and for all. Apple CEO Tim Cook, meanwhile, has become one of the most vocal supporters of strong encryption. No wonder the backdoor talk set people on edge.

It’s worth noting that law enforcement officials have recently adopted a softer strategy, attempting to stroke cryptographers’ egos in something of a charm offensive. James Comey, crypto war-embattled director of the Federal Bureau of Investigation, and a number of other authorities have been calling on technologists to drum up “solutions” to what has been termed the “going dark problem”—the present-day situation in which investigators armed with warrants are sometimes handicapped, stranded in the supposed dark, unable to inspect the content of suspects’ strongly encrypted communications. They need visibility, they say.

Privacy proponents, on the other hand, shoot back that buying into this line of reasoning cedes territory that cryptographers cannot afford to yield.

Focusing on this partisan debate utterly misses the point, Chaum says. “This is essentially a separate issue from end-to-end encryption,” he said of his scheme. “There’s no way you can deny that technology to people, and this has nothing to do with stopping people from using those.”

In fact, there’s no reason why a tough encryption standard should not be adopted by default in PrivaTegrity, Chaum tells Fortune. On the phone, he kicks around another idea: Why not allow people to plug in standards of their own choosing? “Be our guest,” he says of the possibility; that way, anyone who doesn’t trust current NSA-approved encryption standards could bring their own. People who believe that established ones might be compromised may welcome the option. (Along those lines, read this Monday Note blog post by former Apple executive Jean-Louis Gassée.) “Why wouldn’t we?” Chaum says.

Besides, there is an even more important spy tactic that PrivaTegrity is designed to foil.

Confounding the Snoops

One of the true powers of Chaum’s system is its obstruction of traffic analysis. Traffic analysis is a process of observation and deduction that, when properly applied to a network, reveals who is associated with whom. From there, an analyst can proceed to map out a web of correspondents, uncover hierarchies, identify persons of interest, and if needed, target them through more intrusive means, such as hacking or wiretapping. Governments, intelligence agencies, spies, and others gather data—metadata, specifically, meaning a record of who talked to whom, when, for how long—on people through a number of means, including subpoenaing companies into divulging customers’ personal information. “Once you find out that,” Chaum says, referring to metadata, “all these issues involving backdoors are moot.”
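A toy example shows how much a contact graph leaks from metadata alone. The records below are entirely invented, and no message content appears anywhere—yet a few lines of analysis already surface who knows whom and who the hub of the network is:

```python
from collections import defaultdict

# Hypothetical metadata records: (caller, callee, timestamp, seconds).
records = [
    ("alice", "bob",   "2016-01-02T09:00", 120),
    ("alice", "carol", "2016-01-02T09:05",  30),
    ("bob",   "carol", "2016-01-03T18:40", 300),
    ("alice", "bob",   "2016-01-04T09:01",  90),
]

contacts = defaultdict(set)   # the social graph
talk_time = defaultdict(int)  # cumulative seconds per pair
for a, b, _when, secs in records:
    contacts[a].add(b)
    contacts[b].add(a)
    talk_time[frozenset((a, b))] += secs

# Rank people by number of distinct correspondents to find a hub.
hub = max(contacts, key=lambda person: len(contacts[person]))
print(hub, sorted(contacts[hub]))
print(talk_time[frozenset(("alice", "bob"))])  # 210 seconds total
```

Scale the same few lines up to a carrier’s call records and the “moot” point Chaum makes becomes concrete: the graph, not the content, is the prize.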

Indeed, given the option between accessing metadata and viewing the content of peoples’ communications, many analysts opt for the former, since it’s less messy and can be astonishingly revealing. Consider this: Even strongly end-to-end encrypted systems, such as those employed by chat tools like Apple’s iMessage and Facebook-owned WhatsApp, generate oodles of rich metadata capable of revealing immense amounts of detail about a person’s life and contacts, as the crypto guru Nicholas Weaver also notes in that aforementioned post on Lawfare. (He describes the iPhone as “actually metadata-friendly” and says it “seems designed to support a backdoor.”) Governments can obtain that information with relative ease.

“The idea would be to give the possibility for a transparent, published policy for extraordinary access,” Chaum says of PrivaTegrity, although this hypothetical remains just that: a possibility. What those policies might look like is open to debate. He offers several examples, but insists that he does not advocate for any one particular policy over another. (A certain number of traces per year, perhaps? Or none at all?) Given the furor over the supposed backdoor in his system, he says he imagines that people will prefer severe restrictions on surveillance.

Chaum says he believes that when a citizenry knows it may be being watched—when governments can see what people are reading, who they’re speaking to—that can have a chilling effect on democracy. He leans libertarian. Privacy advocates tend to promote Tor software and its crowd-sourced router management as one of the best ways, for now, to evade this sort of detection. That said, Chaum isn’t sold on the system’s virtues. He listed off a number of recent incidents and papers that raise issues with “onion routing,” as the protocol on which Tor is based is known, a nod to its characteristic layers of encryption. (Look to citations 4, 16, 21, 23, 30, and 38 in the “references” section of his paper for some of these.)

Why rely on volunteer-run servers, Chaum says, when you can ensure that top-tier security maniacs are supervising the apparatus? That’s his stance. He says he believes PrivaTegrity could provide even more ironclad anonymity than Tor. Fortune contacted the Tor Project for comment on Chaum’s proposal. “We look forward to it being peer reviewed, coded, and tested,” Kate Krauss, spokesperson for the non-profit organization, said via email.

Who Wins the Crypto Wars?

For all the criticism that says Chaum’s crypto scheme plays into the hands of governments and law enforcement, one must not overlook a simple fact: National security hawks will probably despise the plan. The complexity and careful international distribution of power would no doubt drastically complicate many of their jobs, as Kevin Poulsen, a contributing editor at Wired, has noted.

As for tech companies? They will likely not find solace in the plan either; if anything, they’ll probably be equally unreceptive. Chaum’s entire premise rests on the idea that the modus operandi of these companies—awash in their users’ metadata, and capable of fiddling with their encryption infrastructure at any time—is far too threatening and privacy-invasive in its present form. No doubt some bystanders will regard his vision as fatally utopian, while others may deem it unnecessary. Perhaps it is something greater.

If nothing else, Chaum’s breakthrough appears to offer a marked improvement on his original “mixnet” concept, a technique he hopes will allow people to reclaim their privacy online. Right now, the technology only exists in a mostly untested alpha version, and it’s too early to draw conclusions about the scheme without seeing it in action. The system could present an intriguing alternative to Tor, the anonymizing web browser that his decades-old ideas helped bring to life.

Chaum, for one, says he believes a better system is possible than what many Internet users are accustomed to. “We’ve created an alternate space where people can go where their data, metadata, and transaction data are not subject to a government that has covert access to it,” he says. “We’ve brought in a new option to divide those facilities up. Instead of just having to trust a single company not to be coerced by a government”—or a subset of governments, for that matter—”what PrivaTegrity allows is the spreading out of those servers into different countries so that no one country”—or subset of countries, again—”can easily cause a service to reveal information.” With information so carefully and cryptographically divided, the system remains resilient, even if all but one pillar falls.

All this promise rests, of course, on the security and incorruptibility of the groups assigned to keep the topmost computer servers running. As long as just one of those ten parties refuses to collude with the rest—and as long as just one of them evades hacking—the system should work (at least on paper). That there are 10 of these organizations serves as a buffer. No system is invulnerable, though. As the paper’s authors plainly state: “To learn anything about which inputs correspond to which outputs within any such batch of messages, the entire cascade of ten mix servers, each preferably operating independently in a different country, would have to be compromised.”
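That all-or-nothing property can be sketched with plain permutations standing in for mix servers—a simplification of the paper’s claim, not its actual cryptography. The overall input-to-output mapping is the composition of every server’s private shuffle; an adversary missing even one server’s shuffle can rule nothing out:

```python
import random
from itertools import permutations

def compose(perms):
    """Compose a chain of permutations into one input->output mapping."""
    mapping = list(range(len(perms[0])))
    for p in perms:
        mapping = [p[i] for i in mapping]
    return mapping

rng = random.Random(42)
size, servers = 4, 10  # 4 message slots, 10 mix servers
perms = []
for _ in range(servers):
    p = list(range(size))
    rng.shuffle(p)
    perms.append(p)

# If all ten servers collude, users are fully linkable:
full = compose(perms)

# An adversary holding only nine of the ten shuffles: the unknown
# tenth could be any permutation, so every complete mapping is
# still consistent with what the adversary knows.
possible = {tuple(compose(perms[:-1] + [list(q)]))
            for q in permutations(range(size))}
print(len(possible))  # 24 == 4!, i.e. every mapping remains possible
```

Nine-tenths of the cascade leaks exactly nothing about who talked to whom, which is the buffer the quoted passage describes.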

This is hardly an impossible scenario, given what the world knows about top spy outfits’ capabilities. But critics must weigh the risk of that hypothetical against the present state of affairs, in which one company monopolizes a user’s personal information and does with it as it pleases, Chaum urges. There’s no doubt that Chaum’s scheme is theoretically interesting. And it would be a shame for his invention to enter the politically charged tech arena D.O.A., shot down by what may amount to friendly fire. “I think people read ‘backdoor’ and get all riled up,” Chaum tells Fortune. “I don’t think it’s the correct term to use—it’s about access.” Access to data—and to metadata, especially.

This is what Chaum means when he draws a distinction between what he calls the privacy wars versus the crypto wars. While the specter of a backdoor has raised peoples’ hackles, in Chaum’s view, that same access-splitting aspect of his system, in fact, affords far greater security and transparency than do the protocols in place today. Where exactly does Chaum’s idea fall on the spectrum of privacy preservation? Is it a great compromise? Or is it greatly compromised? The devil will, of course, be in the details: the code, the implementation, the scaling of the system. Read the paper yourself. And consider sharing your thoughts, comments, or ideas with Fortune.
