A version of this post originally appeared in the Cyber Saturday edition of Data Sheet, Fortune’s daily tech newsletter.
At this year’s Defcon hacker conference in Las Vegas, Jason Healey, an Atlantic Council fellow and Columbia University researcher, revealed what he had been able to piece together about the United States government’s process for deciding whether to release or retain strategic computer bugs useful for spying.
He began by asking the audience a question: How many of you think the number of software vulnerabilities in the National Security Agency’s stockpile is in the thousands? (Some hands raised.) How about the hundreds? (Most up.) How about dozens? (Crickets.)
The mustachioed academician proceeded to debunk what he deemed to be a misconception about the Vulnerabilities Equities Process, the covert procedure by which the government determines whether to disclose software vulnerabilities or keep them secret. Based on his research, which included interviews, public documents, and budget information, the NSA’s hoard likely amounts to dozens—just dozens—of exploitable code flaws.
“I was shocked,” Healey told the roomful of disbelieving Defcon attendees, a proudly paranoid lot. “I assumed it was in the hundreds.” (He added that he had “moderate confidence” in the assessment.)
Unsurprisingly, given the process’ confidential nature, the public knows few details. And so civil liberties groups have agitated for openness and reform: individually submit every vulnerability to review, publish regular transparency reports, disclose all vulnerabilities after a short time. Who could object?
“This is a mindbogglingly terrible idea,” wrote Matt Tait and Dave Aitel, two founders of security consultancies, in a post on the blog Lawfare this week. In a 3,300-word essay, the pair of intelligence agency alums listed reason after reason why the current system is “broken”—“it is, at some level, empty PR gamesmanship or simply poorly thought out guesswork,” they wrote—and why the reformers “now clamour to make things significantly worse.”
Their argument boils down to this: the Vulnerabilities Equities Process puts the U.S. at a disadvantage compared to its adversaries. Per the post:
Herein lies the basic problem: US cyber operations already face a greater level of scrutiny and limitations than our competitors. But single-minded reformists seek still more restrictions. At the same time, US cyber capabilities grow increasingly critical and central to the basic function of democratic interests worldwide. Without a robust investment in these capabilities, the US will lack the ability to solve the “Going Dark” issue and our intelligence efforts will start to run into quicksand around the world.
Governments must stockpile bugs, they argue. As the world’s communications adopt strong, end-to-end encryption, hacking targets directly becomes the only recourse left to intelligence gatherers and law enforcement. To do so, they need access to defects.
“This argument against the current VEP process is worth reading even if (like me) you disagree,” commented Kevin Bankston, director of the Open Tech Institute at the think tank New America, on Twitter.
Last week’s leak of an NSA-linked zero-day (previously unknown) vulnerability in Cisco (CSCO) firewall products has rekindled the debate about the government’s disclosure policies. Healey, in an opinion piece for The Christian Science Monitor, knocked the spy agency (assuming it is responsible) for not disclosing the bug earlier: “The Shadow Brokers revelations”—named after the group that dumped the cache of spook hacking tools—“give the impression of an NSA that’s out of control,” he wrote, calling for an overhaul of the vulnerability review process.
When does stashing a vulnerability make the world safer, and when does it weaken the Internet for everyone? There are no easy answers; I welcome your input.
Have a great weekend, readers. More here.