Microsoft’s Old Software Is Dangerous. Is There a Duty to Fix It?
A global ransomware epidemic is winding down, but questions over the fallout are just beginning. Who’s to blame for the crisis that hijacked hundreds of thousands of computers? And can anyone stop such criminals, whose victims included hospitals and police, from striking again?
These aren’t easy questions, but one company, Microsoft, has more explaining to do than most. After all, it was flaws in Windows systems that allowed hackers to carry out the ransomware attacks, which also struck companies and governments. In some cases, like the U.K.’s National Health Service, the frozen computers put lives at risk.
If this were a different industry, Microsoft would likely face lawsuits for selling a faulty product. But its product is software, and suing over flawed software is difficult. This means the legal case against Microsoft is feeble—even if the moral one may be strong.
Windows as “Unsafe Building Material”
When the ransomware crisis erupted last week, causing computers from England to Asia to lock up, not everyone was surprised. Weeks earlier, a hacking group known as the Shadow Brokers announced that it had stolen a set of Windows software flaws used by the National Security Agency for spying, and published them on the Internet. It was just a matter of time until someone deployed them in a crime spree—which is what happened when bad guys (North Korea is a suspect) held computer owners hostage, and demanded payments in bitcoin.
The damage, though, would have been worse if the NSA hadn’t earlier warned Microsoft that the Shadow Brokers had stolen the spy agency’s Windows hacking tools—a warning that allowed the company to create a patch for the software flaws. This immunized many Windows computers from the ransomware attacks.
But the attack still succeeded in capturing hundreds of thousands of computers, in part because not everyone installed the updates Microsoft had issued. Meanwhile, many of the victims who used older versions of Windows, including XP and Windows 8, remained totally exposed to attack because Microsoft no longer provides updates for those versions.
As the crisis unfolded, Microsoft did, though, offer a free patch for the XP machines. While this provided additional security, some criticized the company for only releasing the patch well after the epidemic was underway. Prior to that, according to the Financial Times, Microsoft had been charging up to $1,000 per device for the patch—too costly for many organizations, especially public sector ones.
All of this raises the question of whether Microsoft, which declined to comment for this story, should have done more to fix the faulty software in the first place. The company’s after-the-fact approach to safety differs from other industries, such as car companies, where manufacturers have faced massive liability for failing to warn people about faulty ignition switches and other defective products.
There’s also the fact that Windows is a closed software platform. Its internal workings—the source code—are all but invisible to those outside the company, which makes any defects hard to detect. This is why some people, like Eben Moglen, a noted computer law professor at Columbia University, consider platforms like Windows to be intrinsically dangerous.
“Proprietary software is an unsafe building material,” he explained in a published speech. “You can’t inspect it. You can’t assess its complex failure modes easily, by simply poking at the finished article. And most important of all, if you were aware of a problem that was of a safety-enhancing kind, that you could fix, you couldn’t fix it.”
“They Made a Defective Product”
So if Windows is indeed “unsafe,” why isn’t Microsoft liable in the same way as General Motors when its products fail? The best answer may be that Windows is software, and software is a special sort of product. As one legal expert told Reuters, Microsoft and others rely on licenses, which are contracts stuffed with fine print, to ensure customers can’t hold them accountable for any software flaws.
Meanwhile, a core business strategy for Microsoft and other software sellers involves persuading people to upgrade to newer software versions like Windows 10. In this context, companies may have an incentive not to fix security vulnerabilities in older software—and instead treat those vulnerabilities as an inducement to upgrade.
“One way to view it might be: they made a defective product that the current laws don’t make them liable for, and force you to buy a new product or else you are vulnerable to harm from the existing product,” a government official told the Financial Times.
Microsoft responded to that article by stating, “Security experts across the industry agree that the best protection is to be on a modern up-to-date system that incorporates the latest defense-in-depth innovations.”
Microsoft’s approach to updates, though, isn’t simply a way to avoid liability. The licensing system also reflects the fact that, unlike an automobile, software is constantly evolving and changing (think of the frequent updates to mobile phone software) and that it’s impossible for companies to guarantee the reliability of every line of code they write.
Cyber law professor Jennifer Granick of Stanford University suggests auto-industry-style liability is not appropriate for software.
“While it is true that companies need to start to prioritize security in coding, it is unreasonable to ask Microsoft to be liable for anything that can be done with the 50 million lines of code in Windows 10,” Granick told Fortune by email.
Meanwhile, software licenses aside, it’s unclear how much lawsuits over faulty software would improve computer security. In the view of many in the tech industry, more liability could be crippling. Their fear is that the fast-moving software business would come to resemble the pharmaceutical industry, where legal and regulatory concerns mean it can cost hundreds of millions to develop a single product.
As such, a series of negligence lawsuits—which helped to spur safety improvements in automobiles and consumer goods—may not be the best path to improving computer security. This is especially the case given the low level of tech literacy that prevails in many courts.
“Operating systems are complex by demand, and computer scientists do not yet know how to ensure that software is secure. Courts and juries are not going to do that job any better,” said Granick.
A Better Way?
Right now, the system for software vulnerabilities works like this: software companies fix security problems as best they can, and expect consumers to install those fixes through updates. And as one Microsoft engineer suggested on Twitter, consumers will demand up-to-date software.
This makes sense in theory but it does nothing to address the millions of Windows systems that remain unpatched, sitting like so many ticking time bombs.
So far, Microsoft’s prime response to the ransomware epidemic has been to point fingers at the U.S. and other governments that stockpile computer vulnerabilities as spying tools. In a blog post, the company’s chief legal officer, Brad Smith, also reiterated a call for a “digital Geneva Convention” to discourage the use of the Internet as a weapon.
And there are signs that some in government are receptive to Microsoft’s message. This week, U.S. lawmakers proposed a bipartisan bill, titled PATCH, meant to improve the process by which the government discloses software vulnerabilities to companies.
But such initiatives do little to help the many organizations running old Windows software that cannot afford every upgrade, and that are prime pickings for hackers. These organizations should not, of course, expect to receive new Microsoft features for free—but it is not unreasonable to think that Microsoft should help them when it comes to security.
Chemical and asbestos companies have, for instance, established trusts to clean up their past mistakes. Microsoft, which has an enormous cash hoard, could create something similar to mitigate the security dangers inherent in its old software—and (in an ideal world) inspire other software companies to do the same.
One way to do this would be to open the source code of its legacy products for others to test, along with cash incentives (known as bug bounties) for people to find and fix flaws, and to share those fixes with organizations that need them. Would Microsoft actually do such a thing? A former executive told Fortune last fall this would amount to a “hell freezes over” moment but, given the recent shifts in the company’s culture, and some unprecedented open source initiatives, it no longer seems impossible.
According to Jeremiah Grossman, an executive at security company SentinelOne, there is also a policy case for requiring Microsoft to make its old code available.
“I think Microsoft has done all that could be reasonably expected of them,” he said by email. “They gave sufficient notice to the world, and worked at it greatly over the years, yet here we remain. [It] might be time for a governmental policy change. When a vendor does end-of-life a piece of software, particularly one located in mission critical areas, they may be compelled to make the source code available.”