After SolarWinds: Untangling America’s cybersecurity mess

The SolarWinds hack exposed dozens—maybe hundreds—of U.S. companies to hackers' spying eyes. Here's what went wrong, and how business and government can fix it.
January 29, 2021, 11:00 AM UTC

In early December, the cybersecurity firm FireEye disclosed that it had been hacked, by miscreants who had taken copies of powerful tools used by the company’s security testing teams. As it investigated, the company soon found that the attack was part of something much larger. FireEye’s systems, it turned out, had been compromised using spyware implanted via the IT management and security platform Orion, a top-selling product of an Austin-based IT software firm called SolarWinds. 

Five days after the initial disclosure, FireEye alerted the public to what it had discovered, triggering a broader search that revealed a threat of staggering scale and terrifying subtlety. The compromised Orion software had not only reached thousands of SolarWinds’ corporate customers, but it had also exposed systems at the U.S. Treasury, the State Department, and the Department of Homeland Security. The attack had apparently begun more than a year earlier, and its methodical, long-simmering information-gathering approach suggested a nation-state was behind it. By early January, U.S. intelligence agencies were blaming Russia, and “SolarWinds” had become shorthand for a hacking catastrophe. (See “Timeline of a cybercrime” below.)

“It’s certainly going to be the worst cyberattack in United States history thus far, and I don’t believe people understand its magnitude,” says Tom Bossert, a former homeland security adviser for President Trump who is now president of cybersecurity startup Trinity Cyber. “It’s primarily so troubling because of its alarming scope—the scale of this is breathtaking.” 

That includes the scope of the potential damage to business. While many high-profile hacks target customer data, such as credit card numbers and addresses, the SolarWinds attackers appear to have focused on much higher-value internal information. Their objectives appear to have included penetrating email systems and accessing corporate secrets—among them, the source code underlying Microsoft’s software. Cisco, Intel, Nvidia, and Deloitte are among the other giants who say they were exposed to the Orion patch. No company has admitted to being seriously impacted yet, but IT departments are bracing for the worst: theft of assets and intellectual property, leaks of internal data and deliberations, and the vast expense of scrubbing or even rebuilding their compromised systems.

Just as frightening as the scope is how smoothly the attack circumvented government and private-sector cyberdefenses. Some see the attack not as a failure by one software vendor, but as an indictment of U.S. cybersecurity itself. “The United States created the Internet,” says Katie Moussouris, founder of Luta Security and a pioneering cyberdefense pro at Microsoft and Symantec. But in cybersecurity, “we are losing our lead.” Whatever its consequences turn out to be, the SolarWinds hack has exposed major flaws in the patchwork public-private partnership we’ve relied on to keep our information technology safe, drawing attention to just how ill-coordinated and permeable it can be.

In the traditional defense supply chain—think the makers of fighter jets or Coast Guard cutters—private contractors submit to strict oversight and rigorous standards in exchange for long-term, high-value government contracts. In cybersecurity, in contrast, a handful of midsize government agencies work with a vastly larger constellation of private software developers, cybersecurity contractors, and their customers, offering relatively few guidelines and imposing only loose oversight.

Most experts in the industry view the decentralized, market-driven structure of U.S. cybersecurity as a source of agility and innovation. But in the SolarWinds debacle, they also see the system’s weaknesses on full display. In this mega-breach, the industry’s flawed financial incentives, a lack of transparency, underinvestment in training, and old-fashioned cost-cutting each played a role. 

These failures encapsulate the challenge of fixing America’s cybersecurity structure. The encouraging news is that corporate and public-sector reformers are already responding with repairs and countermeasures; the less good news is that many of those repair efforts are in their earliest days. (For more about those efforts, see the accompanying story, “Striking back: 4 ways the Biden administration should respond to SolarWinds.”)


When FireEye went public with its SolarWinds news, no U.S. intelligence or cyber agency, not even the National Security Agency or the Pentagon’s Cyber Command, had detected the attack, even though it had likely been underway for months. That failure is troubling enough. Even more stunning is the fact that FireEye wasn’t legally obligated to inform anyone, publicly or privately, about its discovery.

A growing number of laws require firms such as retailers and banks to report hacks involving the theft of customers’ personal data. But the U.S. does not require independent research firms to share their findings about cyberthreats with government agencies, even when those findings point to a potential national security threat.

In a $170 billion industry in which many companies are beholden primarily to private-sector clients, this gap in the law can create troubling incentives. Because their business models are often based on selling exclusive warnings and defense strategies to clients, cyberthreat intelligence firms don’t have a direct financial inducement to share their findings publicly. Charles Carmakal, the FireEye investigator who led its probe of the SolarWinds attack, notes that private firms might hoard information about a novel cyberattack “to use it as a competitive differentiator.”

To be sure, many cybersecurity firms do share their findings widely, and it’s unlikely that any firm would have stayed quiet about a threat as dire as SolarWinds. But executives across the industry cite the lack of uniform expectations around disclosure as a pitfall that may let some attacks go unpublicized. Imagine if Boeing, in mid-1941, had discovered the plan to bomb Pearl Harbor, then taken time to weigh the costs and benefits of informing the Navy.

Private-sector dominance of software design contributes to another serious cybersecurity risk that the SolarWinds attackers exploited.

The malicious code at the core of the attack was delivered via a “supply-chain attack” that was fiendishly hard to detect. Once attackers compromised SolarWinds and inserted spyware in a trusted “patch,” or update, it became nearly invisible to routine cyberdefense measures.

This vulnerability is especially dangerous because, partly for the sake of cost savings, most software reuses a variety of third-party components, such as open-source code. One compromised component gives attackers a foothold to access or corrupt other systems, potentially acting as a backdoor master key to any software built on top of it. Yet there is no widely accepted standard for tracking or reporting what components make up a piece of software, much less any regulation requiring such tracking. In the SolarWinds attack, that opacity may well have slowed the response, as thousands of private and government customers scrambled to gauge the threat posed by the corrupted Orion patch.
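
The mechanics here are worth spelling out. Customers routinely verify downloads against a vendor-published checksum, and a toy version of that check (a Python sketch with invented patch bytes, not actual SolarWinds code) shows both what it catches and what it cannot: tampering after publication fails the check, but malware inserted into the vendor’s own build, as with the corrupted Orion patch, ships with a matching digest and a valid signature, so the check passes.

```python
import hashlib

def verify_update(payload: bytes, published_sha256: str) -> bool:
    """Accept an update only if it matches the vendor's published digest."""
    return hashlib.sha256(payload).hexdigest() == published_sha256

# Hypothetical patch bytes; the "published" digest is whatever the vendor computes.
clean_build = b"example-patch-v1"
poisoned_build = b"example-patch-v1" + b"<implant>"

# If the vendor's build system is clean, downstream tampering is caught:
digest_of_clean = hashlib.sha256(clean_build).hexdigest()
assert verify_update(clean_build, digest_of_clean)
assert not verify_update(poisoned_build, digest_of_clean)

# But if attackers poison the build itself, the vendor unknowingly publishes
# the digest of the poisoned file, and every customer's check passes:
digest_vendor_publishes = hashlib.sha256(poisoned_build).hexdigest()
assert verify_update(poisoned_build, digest_vendor_publishes)
```

That last assertion is the whole story of a supply-chain attack: the integrity check works exactly as designed, and it is the trusted source itself that has been subverted.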

There is growing awareness that effective cybersecurity must be built into software and networks from the ground up, including through “zero-trust” designs that authenticate every request and limit hackers’ movement between systems. But there’s little market incentive to build software that robust, and governments don’t have much leverage to require it. “You see an increasing reliance on commercial and even off-the-shelf technology in even very high-end [government] system spaces,” says Eric Wenger, a technology policy director with Cisco Systems. That’s why the SolarWinds attack hit public agencies and private companies alike: Using the same software exposes them to the same attacks.

SolarWinds can boast of an achievement that few of its rivals can match: Between 2010 and 2019, its profit margins tripled. The company was able to keep widening those margins, in part, by outsourcing work on its Orion product to less expensive software engineers in Eastern Europe. What seemed like a savvy business move may turn out to be the company’s undoing: As of this writing, investigators were pursuing the possibility that hackers compromised SolarWinds through its Eastern European operations. But in outsourcing its engineering, SolarWinds was reacting rationally to market forces (not to mention following the lead of many of its software peers).

The underlying problem is that the U.S. has a severe shortage of cybersecurity talent. A recent survey by the labor analytics firm Emsi found that the U.S. has less than half the qualified cyber professionals it needs, across the public and private sectors—a gap that has left hundreds of thousands of jobs unfilled. That scarcity makes existing talent more expensive: The median cybersecurity salary in 2019 was $99,730, according to the Bureau of Labor Statistics, well above the median for computer occupations overall. And that extra cost gives firms in the industry another incentive to take production offshore.

In a sense, the problem starts in our universities, and some reformers think the solutions should begin there too. A 2016 review found that of the top 10 computer science programs in the U.S., none required cybersecurity coursework to graduate; three of those 10 didn’t have a cybersecurity program at all. Some companies have been stepping up efforts to train entry-level cyberdefenders. Frank Cilluffo, a former adviser to the George W. Bush administration on counterterrorism and cyber issues, recommends thinking even bigger: “We need the equivalent of an educational moonshot around cyber issues, which will require federal funding,” he says. 

New cohorts of cybersecurity graduates won’t help much, though, if they’re working within a dysfunctional system. Restructuring that system is core to the work of the Cyberspace Solarium Commission, a task force commissioned by Congress to help reform U.S. cybersecurity. “Our focus [is] on making the market more effective at driving good behavior,” says commissioner Suzanne Spaulding, a senior adviser for cybersecurity and counterterrorism at the Center for Strategic and International Studies. “If the market isn’t performing the way it should, why isn’t it?” 

The commission spent the past year drawing up a wide-ranging list of recommendations, and in January, 26 of them became law as part of the 2021 National Defense Authorization Act. The NDAA creates a White House–level Office of the National Cyber Director and grants new private-sector threat-response powers to the federal Cybersecurity and Infrastructure Security Agency—significant changes that commission members hope will prompt closer collaboration between government and industry on security standards. “A lot of the recommendations, some of us have been making for years,” says Cilluffo, who’s also a commissioner. “But the political will was not where it needed to be. Now, we don’t need any reminders.”

Solarium’s mandate has been extended for at least another year, and its next round of advocacy and recommendations will focus more squarely on the private sector. The goal: creating better incentives for building secure software and sharing intelligence about cyberthreats.

On the engineering side, Solarium is pushing for a national cybersecurity certification authority—something like a Better Business Bureau that would grant its seal of approval to safer software. Such certification would likely involve tracking software manufacturers’ defenses against well-known attacks, which make up a growing share of breaches. Increasingly, rank-and-file cybercriminals barely code at all, and instead use prepackaged tools sold by more skilled malefactors. Standardized defenses against this sort of simple, repetitive attack, Cilluffo says, could help free up cyberdefense teams to focus on fighting more subtle, innovative, and dangerous attacks, such as those from state-backed adversaries.

Other reformers back a so-called Software Bill of Materials: a requirement that U.S. tech companies catalog the third-party modules, open-source components, and library code used to build each piece of software—a direct answer to the industry’s transparency problem. Such an inventory would make it easier to alert developers and customers when a security flaw is discovered in a shared component, just as an auto manufacturer can issue a recall if, say, a brake shoe turns out to be defective.
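
In practice, a bill of materials is just a machine-readable inventory. The sketch below (Python, with invented product and component names; real SBOMs use standard formats such as SPDX or CycloneDX) shows the recall-style lookup such an inventory enables: when an advisory flags a component, the exposed products fall out of a simple query.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    version: str
    supplier: str

# Hypothetical inventories, one published alongside each product a vendor ships:
sbom = {
    "ExampleDashboard 5.2": [
        Component("openssl", "1.1.1g", "OpenSSL Project"),
        Component("log-writer", "2.3.0", "Example Labs"),
    ],
    "ExampleAgent 1.0": [
        Component("openssl", "1.1.1g", "OpenSSL Project"),
    ],
}

def products_containing(sbom: dict, flagged: Component) -> list:
    """Return every product whose bill of materials lists the flagged component."""
    return [product for product, parts in sbom.items() if flagged in parts]

# A security advisory flags log-writer 2.3.0; the lookup names the exposed product:
affected = products_containing(sbom, Component("log-writer", "2.3.0", "Example Labs"))
# affected == ["ExampleDashboard 5.2"]
```

Without such inventories, the only way to answer “am I exposed?” is the kind of scramble thousands of Orion customers went through by hand.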

The more urgent advocacy will be around the sharing of information about cyberassaults. Spaulding says that new rules could include enhanced threat reporting requirements for “critical infrastructure companies”—a category that might include private cybersecurity firms. 

Proposals like these elicit understandable wariness from industry. Complying with a Software Bill of Materials, for instance, could be cost- and labor-intensive for companies, and could even create new risks by providing detailed information to attackers. Intelligence-sharing mandates, too, face opposition from many cybersecurity firms, which see them as eroding their competitive advantages. Cisco’s Wenger worries, for example, that such rules could deter investment in cybersecurity, by reducing how much private firms can benefit financially from their research. 

The question ultimately may be whether the cost of cooperation is greater than the cost of the next SolarWinds—or the one after that. 

Timeline of a cybercrime

Security experts rank the so-called SolarWinds hack among the two or three most serious cyber espionage intrusions in U.S. history—and it may be the one that reaches deepest into corporate America. Here’s what we know so far about how it unfolded.

Sept. 4, 2019

SolarWinds, an Austin-based software company, is compromised. Engineering done for SolarWinds by subcontractors in Eastern Europe is one possible source of the breach. 

The hackers implant malware into an update for Orion, an IT and cybersecurity management dashboard product made by SolarWinds. This malware, dubbed “Sunburst,” is a so-called backdoor that allows hackers to monitor and potentially further infiltrate networks it is installed on.

March–June 2020

SolarWinds believes that 18,000 Orion customers downloaded a software-update patch that included the Sunburst backdoor during this period. Because this download is part of a trusted process, there is nothing to alert even vigilant security teams that anything is amiss.

Dec. 11

The hackers, using stolen credentials, attempt to register a new device for multifactor authentication. This action tips off the cybersecurity company FireEye to the intrusion; two days later, it publicly reports the hack.

Dec. 19 

FireEye reports that more than 1,000 infiltrated systems have been pinging a command-and-control server operated by the hackers, giving them the option to penetrate further by stealing internal users’ credentials and using them to reach additional systems.

Microsoft says about 40 of its own customers were attacked. An Amazon internal report later pegs the total of impacted firms and agencies at more than 250. Those reports have not named the victims, but Microsoft reported that 44% of the escalation targets it detected were in the IT sector, 18% were in government, 19% were NGOs or think tanks, and 9% were government contractors. The follow-on attacks appear to have mainly targeted email systems; their full extent is unknown.

Dec. 31

Microsoft announces that the SolarWinds hackers accessed the internal source code that underlies some of its software. To date, there’s no indication anything was tampered with or that customer data was stolen.

Jan. 5, 2021 

A group of U.S. intelligence agencies issues a joint statement saying that “an Advanced Persistent Threat (APT) actor, likely Russian in origin,” is responsible for the attack. Some experts suspect the SVR, the Russian foreign intelligence service associated with the Cozy Bear hacking team. Russia denies involvement.

Jan. 10

The Cybersecurity and Infrastructure Security Agency says “fewer than 10” federal agencies were impacted by the compromised Orion patch. Affected agencies reportedly include the Department of Homeland Security, the Treasury Department, and the State Department. Microsoft email accounts at the Department of Justice are compromised, though the DOJ says there’s no indication that classified systems were breached. 

Jan. 12 

Email security firm Mimecast reports that it, too, has been compromised via Orion. The company says about 4,000 of its customers were “potentially affected,” but that only a “low single-digit” number were actually targeted.

This story appears in the February/March 2021 issue of Fortune.