Don’t Blame NSA for WannaCry
When giving talks on cybersecurity, I often get asked what keeps me up at night. My short and glib answer is my four-year-old (he really is a horrible sleeper). I certainly don’t sit up at night worrying about a cyber-attack on the power grid or the manipulation of the stock market by cyber criminals.
In fact, nothing I ever saw in classified channels about a cyber threat cost me a wink of sleep. Other intelligence did, though: reporting on planned terrorist attacks, nuclear proliferation, and other horrors managed by other directorates. During the year I spent working on counterterrorism at DHS before I went to work on cybersecurity at the White House, I spent many nights wondering if we had made the right decisions to counter some very dangerous threats.
So when it comes to WannaCry, I don’t discount the possibility that the closure of hospital ERs and the rescheduling of operations may have cost lives. Many pundits in the field seem to agree with Edward Snowden, who told the Guardian that the NSA should have disclosed the vulnerability exploited by the malware when they found it, not when they lost it. Yet, even Snowden hedges on whether disclosure would have prevented the attack. If the NSA had disclosed the vulnerability earlier, the attack “may not have happened” (emphasis added).
Snowden hedges because no amount of warning would have been enough to get Windows XP out of hospitals, or to get hospitals to install the latest patches in a timely manner. Had NSA disclosed the vulnerability years ago, it would likely still be exploitable today.
But I am also attuned to the reality that the intelligence collected by NSA through exploiting this vulnerability likely saved lives, possibly many.
Contrary to prevailing sentiments in the privacy community, NSA does not exploit vulnerabilities for its own amusement. I don’t know what intelligence NSA collected using this exploit kit. What I do know is that it is difficult to overstate the importance of signals intelligence to our national security.
That vulnerability may have been exploited to gather intelligence vital to negotiating the Iranian nuclear deal, slowing North Korea’s program, or yes, stopping a terrorist attack.
NSA deserves blame for losing the exploit kit, not for developing it in the first place. I am deeply disturbed that seven years after the Manning leaks, and four years after the Snowden leaks, we still don’t have good protections against insider threats within the defense and intelligence community.
But NSA is a spy agency. More specifically, it is a signals intelligence agency. In the 21st century, that means it will, for certain missions, need to develop and exploit zero day vulnerabilities and not release them to the public.
Contrary to what Microsoft President Brad Smith has written, this incident doesn’t show the dangers of stockpiling vulnerabilities. There is no evidence that NSA was hoarding hundreds or thousands of vulnerabilities it was not using (stockpiling). Instead, the incident shows the agency was actively exploiting a small number of very useful vulnerabilities.
Smith is right that this incident is comparable to the U.S. military having some of its Tomahawk missiles stolen. But to continue the analogy, his solution amounts to saying that the theft of a Tomahawk missile should lead the U.S. government to remove the weapons from its arsenal, rather than tighten security controls around them.
We can blame NSA for poor operational security (though we should applaud them for getting information to Microsoft so that a patch could be issued two months before the attack).
We can blame the criminals behind WannaCry for targeting hospitals. And we can blame hospital administrators for wanting the benefits brought by the IT revolution without taking on the costs of securing and updating those systems.
But we can’t blame the NSA for spying. That’s what they do.