As I continue to do my part to evangelize the NIST Cybersecurity Framework, the most common criticism I hear is that it is nothing more than one big, long checklist. My initial response was to parrot back what the Framework says on this point: “The Framework is not a checklist to perform. It presents key cybersecurity outcomes identified by industry as helpful in managing cybersecurity risk.” My new answer is to say it’s not a checklist, but your organization should turn it into one.
Nobody in the cybersecurity community likes this answer. Their reasoning is that we face an adaptive enemy, one that changes tactics and can’t be contained by a set of measures on a clipboard (or iPad app, or GRC application). They point out that companies compliant with security requirements are breached all the time. Therefore, the argument goes, requirements are useless.
In other fields where an operator faces an adaptive adversary—say fighter pilots or surgeons facing possibly the most adaptive of all adversaries, cancer—checklists have been shown to be crucial to both safe flights and better patient outcomes. When things go wrong in these fields, instead of throwing out the checklists, operators work to understand why the failure occurred. They then update or create new checklists to prevent the failure from happening again.
That doesn’t sound like much fun to today’s cybersecurity industry. “Cyber Warriors” don’t want to think of themselves as box checkers or to believe that what they do can be broken down into a series of repeatable steps that someone else could do. Yet it is probably our best hope for managing cyber risk.
In The Checklist Manifesto, Atul Gawande walks the reader through the painful process of getting omnipotent surgeons (like himself) to recognize that the volume of knowledge they needed to perform operations and the complexity of those operations were not something they could manage in their heads—no matter how good they were. This process is something the aviation industry went through long ago in order to get aircraft operations to the point at which only one in 500,000 flights experiences any form of mechanical failure.
Over time, and through a lot of revisions and introspection after things went wrong, the use of checklists has caught on in both fields. Planes rarely fall out of the sky. Over the last decade, no wonder drug or device has been shown to reduce both morbidity and mortality as much as the adoption of checklists in hospitals. A World Health Organization study found that use of a pre-surgery checklist reduced the rate of deaths and surgical complications by more than one-third.
The process of moving cybersecurity from the domain of the cyber warrior to the cyber box checker will be slow and painful, but it will happen (threats to aircraft from hackers will probably speed up this process at Boeing—the company that taught Gawande much of what he knows about checklists). Given that all the systems we need to protect in cybersecurity already run on 1s and 0s (unlike the human body or older aircraft), cybersecurity has the inborn advantage that the checking of these checklists can be automated and run continuously. The Department of Homeland Security, through its Continuous Diagnostics and Mitigation (CDM) program, has pioneered much of this work.
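To make the automation point concrete, here is a minimal sketch of what a machine-checkable security checklist might look like. This is illustrative only—the control names, state fields, and thresholds are hypothetical, not drawn from CDM or any real tooling:

```python
# A toy automated checklist: each Check pairs a control with a predicate
# over observed system state, so the whole list can be re-evaluated on a
# schedule rather than worked through by hand on a clipboard.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Check:
    control: str                          # human-readable control name
    passed: Callable[[Dict], bool]        # predicate over observed state


def run_checklist(state: Dict, checks: List[Check]) -> List[str]:
    """Return the names of controls that fail, for operators to investigate."""
    return [c.control for c in checks if not c.passed(state)]


# Hypothetical controls and thresholds, purely for illustration.
checks = [
    Check("Patches current", lambda s: s["days_since_patch"] <= 30),
    Check("Disk encryption on", lambda s: s["disk_encrypted"]),
    Check("Default creds removed", lambda s: not s["default_password"]),
]

state = {"days_since_patch": 45, "disk_encrypted": True, "default_password": False}
print(run_checklist(state, checks))  # → ['Patches current']
```

The point is not this particular code but the shape of it: once the checklist is data rather than paper, running it continuously is trivial, and a failure becomes a logged event instead of a skipped box.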
Operators hate checklists because they make what they are doing boring. From a policy perspective, that is exactly what we want to happen. Bringing in Total Quality Management or Six Sigma to cybersecurity is exactly how we will get this problem off the front page and out of the boardroom.