Schrödinger’s Hacking Law and Cyber Burnout: Capacity Building in U.S. Cybersecurity
from Net Politics, Digital and Cyberspace Policy Program, and Renewing America

Recruiting problems in cybersecurity will continue until private and public sector organizations make defenders' mental health a priority and policymakers address the poorly written Computer Fraud and Abuse Act. 
Andrew Beard and Barrett Darnell compete in a contest during the Def Con hacker convention in Las Vegas, Nevada, on July 29, 2017. Steve Marcus/Reuters

In 2021, more than 2.7 million cybersecurity jobs went unfilled. The dearth of cybersecurity experts in both government and private industry has been described as a national security threat, and closing that gap as an imperative. There are two reasons for this severe shortage of people in cybersecurity: a bad law and a lack of mental health support.

First, the bad law, which makes it arguably illegal to learn to be a computer security expert, has a villain’s backstory. In 1986, thanks to policymakers who were overly terrified by WarGames, a fictional 1983 movie starring Matthew Broderick (to be fair, that film, along with 1992’s Sneakers and 1995’s Hackers, is beloved in the cybersecurity community), the United States got stuck with a truly terrible law called the Computer Fraud and Abuse Act (CFAA). Every day since, every person recruited by the U.S. government to serve as a cyber warrior has had no idea whether they are a de facto multiple felon. There is no real way to determine whether a CFAA violation has happened, or will, when you practice on almost any computer using almost any technology, because interpretation of the law is left to the individual understanding of whichever local prosecutor takes the case, and local criminal prosecutors do not, in my sadly more-than-typical experience with CFAA prosecutions, have a great deal of understanding of the finer points of computer network access.

This lack of prosecutorial technical knowledge makes the CFAA uniquely problematic. Most prosecutors and juries can intuitively understand crimes like assault, drug offenses, and theft, but when prosecutors do not understand the technology itself, prosecutorial discretion in tech crimes means that charging decisions under the CFAA often come down to emotion and politics. The CFAA, the lack of technical knowledge among prosecutors, and the wide discretion the law gives them together make learning offensive cyber techniques a kind of Schrödinger’s felony.

If policymakers had reacted to watching Jaws by banning surfing and leaving enforcement up to prosecutors who had never learned to swim, you’d have the marine equivalent of the Computer Fraud and Abuse Act. Eventually, no one would be able to cope with oceanic threats other than those who had been willing to break the law to brave the waves. Then imagine that the United States had a severe shortage of Coast Guard applicants who could already swim, fish, survive hurricanes, and perform deep sea rescues, and was totally bewildered as to why this shortage existed.

Most of the time, the best cybercrime attorneys in the United States cannot tell you whether you have broken, or would break, the law by doing something as simple as running an Nmap scan, which is the cyber equivalent of walking down a neighborhood street, noticing from the sidewalk that a neighbor’s door is gaping open even though no one appears to be home, and texting your neighbor to say they might have a security issue. This inability to tell whether a law is being broken is not only unconstitutional under the vagueness doctrine; it is also one of the ways a profound lack of diversity in cybersecurity manifests, further stunting the development of cybersecurity talent. The consequences for potentially breaking the CFAA are clearly, tragically far worse if you are a person of color; career failures and missteps are punished far more harshly in women; and it makes sense to steer clear of learning offensive techniques if you are likely to face outsized consequences for doing so.
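To make concrete just how mundane the act in question is, here is a minimal sketch in Python, using only the standard library, of the single "knock on the door" that a scanner like Nmap automates across thousands of ports. The address below is a placeholder from a reserved documentation range, not a real target; substitute only a system you own or have explicit written permission to test, because whether anything beyond that is lawful is precisely the question the CFAA leaves open.

    # Minimal port-check sketch: asks whether a TCP service answers on a few
    # common ports. This is the basic operation a scanner like Nmap automates.
    import socket

    # Placeholder address from the TEST-NET-2 documentation range; substitute
    # only a system you own or are authorized to test.
    HOST = "198.51.100.10"

    for port in (22, 80, 443):
        try:
            # Try to open an ordinary TCP connection, the digital knock on the door.
            with socket.create_connection((HOST, port), timeout=2):
                print(f"port {port}: open (something answered)")
        except OSError:
            # Connection refused, timed out, or filtered: nobody answered the knock.
            print(f"port {port}: closed or filtered")

Nothing in that snippet bypasses a password or exploits a flaw; it simply asks whether a service answers, which is why reasonable people, including attorneys, struggle to say where authorized curiosity ends and a felony begins.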

You can trip and fall accidentally into a CFAA violation, or your good intentions can be misinterpreted or politicized, and that makes many people who want to go into government service, or any other organization, incredibly nervous about learning any offensive cyber techniques. Imagine if, in every county of the United States, the rules on whether you could hold and use a knife, a firearm, or a pair of combat boots were not only different but entirely up to the local prosecutor’s judgment. How many Army recruits do you think would show up to basic training having never touched the simple tools of the military, for fear they had already run afoul of some regulation? I still regularly hear from ordinary information technology professionals who expand their skills into even defensive cyber techniques that they fear their companies will view them as untrustworthy or criminal. Our ability to confidently train this talent is in an absolutely parlous state.

The government hiring process for national security clearances is especially hamstrung by the CFAA and its consequences. I recently read an SF-86, the background check form for all security clearance applications (while idly speculating that I might one day want to be cleared for government service), and it asks whether you have illegally accessed computers you don’t own. The answer, for me and for almost any good-faith researcher who has done any exploration of online networks, is almost certainly “well, what do you want me to say?” Under Schrödinger’s Hacking Law, someone could say “I don’t know,” “Yes,” or “Of course not,” and each answer could be true. Sadly, answering this question truthfully could mean being denied the clearance to serve our country, and the lengthy clearance process (often up to two years at this point), during which someone’s life is on hold, means that many of the best computer security experts don’t want to wager years of their peak earning lives on a process with no guaranteed reward. And there is little to no path to that service for anyone who used to walk a bit on the wild side, may have gotten in trouble a time or two, and genuinely wants to keep people safe now. It is a shame to give up on that talent when it is so desperately needed. I’ve said it a few times: when it comes to cybersecurity, the best shepherd is not a sheepdog but a reformed wolf.

Last year, the Department of Justice announced it would no longer prosecute good-faith hackers who report vulnerabilities (why was it ever doing so to begin with?), and many headlines misleadingly implied that the CFAA would no longer apply to them. In reality, no actual laws were changed, and the only thing truly achieved was a very nice statement of intent. Unfortunately, the Department of Justice is chock-full of prosecutors who cannot tell, or do not care about, the difference between a good-faith researcher and a computer criminal. The recent Supreme Court decision in Van Buren v. United States (a police officer searched online records to aid in criminal activity, and the Court overturned his CFAA conviction because he had not exceeded his authorized access) has slightly improved the precedents around the CFAA. But we are learning that precedent may not matter very much when judges have no personal stake in upholding it, or a very political motivation to ignore it.

Second, and vitally, we have serious problems supporting people who go into information security, in both government and private industry, when it comes to mental health and stress. A large proportion of the cybersecurity industry is dedicated to something called Digital Forensics and Incident Response (DFIR). A lot of what these folks (we call them the blue team) do is find, examine, and provide evidence in cases of revenge porn, child sexual abuse material, ransomware, blackmail, theft, cyber harassment and stalking, and physical assault and violence. My industry is filled with thousand-yard stares from people who should be cycling in and out of their roles every two years, the way the military and police do with high-violence and high-abuse assignments such as bomb disposal and sexual assault cases.

Instead, infosec professionals have been known to self-medicate with booze, memes, and conference parties that make 1970s rock star hotel bashes look like knitting circles. My industry is despairing, seeing the worst of humanity, and most people won’t listen to the simplest request the security team makes if it costs any productivity time or cuts into the sales kickoff. Cybersecurity professionals are exhausted and burnt out, and their warnings are often sidelined or deliberately buried to protect stock prices or the boss. In cyberwar there is no concept of evacuating civilians before the military moves in: the civilians cannot leave the battleground because the civilians, in all the ways that matter, are the battleground. Until that problem is addressed, the national security process will not mix well with cybersecurity professionals.

Mental illness, burnout, and what can only be called battle fatigue run so rampant in my industry that there is an accepted, if not always visible, cycle of transitioning in and out of companies and taking time off for mental health care. One in six chief information security officers self-reports drinking to manage stress. I’m pretty sure a few of the others are either lying or in recovery. For those of us who don’t self-medicate that way, our industry is big on hobbies like helicopter skiing, motorcycle racing, skydiving, stunt piloting, hunting, and target shooting: mentally and physically absorbing activities that kill you if you do them wrong. I’m a pilot and a motorcyclist, because I absolutely cannot think about anything other than flying the plane or the next curve on my bike or I’ll die. I find it quite relaxing.

This mental health crisis, and the not-always-healthy responses it gives rise to, are especially consequential for government hiring. The Bond Amendment makes it explicitly illegal to grant the most sensitive security clearances to anyone who unlawfully uses any kind of drug, including cannabis. Additionally, security clearance investigators often look for signs of alcohol abuse, and problematic alcohol use can disqualify you from the process. Many people in cybersecurity use cannabis, either with a doctor’s prescription or simply as a mild intoxicant or anxiety aid, and the idea of making sure, per regulations, that you are not even in the room when someone is using cannabis strikes many of them as burdensome and pointless. Former FBI director James Comey lamented almost a decade ago that it was difficult to hire cyber talent because young people would simply choose not to apply if they were required to avoid cannabis. Those young people have not grown magically intolerant of the devil’s lettuce in the intervening decade. Instead, they have grown ten years older and increasingly impatient with a seemingly pointless ban on a substance far less harmful than legal cigarettes or alcohol, one that is completely legal in twenty-three states and available with a prescription in fourteen more.

I recall a great conversation with someone who hires in national security cyber, in which it was pointed out to me that being able to follow the rules on cannabis and clearances is a feature, not a bug, in hiring. However, the kind of cybersecurity pro who can think enough like a criminal to protect key government assets often comes with a less-than-rule-bound personality. Unfortunately, the U.S. government is pre-sorting its hires for people who have chosen their whole lives to follow the rules, which by inclination excludes a large number of cybersecurity professionals. I don’t know what the solution to that problem is, but I’m not sure that decriminalizing cannabis would do the job.

Fixing the cybersecurity pipeline requires repealing the CFAA, providing mental health care as a matter of course, and adopting a modern attitude toward self-care in a high-stress job, so that information security professionals can respond to threats to our national cybersecurity. Cybersecurity has less a pipeline problem than an intake problem, created by bad law and compounded by a lack of support systems. We have to tackle both of those problems before we keep cramming more people, ill-prepared or wholly unsupported, into a field where we expect them to take on the world’s most complex technical issues.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.