Robert M. Lee is an active-duty U.S. Air Force Cyber Warfare Operations Officer and a PhD candidate at King's College London researching cyber conflict and control system cyber security. His views and opinions are his alone and do not represent the U.S. Government, Department of Defense, or U.S. Air Force. You can follow him on Twitter @RobertMLee.
The sharing of cyber threat data has garnered national-level attention, and improved information sharing has been the objective of several pieces of legislation and two executive orders. Threat sharing is an important tool that might help tilt the field away from adversaries, who currently take advantage of the fact that an attack on one organization can be effective against thousands of other organizations over extended periods of time. In the absence of information sharing, critical infrastructure operators find themselves fighting off adversaries individually instead of drawing on the knowledge and experience that already exists in their community. Better threat information sharing is an important goal, but two barriers, one cultural and the other technical, continue to plague well-intentioned policy efforts. Failing to meaningfully address both barriers can lead to unnecessary hype and the misappropriation of resources.
The first barrier is the tight-lipped culture that hinders information sharing within the U.S. critical infrastructure community. Asset owners and operators of the type of critical infrastructure often highlighted in the news, such as the energy and water sectors, live in a culture where reporting incidents seems only to bring trouble. Voluntarily reporting a cyber incident can bring legal repercussions. Likewise, many of the organizations that own, operate, and maintain the United States' critical infrastructure are publicly traded companies. Even with legal immunity, reporting a cyber incident can cause financial losses when stockholder and investor confidence is shaken. Moreover, reporting incidents can initiate follow-on requests from the government that take valuable time and limited resources to satisfy.
It is not uncommon for the critical infrastructure community to look unfavorably on bringing in outside help, preferring instead to tackle issues on its own. Speaking out publicly can also be difficult when the news media is quick to highlight stories of cyberattacks on infrastructure and chastise the insecurity of the systems running it. Stories about cyberattacks against the power grid, oil pipelines, and hydroelectric dams generate immense attention, especially when nation-state actors such as Russia, China, or Iran can seemingly be linked. It makes for good media, it makes for good readership, and it makes for a good story. Unfortunately, this hype has lasting impacts, diverting attention to perceived threats instead of the real issues.
The difficulty in debunking the hype stems from the second barrier to information sharing: technical shortfalls in critical infrastructure systems that limit the availability of meaningful data. Proper cyber threat sharing requires meaningful technical data that is often difficult to obtain in critical infrastructure. Industrial control systems use information to affect the physical world. These control systems generate and deliver power for the grid, control the flow and purification of water, and operate nuclear reactors. They were built to last for decades, to operate in harsh environments, and to be efficient at their designated tasks. Information security detracted from those missions, such as keeping the power on or the water running, safely and with the highest efficiency possible, and so the systems were often developed with little to no thought of how to keep attackers out. They were also built without consideration for recording and maintaining the type of data useful for threat sharing.
After a cyber incident occurs, incident responders are called in to collect data, contain the incident, and extract lessons learned. This data can help identify the same adversaries or malicious software in other organizations as well. These indicators of compromise are central to threat information sharing. However, incident response for cyber incidents in control systems is a young field. The data is simply not present in most cases, allowing observers to generate wild theories instead of relying on facts. For example, Bloomberg published an article on the 2008 Baku-Tbilisi-Ceyhan pipeline explosion in Turkey. When the event occurred, the Kurdistan Workers' Party, an internationally recognized terrorist organization, claimed credit for the attack. The Bloomberg article, published seven years later, refuted this claim, stating that the explosion was instead the result of a cyberattack, with Russia as a likely culprit. The story presents a number of attack paths that the adversary allegedly leveraged to gain access and cause physical damage to the pipeline. Each of the attack paths presented is plausible, making the story believable, and it has quickly been accepted as a true event.
Unfortunately, an understanding of the attack paths presented, combined with technical knowledge of the type of systems impacted, reveals a different story. The incident response data that would have been required to validate the story largely does not exist. Instead, the story relies on anonymous intelligence officials and anonymous incident responders. The report is very likely untrue, and yet it will remain an incident cited for years to come. The absence of data to confirm the story reflects the same reason that threat sharing in critical infrastructure is difficult: the data required simply does not exist.
Identifying threats to critical infrastructure will continue to be an appropriate motivation for cyber threat sharing discussions. However, facilitating information sharing through legislative change is not a silver bullet. Barriers to threat sharing are more often a mixture of cultural and technical challenges than any single root cause that can be easily overcome. Making information sharing discussions more meaningful will require incentivizing the community, from the companies that manufacture control systems to the organizations that operate them, to build and run systems capable of recording and storing the technical data needed. It will also require the proper cultural mechanisms for sharing that data without facing penalties, including those that cannot be mitigated with legal immunity alone.