Is Digital Always Better?
from Digital and Cyberspace Policy Program and Net Politics


Digital technology has been widely embraced, especially during the coronavirus pandemic. However, in some situations, it could actually make systems more prone to unpredictable accidents. Policymakers and business leaders need to weigh these risks to determine whether digital is always better.
Computer network equipment is seen in a server room in Vienna, Austria. REUTERS/Heinz-Peter Bader

Nathan Marx is a graduate student at Columbia University’s School of International and Public Affairs and a former cybersecurity researcher.

As digital technology has become increasingly central to our social and professional lives, it may seem farcical to suggest that its costs sometimes outweigh its benefits. Digitalization, the process of converting analog systems to digital ones, undoubtedly creates opportunities for immediate efficiency gains. The proliferation of sensors and ubiquitous connectivity will also provide more data about the physical world. One analysis suggests that the number of internet-connected devices will grow by at least 12 percent per year over the next decade. Moreover, videoconferencing and file sharing have allowed many businesses to keep functioning during the coronavirus pandemic, a time of unprecedented disruption. Yet, in our increasingly networked world, relatively little analysis has explored the inherent risks of digital systems. Some results from systems theory [PDF] suggest that switching to digital technology is costlier than it seems. Counterintuitively, for many applications, the smart move could be to de-digitalize systems.


In his seminal work on complex systems, Normal Accidents, Charles Perrow demonstrates how interactive complexity makes a system prone to damaging and unpredictable accidents. Systems have interactive complexity when their components can interact in nonlinear and unanticipated ways. Consider, for example, a power plant that replaces its analog systems with networked digital devices. These computers undoubtedly make operators’ jobs easier and probably make the plant run more efficiently. But even if the computers are not connected to the outside world, they can be networked to each other and thus have far more potential for interaction than functionally equivalent analog devices. As a result, an error on one machine can bring other machines down with it. Because of this interactive complexity, it is impossible to enumerate all the ways such systems could fail, which makes planning for failure much more challenging.
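To make the mechanism concrete, the following is a minimal Monte Carlo sketch, an illustration of my own rather than anything drawn from Perrow or the sources cited here. It compares a system whose components fail independently with one whose components can interact; the component count, failure rate, and coupling probability are arbitrary assumptions chosen only to show the qualitative effect.

```python
import random

# Illustrative toy model (assumed parameters, not from the original post):
# once components can interact, a few independent faults can cascade into a
# system-wide outage, whereas isolated faults stay contained.

N_COMPONENTS = 20   # components in the system
P_FAIL = 0.05       # chance a component fails on its own
P_SPREAD = 0.15     # chance a failed component knocks out another it interacts with
TRIALS = 10_000

def run_trial(networked: bool) -> int:
    """Simulate one run and return how many components end up failed."""
    failed = {i for i in range(N_COMPONENTS) if random.random() < P_FAIL}
    if networked:
        # Each newly failed component gets one chance to take down every
        # still-working component it can interact with (a simple cascade).
        frontier = list(failed)
        while frontier:
            frontier.pop()
            for j in range(N_COMPONENTS):
                if j not in failed and random.random() < P_SPREAD:
                    failed.add(j)
                    frontier.append(j)
    return len(failed)

for label, networked in [("isolated components", False), ("networked components", True)]:
    big_outages = sum(run_trial(networked) > N_COMPONENTS // 2 for _ in range(TRIALS))
    print(f"{label}: {big_outages / TRIALS:.1%} of runs lose more than half the system")
```

Under these assumptions, the isolated system almost never loses more than half its components, while the networked one does so in a meaningful share of runs, which is the intuition behind Perrow's warning.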

A computer that replaces an analog system massively increases the chance of complex interactions, even when running a simple program, in part because the system software it depends on is itself enormously complex. The Microsoft Windows 10 operating system, for example, contains approximately fifty million lines of code. This complexity can lead to vulnerabilities and emergent effects, where unpredictable interactions between systems produce unanticipated and possibly damaging behavior. As the birth of the field of cybersecurity shows, digital systems have also proven far more susceptible to malicious use than analog ones. Complexity gives attackers more places to strike, and internet connectivity means attackers no longer need to be physically close to their targets, removing a barrier that has long protected analog systems.

This danger is far from theoretical and exists in systems as diverse and critical as power plants, electronic voting machines, and nuclear weapons. Fortunately, in the realm of industrial control systems (ICS), these risks are beginning to be taken seriously. Richard Danzig, former Secretary of the Navy, suggests [PDF] that ICS operators use analog components as fail-safes to mitigate some of the dangers digitalization introduces. Likewise, Senator Angus King (I-ME), co-chair of the Cyberspace Solarium Commission, proposed in his Securing Energy Infrastructure Act that reintroducing analog devices could reduce ICS risks. Separately, in his book Hacking the Bomb, the scholar Andrew Futter argues that nuclear weapons and their command and control systems could be especially vulnerable to unpredictable failures, and that digital systems likely exacerbate that risk. Despite this, the 2018 Nuclear Posture Review [PDF] explicitly directs the government to develop and manufacture new digital systems for nuclear weapons. And though some countries [PDF], such as Estonia [PDF], have digitalized parts of their election systems, it has been suggested that voting systems remain completely analog to reduce the risk of interference.

It is unlikely that the systems mentioned above are the only ones that would benefit from selective de-digitalization. Instead of accepting digitalization as an inevitability, both businesses and governments should consider the cost-benefit tradeoff of digital systems. The costs are far higher than just the price of computers and installation; when constant, potentially catastrophic risks like cyberattacks are factored in, the efficiency gains digitalization brings may not be worth it. Although the coronavirus pandemic has created a rush of new business and social practices that are more dependent on digital technology than ever, business leaders and policymakers should question whether digital is always better.


Creative Commons: Some rights reserved.
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.