Blog posts represent the views of CFR fellows and staff and not those of CFR, which takes no institutional positions.
Michael Harteveldt is an intern with the Digital and Cyberspace Policy program at the Council on Foreign Relations.
Every quarter, Net Politics publishes Report Watch, which distills the most relevant digital and cyber scholarship to bring you the highlights. In this edition: internet shutdowns, cyber wargames, and defending democracies from disinformation campaigns.
The Global Network Initiative (GNI) released a report in June analyzing government-mandated internet disruptions and their effects on vulnerable and marginalized groups.
In recent years, network disruptions and shutdowns have become more frequent and costly, costing the world economy $2.4 billion in 2015, according to one estimate. These shutdowns aim to disrupt the flow of information, often when a government deems rising public protest or opposition to be driven and intensified by digital communication channels. The report estimates that about 109 shutdowns or disruptions occurred in 2017 alone, up from seventy-five in 2016 and just over thirty in 2015.
Notably, India accounted for about 70 percent of the shutdowns and disruptions in 2017, showing that this is a tool of democracies, not just illiberal regimes. Report author Jan Rydzak argues that the expansion of shutdowns in stable democracies like India is particularly disturbing because other countries at a similar stage of development could adopt the same information-control techniques.
To reverse the trend, Rydzak recommends advocacy campaigns that “illustrate the counterproductive consequences of shutdowns for all human rights, stakeholders, and social and economic sectors,” built on collaboration among academic researchers, governments, and large companies. To reduce the frequency of shutdowns, disruptions must be shown to be “ineffective and prohibitively expensive.”
“Cyber Operations in Conflict – Lessons from Analytic Wargames” by Benjamin Jensen and David Banks
Jensen and Banks use war games to investigate the strategic preferences of decision makers for the use of cyber operations as coercive tools. Despite the widespread assumption that cyber operations are escalatory, their findings suggest that cyber capabilities have a moderating influence on coercive actions. The authors contribute to an emerging body of work that uses war games to understand how cyber operations might be used in an interstate crisis.
Jensen and Banks first ran a version of the games with students and national security professionals before running two different games in the form of survey experiments with more than 3,000 online participants. One game simulated a crisis between the United States and China in the South China Sea; the other simulated an opposition movement launching cyber operations against a national government during an escalating crisis.
The authors draw three conclusions from the war games. First, participants were “restrained in their use of cyber tools.” They argue this is because cyber operations do not replace traditional coercive policies such as economic sanctions, diplomacy, and military strikes.
Second, participants generally used cyber operations when they lacked other effective coercive means below the threshold of armed conflict. This suggests that cyber operations may in fact be stabilizing rather than destabilizing, given that states will use tried-and-true tools like diplomatic or economic coercion before reaching for their digital weapons.
Finally, an adversary’s regime type shaped cyber strategy. Jensen and Banks found that participants playing autocratic regimes in cyber disputes were more likely to use cyber weapons, while those facing democratic adversaries were less likely to use them.
Jensen and Banks conclude that cyber tools allow states to “manage escalation ‘in the shadows,’” meaning cyber power should be viewed as a means of covert action, not as a traditional means of warfare.
“The ASD Policy Blueprint for Countering Authoritarian Interference in Democracies” by Jamie Fly, Laura Rosenberger, and David Salvo
Fly et al. outline a series of steps that U.S. government and nongovernment actors can take to counter future Russian and other authoritarian interference in the U.S. democratic system.
Consistent with the U.S. intelligence community’s finding of Russian interference in the 2016 election, Fly et al. argue that Russia “weaponized” the United States’ openness to exploit existing political fault lines and advance Russian interests. Specifically, they point to Russia’s success in exploiting the polarized U.S. media environment and social media to push disinformation, as well as porous campaign finance laws, poor cybersecurity, and declining faith in institutions. The authors also note that the Russian interference effort is ongoing, and that other authoritarian states like China, Qatar, and the United Arab Emirates have adopted similar interference operations in the United States.
The authors put forth ten recommendations to combat future attacks and manipulation, as summarized below in three broad categories:
1. De-incentivize, Deter, and Protect: the authors recommend that the U.S. government articulate the threat of foreign influence in the democratic process and develop clear responses to Russian-style incidents. Through congressional action, administrations would be required to report attacks on election infrastructure, including campaigns. Candidates and political parties should also pledge not to weaponize information obtained through kompromat. Lastly, the authors suggest closing vulnerabilities in the electoral process by ending “illicit finance” and political influence from foreign entities.
2. New Norms of Transparency, Communication, and Respect: the authors recommend that the U.S. government should fortify European-U.S. partnerships to coordinate EU, NATO, and U.S. efforts to stop foreign influence operations. They advocate for the creation of new norms for social media that promote more transparency about data privacy policies and political ads. Furthermore, the U.S. government and tech sector should foster better public-private sector relationships to combat emerging disinformation threats, such as the use of “deep fake” technology.
3. New Guidelines: the authors recommend that media organizations create new guidelines for journalists on using information from social media accounts and state-sponsored misinformation campaigns. Fly et al. also recommend that media outlets take caution when reporting on leaked information and remain alert to the agenda of the actor who obtained and released it. They further advocate increasing support for struggling local and independent media through philanthropy; these outlets are generally more trusted than national media companies and can help combat misinformation from foreign adversaries.
Overall, Fly et al. argue that a “bipartisan response from both the Executive Branch and Congress” is critical to fixing U.S. vulnerabilities, improving deterrence, and making the penalties for launching similar operations more severe and effective.