For more than 50 years during the Cold War, deterrence was a cornerstone of U.S. strategy. The United States aimed to prevent the Soviet Union from attacking the West by threatening to retaliate with a devastating nuclear response. Following the terrorist attacks of September 11, 2001, however, many observers argued that deterrence was irrelevant to the U.S.-led war on terror. Analysts claimed that, unlike the Soviet Union's leadership, terrorists were irrational, were willing to incur any cost (including death) to achieve their goals, and would be difficult to locate following an attack. For these reasons and others, it was thought that threats to retaliate against terrorists would inherently lack credibility and would be insufficient to deter terrorist action.
These early views shaped the U.S. government's initial strategy for addressing the terrorist threat, and deterrence remains a poorly understood and underutilized element of U.S. counterterrorism strategy. It holds great potential, however, for helping to thwart future terrorist attacks. We argue that, unlike in state-to-state deterrence, deterrence against terrorism can be only partially successful, and that it will always be a component, never a cornerstone, of national policy. Nevertheless, as long as states can deter some terrorists from engaging in certain types of terrorist activity, deterrence should be an essential element of a broader counterterrorism strategy.