The Sources of American Legitimacy

Authors: David C. Hendrickson and Robert W. Tucker
Foreign Affairs

Summary: Throughout its history, the United States has made gaining international legitimacy a top priority of its foreign policy. The 18 months since the launch of the Iraq war, however, have left the country's hard-earned respect and credibility in tatters. In going to war without a legal basis or the backing of traditional U.S. allies, the Bush administration brazenly undermined Washington's long-held commitment to international law, its acceptance of consensual decision-making, its reputation for moderation, and its identification with the preservation of peace. The road back will be a long and hard one.

Robert W. Tucker is Professor Emeritus of American Foreign Policy at Johns Hopkins University. David C. Hendrickson is Robert J. Fox Distinguished Service Professor at Colorado College.

The 18 months since the launching of the second Iraq war have brought home, even to its advocates, that the United States has a serious legitimacy problem. The pattern of the first Iraq war, in which an overwhelming victory set aside the reservations of most skeptics, has failed to emerge in the aftermath of the second. If anything, skepticism has deepened. The United States' approval ratings have plunged, especially in Europe-the cooperation of which Washington needs for a broad array of purposes-and in the Muslim world, where the United States must win over "hearts and minds" if it is to lessen the appeal of terrorism. In both areas, confidence in the propriety and purposes of U.S. power has dropped precipitously and shows little sign of recovery.

Legitimacy arises from the conviction that state action proceeds within the ambit of law, in two senses: first, that action issues from rightful authority, that is, from the political institution authorized to take it; and second, that it does not violate a legal or moral norm. Ultimately, however, legitimacy is rooted in opinion, and thus actions that are unlawful in either of these senses may, in principle, still be deemed legitimate. That is why it is an elusive quality. Despite these vagaries, there can be no doubt that legitimacy is a vital thing to have, and illegitimacy a condition devoutly to be avoided.

How to restore legitimacy has thus become a central question for U.S. foreign policy, although the difficulty of doing so is manifest. At a minimum, restoring international confidence in the United States will take time. The erosion of the nation's legitimacy is not something that occurred overnight. Washington is unlikely to succeed at renewing it simply by conducting better "public diplomacy" to "make the American case" to the world, for world public opinion already rejects the case that has been made. If the United States is going to be successful in recapturing legitimacy, it will have to abandon the doctrines and practices that brought it to this pass.


To understand the sources of U.S. legitimacy in the post-World War II era, it helps to examine the public professions of the country's leaders. They tell a remarkably consistent story, one that pledges the use of U.S. power to international law. Just as civilization itself is distinguished by the insistence that conflicts be settled by means other than brute force, so U.S. postwar leaders insisted that international relations be ordered by the same principle. This principle had all the more appeal because it was championed in circumstances in which, only a short time before, it had been blatantly violated. The old European order that perished in 1945 had begun its descent into oblivion and nihilism with the butchery of 1914 and with the declaration of Germany's then chancellor, Theobald von Bethmann-Hollweg, that the treaty guaranteeing Belgium's neutrality was merely a "scrap of paper." The German regime that brought on World War II was even more contemptuous of international law. It acted avowedly on the principle that might makes right.

Much of the world in the twentieth century rebelled against this position. In 1945, at Nuremberg, Supreme Court Justice Robert Jackson, leading the American prosecution of the major Nazi war criminals, emphasized that the wrong for which German leaders were on trial was "not that they lost the war, but that they started it." Jackson refused to be "drawn into a trial of the causes of the war, for our position is that no grievances or policies will justify resort to aggressive war. It is utterly renounced and condemned as an instrument of policy." The same position was taken in the U.S.-inspired Charter of the United Nations. Peace was the great goal to which all other ends were subordinated. In obligating the UN's individual member states to refrain "from the threat or use of force against the territorial integrity or political independence of any state," the charter permitted but one clear exception: force could be employed in self- or collective defense against an armed attack.

Despite the repeated avowals of U.S. leaders committing their country to the rule of law, some influential pundits now maintain that international law had little or nothing to do with the legitimacy accorded the United States after 1945. "It was not international law and institutions but the circumstances of the Cold War, and Washington's special role in it, that conferred legitimacy on the United States, at least within the West," writes noted commentator Robert Kagan. "Contrary to much mythologizing on both sides of the Atlantic these days, the foundations of U.S. legitimacy during the Cold War had little to do with the fact that the United States helped create the UN or faithfully abided by the precepts of international law laid out in the organization's charter." Washington reserved for itself, he maintains, the right to intervene "anywhere and everywhere." These are convenient retrospective judgments regarding a state that now flouts principles it once held dear, but Kagan's position reflects a case of profound historical amnesia about the bases of U.S. internationalism.

In denigrating law as a pillar of U.S. legitimacy, Kagan emphasizes instead the role Washington played in containing Moscow. Although it is certainly true that the protection the United States accorded Western Europe from Soviet expansionism conferred legitimacy on U.S. power, it is equally true that allied diplomats repeatedly justified this enterprise in terms of its conformity with the principles of the UN Charter and its rule forbidding aggression. Had NATO been constituted on any other basis it would not have gained the support it did. This marrying of strategic vision and moral purpose is not so strange; indeed, neoconservatives themselves often emphasize that it ought to be the hallmark of U.S. foreign policy. Where they have departed from the classic understanding is in substituting their own moral purpose-promoting the extension of democracy, through force if necessary-for that favored by the architects of the post-World War II order, which emphasized the protection of the democratic community through rules constraining the use of force.

The United States, to be sure, did not always scrupulously adhere to the rules of the charter in its conduct of diplomacy, as for instance when it quarantined Cuba to prevent the arrival of further Soviet nuclear armaments in 1962. But U.S. leaders generally made every effort to square their actions with international law. And despite some transgressions, the overall fidelity of the United States to internationalist norms contributed strongly to the legitimacy of U.S. power. The converse proposition-that world public opinion was reassured by illegal and aggressive U.S. actions, such as may have existed-would be absurd. This, clearly, is not what European leaders say now, and it is not what they thought then.

The legitimacy of U.S. power was also enhanced by Washington's commitment to consensual modes of decision-making, a commitment that stemmed from the democratic character of the U.S. polity and that was reflected in the flurry of institution building that occurred during and after World War II. Although the collaborative system of decision-making envisioned by the UN Charter was an early victim of the Cold War, the United States continued to seek for its policies the widest possible consensus within the Western alliance and within international society more generally. The normative order of the alliance enshrined the importance of consultation, and U.S. policymakers worked tirelessly to reconcile the divergent views of the nation's partners and forge a common policy through compromise.

As the preponderant military power within the alliance, the United States was always accorded a special role, but U.S. leaders took close account of the vital interests and perspectives of their allies. This attitude arose both from the exigencies of the Cold War, especially the danger that allied states might be tempted by neutralism, and from a heartfelt commitment to democratic procedures. From the standpoint of the U.S. political tradition, which based legitimate rule on the consent of the governed, it was apparent that the union of like-minded republics could be created and preserved only through institutions that gave its members a voice in their common concerns. It was not always possible to do that, and U.S. administrations prosecuted the war in Vietnam despite mounting worldwide opposition nearly as strenuous as that which attended the Iraq war of 2003. The United States' isolation in international society in the late 1960s, however, also represented the lowest ebb of U.S. legitimacy in the post-World War II era.

A third factor underlying U.S. legitimacy was the reputation Washington acquired for moderation in policy. After World War II, it seemed apparent that the United States had assumed its responsibilities as guardian of the peace with genuine reluctance. European leaders therefore worried that the United States might at some point be tempted again by the siren call of isolationism. By virtue of its geographic separateness, the United States could have considered opting out of the superpower contest, as it had previously opted out of the Treaty of Versailles, and that very sense of unwilling participation helped underpin U.S. legitimacy. So, too, its geographic remoteness gave its participation in the system a more disinterested character than that of European states and fitted it well for the pursuit of what the political scientist Arnold Wolfers called "milieu goals" relating to the broader international environment, such as the pursuit of peace.

The fourth factor accounting for U.S. legitimacy was Washington's success in preserving peace and prosperity within the community of advanced industrialized democracies. Although Western European and Japanese leaders sometimes worried that U.S. belligerence might land them in a war with the Soviet Union, peace among the great powers was preserved. Happily, the contradictions of "extended deterrence" and the nearly theological disputes over nuclear doctrine were never put to the test, and it was not unreasonable to attribute the long peace to the persistence and stability of U.S. power. To those whose memories had been formed by the catastrophic world wars of the twentieth century, this seemed like deliverance from unspeakable evil. The widespread response within the free world was gratitude for the salvation wrought by the United States and the belief that U.S. power was both necessary and rightful-was, in short, legitimate.


Seen against the backdrop of these factors, the startling loss of legitimacy that has occurred in the administration of President George W. Bush is not so mysterious. Even before the attacks of September 11, 2001, the Bush administration revealed a deep suspicion of international law. Its undersecretary of state for arms control and international security, John Bolton, had noted in the late 1990s that "it is a big mistake for us to grant any validity to international law even when it may seem in our short-term interest to do so-because, over the long term, the goal of those who think that international law really means anything are those who want to constrict the United States." This augured a fundamentally contemptuous attitude toward the principles that had previously sustained U.S. legitimacy. But what were straws in the wind before September 11 soon became a virtual tornado as the Bush response to the attacks became clear.

In short order came a series of pronouncements and a set of doctrines that stood in stark contrast to the ideals and principles that had attended the United States' rise to superpower status. By declaring that "either you are with us, or you are with the terrorists," President Bush cast profound doubt over whether his administration would even bother to consult with traditional allies. Rather, it seemed intent on issuing diktats to which they were expected to conform. A new doctrine of preventive war, misnamed the "strategy of preemption," took the place of the doctrines of containment and deterrence that had preserved the nuclear peace during the long contest with the Soviet Union.

Even when the administration approached international institutions it did so with an air of feigned regard but real contempt. The White House made clear that it intended to invade Iraq even in the teeth of Security Council opposition and repeatedly warned that the UN would pass into irrelevance unless it bowed to U.S. demands. The Bush administration also asserted that war against Iraq was justified to depose a tyrant and free the Iraqi people-a position that strongly suggested that Bush accepted in principle the legitimacy of war against any government failing a democratic litmus test.

Taken alone, any one of these doctrines might have seemed an understandable, albeit regrettable, reaction to the trauma induced by September 11. But together they struck the United States' traditional allies, and much of the world, with terror. The United States was showing a face that appeared radically opposed to the ideas and principles for which Washington had once stood. All four of the pillars that supported U.S. legitimacy in the post-World War II era-its commitment to international law, its acceptance of consensual decision-making, its reputation for moderation, and its identification with the preservation of peace-were now in question.

The neoconservatives responsible for this startling loss of U.S. legitimacy have defended themselves by pointing to various precedents in which the United States engaged in illegal or unilateral conduct. But although certain aspects of the Bush doctrine were presaged by earlier administrations, no preceding administration brought all of these elements together in so alarming a way. Ronald Reagan proclaimed the right in theory to overthrow undemocratic regimes, but in practice was hobbled by a resistant Congress and was himself unwilling to commit U.S. forces for this object. George H.W. Bush declared in the aftermath of the Gulf War that he possessed the authority to go to war without the authorization of the UN Security Council or Congress, but he had still sought and received approval from both institutions. Bill Clinton embraced regime change in Iraq but was unwilling to fight a major war for it, preferring the more modest (and ineffectual) strategy of supporting a military coup against Saddam Hussein. Clinton also did not rule out in theory a doctrine of preventive war to forestall the acquisition by "rogue states" of weapons of mass destruction (WMD), but in practice he did not fight one. The precipitous collapse of support for U.S. aims under George W. Bush demonstrates that the nation's allies, indeed most of the world, believe that something fundamental in the U.S. global posture has changed-for the worse.

Undoubtedly, U.S. legitimacy did undergo a dramatic transformation with the end of the Cold War. U.S. legitimacy did not collapse "along with the Berlin Wall and Lenin's statues," as Kagan argues, but it became problematic in a way it had not been previously. Having built up a prodigious military machine in the course of its rivalry with the Soviet Union, the United States now found itself without a military equal and in a position, from a narrow military standpoint, to act without the serious prospect of external restraint. This advantage created a potentially dangerous situation, one that, from the standpoint of traditional American political thought, required correction. Whether in international or domestic affairs, it has been almost a first law of U.S. statecraft that any situation of unbounded power heralds an incipient condition of political pathology. Since the post-Cold War world continued to hold many dangers, it was easy to make the case that the international order required a guardian, but it was equally evident that the guardian's power needed to be restrained, whether internally or externally. The end of the Cold War thus thrust the United States and the world into a Madisonian moment. "In framing a government which is to be administered by men over men," James Madison wrote in the Federalist Papers, "the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself."

This consideration goes a long way toward explaining the renewed emphasis on U.S. cooperation with the UN that occurred during the Gulf War, when the international body, as George H.W. Bush emphasized, stood poised to fulfill the mission foreseen by its founders. It also explains the emphasis, both before and after the end of the Cold War, that the United States placed on maintaining a global concert of the advanced industrialized democracies. Neither George H.W. Bush nor Bill Clinton allowed the Security Council to constrain U.S. policy in all instances, but they were keenly aware of the importance of respecting the international body's opinions. It is part of the pathology of U.S. power today that the evident need for a constitutional check on the world's most powerful state-a constraint the United States would welcome if it were true to its political heritage-is now seen to stem from spiteful anti-Americanism.

It cannot be said, to be sure, that the Bush White House has been oblivious to the need for securing international legitimacy. By styling its doctrine of preventive war the "strategy of preemption," it sought to approximate its strategy to one of self-defense-for preemption, if the threat is imminent, can at least make a tolerable claim to legitimacy. This approach would have been unconvincing even if banned weapons had been found in Iraq-possessing weapons is not proof of impending attack-but it utterly collapsed when no weapons were discovered. Advocates for war then argued that the administration had never actually said that the threat was imminent, only that it was "grave and growing." Absent a showing of imminence, however, one could not make a plausible claim for the lawfulness of the action. In truth, the Bush administration did not care a fig whether the war was lawful. It wanted its strategy of preventive war to seem lawful, but the doctrine's implementation never depended on whether the administration's lawyers could write a coherent brief in its favor.

Bush's acceptance of Secretary of State Colin Powell's argument-that Washington would be wise to seek authorization for intervention from the UN Security Council-was another sign that at least some in the administration saw the need to secure legitimacy. There followed the six-month phony war at the UN, when the administration claimed that it was taking the last step for peace, that it expected a peaceful resolution if Saddam were willing to cooperate, and that no decision for war had been taken-all of which was untrue. Throughout the crisis, it was apparent that the decision of the Security Council was perfectly irrelevant to the question of whether the United States would go to war-despite the fact that Washington's rationale for going to war relied in part on Security Council resolutions that the United States proposed, illegally, to enforce itself.

Evidently, the administration regarded the UN in an entirely instrumental light. If it were useful in securing wider support for the contemplated action, the Bush White House was not averse to working through it. But when it became clear that support would not be forthcoming, notice was served that the U.S. commitment to multilateralism was at an end. Said one State Department official in March 2003, "We will want to make sure that the United States never gets caught again in a diplomatic choke point in the Security Council or in NATO." In recognition of the importance that consensual decision-making had played in shoring up U.S. legitimacy in the past, the administration invoked a "coalition of the willing" and went so far as to claim that the real unilateralists were those who opposed its policy. Unfortunately, these "unilateralists" happened to include overwhelming majorities throughout the world, even in most of the countries that were brought into the coalition.

Indeed, it was clear that the Bush doctrine was severely wanting in all four of the elements that had sustained U.S. legitimacy in the past. Washington had acted illegally in going to war against Iraq, and events following the end of major combat operations (the absence of WMD, the growing anarchy) served to weaken rather than strengthen its case. It had gone far beyond the parameters of the 1990s debate over whether the United States should give the nod to the UN or to NATO, evidently deciding that it could dispense with both. It had confirmed the observation of Alexander Hamilton that the "spirit of moderation in a state of overbearing power is a phenomenon which has not yet appeared, and which no wise man will expect ever to see." And it had demonstrated by its every action that it had no plan to secure the peace. Peace was the furthest thing from the administration's agenda. As Edmund Burke said of the French revolutionaries, Bush's policy was "military in its principle, in its maxims, in its spirit, and in all its movements."


Judgments of legitimacy are rooted in law but sometimes do transcend its commands. It is therefore easy to understand why in some hard cases it seems imperative to accord legitimacy to actions otherwise illegal. At the same time, it is easy to understand why the continual process of making exceptions may so vitiate the law that the exceptions themselves become the rule. U.S. foreign policy now finds itself at the bottom of this slippery slope.

The experience of the Cold War contributed to this process; the exigencies of the competition made exceptions to legal conduct seem necessary and therefore rightful. Since the end of the Cold War, unipolarity has sent the nation further down the slope. The development was natural and entirely understandable.

But to understand is not to forgive. It is evident that the United States has reached a kind of tipping point, where world public opinion defines Washington as much by the ease with which it justifies illegal actions as by its commitment to legality, if not more. The United States has assumed many of the very features of the "rogue nations" against which it has rhetorically-and sometimes literally-done battle over the years. The legitimacy of U.S. power has, at a minimum, been eroded significantly, and at certain moments-for instance, in the general revulsion to reports of widespread torture in Iraq-it seems to have vanished entirely.

The road back from perdition will not be easy. It is impossible to undo the various actions that have tainted U.S. legitimacy; they will remain as blots on the record. Still, there can be little doubt that the first requirement for the restoration of the country's legitimacy is a return to lawful conduct. It is not simply "modern liberalism" or the "postmodern" sensibility of Western Europe that is offended by the world's greatest power taking the law into its own hands. Objection to that state of affairs has been at the core of Western reflection on international relations since the birth of the modern state system, and it was axiomatic to America's founders, who erected their constitutional regime on the proposition that power must be checked and balanced.

Yet the injunction to return to law must nevertheless contend with two powerful objections: that it would be both imprudent and immoral. It would be imprudent, say the critics, because the principles of the UN Charter that allow for force only in circumstances of self- and collective defense cannot meet the dangers of a world in which terrorists and "rogue states" may acquire WMD. We have to be prepared to take the war to the enemy before he takes it to us. It would be immoral, runs the second line of criticism, because there are certain circumstances in which it is morally imperative to transgress state sovereignty and intervene militarily in the domestic affairs of repressive regimes, especially when acts of genocide are contemplated or ongoing. The United States cannot, according to the now dominant view, allow the Security Council to prevent it from acting in either of these instances.

All of this declaiming against the UN overlooks the fact that the charter itself provides a basis for states to act for their national security without seeking the approval of the Security Council. Nothing in the charter, reads Article 51, "shall impair the inherent right of individual or collective self-defense if an armed attack occurs against a member of the United Nations, until the Security Council has taken the measures necessary to maintain international peace and security." The United States, if attacked, is obligated to report its counterattack to the Security Council, but its right of individual or collective defense is otherwise unimpaired, and with its veto power it may legally prevent any constraint on its right to respond by force. The question is not, then, whether the United States should accord a veto to the Security Council in cases of national or collective defense, but whether it should do so when the use of force would otherwise be illegal.

Such illegal uses of force are in fact unnecessary for U.S. security and actually imperil it. The Iraq war clearly illustrates both points: not only did containment and deterrence offer a perfectly workable method of dealing with Saddam's Iraq, but the consequences of the U.S. occupation have also made Americans much more insecure. Those consequences include daily attacks on American soldiers, the inflammation of opinion in the Muslim world (encouraging new recruits for al Qaeda), and the possibility of further wars arising from the potential disintegration of the Iraqi state.

The baleful results of the Iraq war are also relevant to the dangers posed by the acquisition of nuclear weapons by North Korea or Iran, two instances in which preventive war is often urged. As with Iraq, "preventive" attacks would be remedies worse than the disease and could mean catastrophic war in both regions. U.S. threats of "regime change" also undermine the more reasonable policy of dissuading either state from acquiring such weapons through measures short of war-that is, through a mixture of negative sanctions and positive inducements. The prospects of a grand bargain with either Pyongyang or Tehran would be enhanced were Washington to abandon its not-so-secret wish to bring about the downfall of these regimes.

A second area in which the temptation exists to go beyond the law, but where the need to do so is less than apparent, concerns humanitarian intervention-that is, military action within the territorial jurisdiction of another state to halt abuses of human rights. There is a developing consensus that such interventions may indeed be justified and that the traditional international law forbidding intervention in states' internal affairs must yield to the need to put an end to acts "that shock the conscience of mankind." The question concerns not whether such interventions ought to be forbidden but whether the United States ought to respect certain procedural safeguards if such interventions are to be justified.

Although it is always possible to imagine scenarios in which intervention is blocked by the refractory obstruction of a veto-wielding member of the Security Council, the UN has in fact been reasonably accommodating over the last 15 years, during which time the number and scale of humanitarian interventions have grown far beyond what occurred during the Cold War. In the case of Rwanda, it was not opposition from Russia and China but U.S. skittishness after the failed intervention in Somalia that prevented action by the council. The multitude of peacekeeping missions authorized over the last decade and a half does not support the claim that humanitarian interventions have been seriously hobbled by the veto.

Kosovo, of course, was an exception to this generally permissive environment. In that instance, Russia blocked approval by the Security Council, but this was arguably a case in which skepticism was indeed warranted. Although the Kosovo intervention continues to be defended as an "illegal but legitimate" intervention, there are good grounds for challenging this judgment. The case for military action was attended by gross exaggeration of the scale of killing by Serbian forces and paramilitaries. The indictment of Slobodan Milosevic and his colleagues at the Hague tribunal alleges Serbian complicity in the deaths of "hundreds of Kosovo Albanian civilians" rather than the 100,000 or more that advocates of intervention had claimed before the war. It was the bombing campaign itself-launched after Serbia failed to submit to NATO's humiliating demand for military access throughout its territory-that provoked the most serious humanitarian crisis.

Neglected, too, by those advocating intervention was the fact that the Kosovo Liberation Army had taken up arms to secure independence from Serbia; what were represented as acts of actual or impending genocide by Serbia were measures not terribly different from those that most governments would undertake if confronted with a threat to their territorial integrity. Rather than action to prevent genocide, the Kosovo intervention promised outside support to ethnic groups that seek the territorial dismemberment of an existing state. During the war, the NATO coalition's unwillingness to introduce ground forces, instead bombing the infrastructure of Belgrade and throughout the country, was a further moral cost of this ostensible humanitarian action. Five years after the intervention, the United States and its allies remain far from achieving a stable settlement and have prevented one kind of ethnic cleansing, of Albanians by Serbs, only at the price of another, of Serbs by Albanians.

In the aftermath of Kosovo, NATO governments repeatedly claimed that the intervention "would constitute the exception from the rule, not an attempt to create new international law," in the words of former NATO Secretary-General Javier Solana. But this attempt to limit the reach of the Kosovo precedent did not prevent the advocates of the Iraq war from invoking it to justify toppling Saddam. Undoubtedly, reasonable people could see the need to deal with the serious humanitarian crisis in Kosovo, and it was a partially legitimating factor that the intervention at least secured the support of a regional security organization. But reasonable people should also see that a bad precedent was set by the NATO action, and that the way the West responded to the crisis-particularly the irresponsible bombing of yet another capital city-made this humanitarian intervention quite inhumane in its methods.


There is no simple and direct route to the recovery of U.S. legitimacy. The years when the United States appeared as the hope of the world now seem long distant. Washington is hobbled by a reputation for the reckless use of force, and it is going to take a long time to live that down. World public opinion now sees the United States increasingly as an outlier-invoking international law when convenient, and ignoring it when not; using international institutions when they work to its advantage, and disdaining them when they pose obstacles to U.S. designs.

The United States has gone down a road in which the use of force has become a chronic feature of U.S. foreign policy, and the country's security has been weakened rather than bolstered as a consequence. It is true, of course, that the American public does not like the idea of deferring to others, but it may come to see the advantages of doing so once it appreciates that enterprises undertaken on a unilateral basis must be paid for on a unilateral basis. Ultimately, however, the importance of legitimacy goes beyond its unquestionable utility. Certainly the leaders who earned the United States' reputation for legitimacy in the post-World War II era believed it to be a good in itself. For its own sake, and for the sake of a peaceful international order, the nation must find its way back to that conviction again.