Leashing the dogs of war.

Author: David B. Rivkin, Jr.

WAR HAS always had rules, even if only to protect the dead. In The Iliad, for example, Homer tells us that Achilles' desecration of Hector's corpse angered the gods. Medieval churchmen sought to limit warfare to certain days of the week and evolved an entire just war theology to constrain the use of armed force. By the Age of Reason, international law "publicists" were busily expounding on the subject, and the 20th century opened with a substantial body of law governing both the right to initiate combat (jus ad bellum) and how armed force is applied (jus in bello). These "laws of war" were based on both custom and treaties and were accepted by all of the Great Powers--including the United States. In more recent years, however, fissures have opened between America and Europe over what the laws of war require with respect to when it is permissible to launch an armed attack, how warfare must be waged, and how the relevant legal norms should be enforced. Today, these disagreements are so fundamental that America and its partners in Europe can be said to operate under different legal codes.

The core of this divergence can be traced to efforts--largely initiated during the Vietnam War era--both to leash the dogs of war and make the laws of combat more humane by mimicking the rules governing domestic police activities, in which deadly force is always the last resort and must not be applied in an "excessive" manner. In the process, "humanitarian" concerns were to be elevated above considerations of military necessity and national interest.

These efforts have taken the form of multilateral conventions, such as the 1977 Protocol I Additional to the 1949 Geneva Conventions (Protocol I) and the 1997 Ottawa anti-landmine convention, as well as new interpretations of existing treaties (such as the UN Charter) and of customary norms. Although the United States helped to negotiate a number of these treaties, it has steadfastly rejected the most sweeping innovations, favoring instead more traditional jus ad bellum and jus in bello norms. In particular, the United States has clearly asserted that it will use force, where necessary, to defend its interests with or without UN Security Council approval, and has rejected agreements that could be interpreted as contrary to key aspects of U.S. military doctrine.

This reticence is not part of a nefarious American effort to achieve immunity from international law, as critics have sometimes asserted. Unlike many countries, which embrace new international conventions with little intent to comply thereafter, the United States has always taken its obligations seriously--refusing, for example, to ratify treaties it does not plan to implement, whether because of policy or constitutional concerns. What the critics fail to realize is that binding international legal obligations must be based on the consent of the affected states. They cannot be imposed. In eschewing many of the new international legal norms accepted by Europe, the United States has simply acted within its legal rights as an independent sovereign.

Nor does the American refusal to follow Europe's lead in this area stem from any lack of humanitarian zeal. Rather, it can be traced to recognition by the United States that the world remains a dangerous place, and that adoption of a "policing" model for warfare would hamper, if not cripple, America's ability to defend itself--and its allies. Peacetime norms, which guide the conduct of police and security establishments in modern democracies, are far more restrictive than the laws of war because they operate in an environment in which the state has an effective monopoly on the lawful use of force, and in which the damage that any single individual or group can inflict is limited. The laws of war, by contrast, apply in a context in which the state does not have a monopoly on either the lawful right to use force or on the use of the most destructive weapons. War and peace remain different worlds, each with a unique logic and distinct imperatives that require dissimilar rules.

Accepting a "policing" model for warfare would undermine the key tenets of American strategic thinking. For starters, the fundamental American doctrine of "decisive force" would have to go. Any robust use of force is certain to cause some civilian casualties, and, under a model of armed conflict better suited to "managing" problems than winning wars, decisive force would be considered "excessive" and subject to sanction. Similarly, the high value the United States places on force protection would be suspect under these rules. Indeed, one of the principal allegations leveled against the United States is that it has improperly sought to shield its soldiers from the dangers of combat--for example, by operating its aircraft at heights well beyond the range of enemy air defenses, making it difficult in many cases to distinguish between military and civilian targets.

Overall, the importance of this Euro-American doctrinal divergence cannot be overestimated. For the first time in modern history, the principal military powers differ fundamentally over the proper rules governing warfare.

The Legitimate Use of Armed Force

NOWHERE are the divisions between the United States and its European allies deeper or more apparent than on the question of when the use of armed force is legally permissible. In the months prior to the Coalition attack on Iraq, France and Germany insisted that force could be used legitimately only after passage of an additional UN Security Council resolution specifically authorizing it. This interpretation became the European Union mantra. Of course, the Coalition went forward regardless. Yet even Britain, a key European member of the Coalition, relied solely upon Security Council resolutions for legal justification. (1) Unlike the United States, it did not also cite the inherent right to use force in self-defense. Indeed, while Europe has increasingly come to consider the UN Security Council to be the primary--if not the sole--source of legitimate authorization for the use of military force, the United States, in its September 2002 National Security Strategy, articulated a policy of strategic "pre-emption" firmly rooted in the traditional jus ad bellum.

Proponents of the European position cite the United Nations Charter, an instrument that does bind the United States and arguably limits the right of self-defense. Certainly some of the UN's "founders" wished to "outlaw" war, and even to require countries to absorb an aggressor's first strike. (This appears to have been, for example, the position of former Minnesota Governor Harold Stassen, a member of the American delegation to the San Francisco conference. Stassen insisted that the Charter's acknowledgement of the "inherent right of individual or collective self-defense" also include the language "if an armed attack occurs.") The Charter as written, however, did not reflect this purpose.

Indeed, the UN Charter never purported to replace the jus ad bellum. In this regard, it does not grant the right of self-defense, but acknowledges it, suggesting the continued viability of pre-Charter customary norms. When the Charter is read as a whole--as it must be--states retain the right to use force so long as they do not threaten the "territorial integrity" or "political independence" of another state, or otherwise act in a manner inconsistent with the United Nations' purposes. Indeed, the veto power--which was the sine qua non of American participation in the UN system--would have been virtually meaningless in this critical area if the Security Council's mere inaction were sufficient to deny legal legitimacy to the anticipatory defensive use of force. (Significantly, this interpretation of the Charter did not originate with the Bush Administration. For example, in a 1962 legal opinion on America's lawful alternatives in the Cuban missile crisis, the Kennedy Justice Department noted that the UN Charter does not "prohibit the taking of unilateral preventive action in self-defense prior to the occurrence of an armed attack.") And, at a time when civilians, rather than combatants, have become the targets of choice for both rogue regimes and international terrorist networks, and when at least some of these groups appear to be beyond deterrence, pre-emptive military action taken in self-defense is far more likely to promote international peace and security than it is to threaten it.

The survival of traditional self-defense norms becomes especially clear when the actual practice of states as well as the Security Council's track record over the past fifty years are examined. From its first years, the Security Council has been torn by divisions among its veto-wielding members and has been singularly ineffective as a guardian of international peace and stability. The hopes that the end of the Cold War would revitalize the Council have not panned out. By contrast, UN member-states have continued to use force unilaterally and with considerable frequency. As Michael J. Glennon, who argues that the UN Charter did intend to limit strictly the use of armed force even for defensive purposes, has correctly observed:

The question--the sole question, in the consent-based international legal system--is whether states have in fact agreed to be bound by the Charter's use-of-force rules. If states had truly intended to make those rules obligatory, they would have made the cost of violation greater than the perceived benefits. They have not. The Charter's use-of-force rules have been widely and regularly disregarded. Since 1945, two-thirds of the members of the United Nations--126 states out of 189--have fought 291 interstate conflicts in which over 22 million people have been...
