The Economic Calculus of Fielding Autonomous Fighting Vehicles Compliant with the Laws of Armed Conflict

Author: Wallach, Evan

TABLE OF CONTENTS

INTRODUCTION
I. AUTOMATION AND TRUE AUTONOMY IN WEAPON SYSTEMS
II. INTERNATIONAL LAW CONCERNING THE LEGALITY OF DEPLOYING AUTONOMOUS WEAPON SYSTEMS
III. EFFECT OF ECONOMIC COSTS ON THE IMPLEMENTATION OF COMPLIANCE SYSTEMS IN AFVS
IV. EFFECT OF THE ECONOMIC COSTS OF COMPLIANCE SYSTEMS ON THE DESIGN OF AFVS
V. RECOMMENDATIONS FOR POLICYMAKERS
CONCLUSION

INTRODUCTION

In 2001, the U.S. military had only 162 unmanned aerial vehicles, commonly referred to as drones. (1) By 2010, that number exceeded 7,000, accounting for 41% of aircraft in the U.S. Air Force. (2) As their numbers have increased, these systems have become increasingly automated. (3) Newly deployed weapon systems have taken the first steps toward target selection without input from human operators. (4) The revolution in robotics and weapons technology raises numerous questions about the legality of deploying Autonomous Fighting Vehicles (AFVs) onto the battlefield.

As human-operated weapons evolve into self-directed warriors, the applicable legal framework expands beyond the traditional determination of a weapon's compliance with the law, (5) imposing additional positive and negative requirements. (6) The guiding principles for use and deployment (for example, proportionality, military necessity, and chivalry) remain the same.

This paper examines the interplay between the obligation to produce legally compliant weapons and the economic costs of those weapons, and assesses how these costs may influence AFV design. We begin by defining an autonomous weapon system. We then examine obligations imposed by the Law of Armed Conflict and Customary International Humanitarian Law on AFVs. In particular, we evaluate how the Law of Armed Conflict influences AFV design, construction, and inventory maintenance. We conclude with recommendations for executive and legislative policymakers, including technical design improvements, cost and compliance policy considerations, modifications to increase command battlefield awareness of legal compliance, and increased policymaker awareness of AFVs' legal compliance advantages.

  I. Automation and True Autonomy in Weapon Systems

    The continuum from human control over the use of lethal force to complete autonomy begins with automated weapon systems. An automated weapon system is designed to engage a target automatically when certain pre-determined parameters are detected. (7) Automated weapon systems have a long history. The pit trap and its technological successors, the land and sea mine, are examples of early automated weapon systems. (8) They are "victim activated": the target actuates the weapon, but the weapon has little or no ability to distinguish among targets.

    Newer weapon systems are advancing toward a greater capacity both to identify targets and to decline to engage inappropriate ones. For example, new anti-vehicle mines can distinguish between "friendly" and "enemy" vehicles based on whether they match certain sensor signatures. (9) As the technology has evolved, these systems have gained greater range and a greater ability to choose their own targets, moving them into the realm of autonomous weapons. (10)

    Lawful autonomous weapon systems are defined in our analysis as weapons that have the capacity, without human intervention, to identify, engage, and attack legitimate targets without violating any law governing armed conflict. They may or may not have the capacity to learn and adapt their battlefield behavior without further human intervention or programming. (11) Some deployed weapon systems are capable of defensive autonomous reactive targeting of perceived nonhuman targets due to these systems' necessarily short reaction times. (12) Indeed, potentially offensive autonomous targeting decisions occur in some currently deployed weapons systems. (13)

  II. International Law Concerning the Legality of Deploying Autonomous Weapon Systems

    International law mandates that contracting nations determine whether a developing weapon system complies with the unvarying requirements of the laws of war. (14) An autonomous weapon system must observe the core principles of the Law of Armed Conflict: distinction, military necessity, proportionality, chivalry, (15) and the avoidance of unnecessary suffering. (16) Given AFVs' capacity to operate independently and lethally, their deployment and design also implicate the law of command responsibility. This paper focuses primarily on distinction and command responsibility. (17)

    1. The Principle of Discrimination and Its Application to AFVs

      A prime deployment issue is whether an autonomous weapon system is capable of adequate target discrimination. Combatants are required to observe the principle of "distinction" (i.e., discrimination), (18) which prohibits 1) the use of weapon systems that indiscriminately strike both lawful and unlawful targets, and 2) the indiscriminate use of a weapon regardless of its accuracy. (19) A conventional weapon system need only be designed in a way that places the burden on the operator to employ it discriminately. An autonomous weapon system, on the other hand, must comply with both facets, since it both selects and strikes its targets.

      AFV target selection already falls along a spectrum between full human and full machine responsibility. As autonomous weapon systems have become more sophisticated, the extent to which a human or a machine exercises the principle of distinction has begun to shift. For example, in traditional automated weapon systems such as land mines, the principle of discrimination was exercised by the military commander through the placement of mines in marked locations or in locations where they were unlikely to be triggered by civilians. (20) In contrast, those deploying a system such as the Harpy, which patrols a broad geographic area, cannot rely on the absence of civilians from its targeting area as a means of discrimination. To meet the distinction requirement, a new autonomous weapon system must have an effective means of distinguishing civilian from military targets. The minimum technical requirements for distinction in any autonomous system can be met only at a significant price, potentially requiring sophisticated targeting sensors and the software and computing power to process the sensor data fully and immediately. (21)
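
      In engineering terms, the distinction obligation ultimately reduces to decision logic that consumes sensor output and withholds fire whenever identification is uncertain. The following is a purely illustrative sketch in Python, using hypothetical class names and an assumed confidence threshold rather than the logic of any fielded system, of how such a gate might be structured:

        # Illustrative sketch only: hypothetical names and thresholds,
        # not the targeting logic of any actual weapon system.
        from dataclasses import dataclass

        @dataclass
        class SensorContact:
            classification: str      # e.g., "armored_vehicle", "civilian_vehicle", "unknown"
            confidence: float        # classifier confidence in [0.0, 1.0]
            protected_nearby: bool   # protected persons or objects within the effects radius

        MILITARY_CLASSES = {"armored_vehicle", "artillery", "radar_emitter"}
        CONFIDENCE_FLOOR = 0.95      # assumed policy threshold, not a legal standard

        def may_engage(contact: SensorContact) -> bool:
            """Authorize engagement only when the contact is confidently identified
            as a military objective and no protected persons or objects are at risk."""
            if contact.classification not in MILITARY_CLASSES:
                return False         # unknown or civilian contact: do not engage
            if contact.confidence < CONFIDENCE_FLOOR:
                return False         # doubt is resolved against engagement
            if contact.protected_nearby:
                return False         # withhold and refer the decision to a human operator
            return True

      Every additional check in such a gate implies additional sensors, data, and processing, which is precisely where the economic cost of compliance accumulates.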

    2. Command Responsibility and Its Application to AFVs

      Commanders bear responsibility for the actions of their troops even when those troops act outside the commander's orders. (22) In 1946, in In re Yamashita, the United States Supreme Court cited the Annex to the Hague Convention of 1907 for the principle that an armed force must be "commanded by a person responsible for his subordinates" to be accorded the rights of lawful belligerents. (23) As part of these responsibilities, a military commander has several important duties, the breach of which constitutes a war crime. (24) Three aspects of a commander's responsibility are particularly implicated by autonomous weapon systems: the duty to train troops in the laws of war, the duty to control troops, and the duty to monitor and punish.

      Commanders are responsible for ensuring that their troops are trained in the Law of Armed Conflict. Under Article 47 of Geneva Convention I of 1949, contracting nations have an obligation to include study of the Convention in their programs of military instruction. (25) In the case of AFVs, the obligation to properly train is effectively replaced by an obligation to properly program. AFVs differ significantly from regular troops, however, in that a greater investment of resources in programming and processing necessarily increases an AFV's ability to implement the Law of Armed Conflict. Commanders and their lawyers, of course, will require compliance training more than ever. (26)

      A commander also has a duty to control subordinates; otherwise their crimes may be imputed to the commander. (27) With human combatants, this obligation is generally fulfilled through Rules of Engagement. (28) To satisfy it with respect to AFVs, commanders need a means to reprogram (in effect, retrain) and to disable a malfunctioning AFV. These additional requirements further increase the cost of a weapon system. To reprogram an AFV, a commander must, at least for the foreseeable future, be able to impose conduct requirements specific to the area of operations. To disable one, the AFV designer will need not only to incorporate something like a "kill-switch," but also to invest in security measures that prevent the enemy from activating the switch. An alternative approach might require the machines periodically to check in or return to base; that, however, might require the purchase of a larger number of AFVs than would otherwise be necessary. (29)
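
      By way of illustration only, the security measure accompanying such a "kill-switch" might amount to cryptographic authentication of the disable command. The sketch below, in Python and using a hypothetical pre-shared key and replay window (it is not drawn from any fielded design), shows one minimal way the requirement could be met:

        # Illustrative sketch only: a disable ("kill-switch") command authenticated
        # with a pre-shared key so an adversary cannot forge or replay it.
        import hashlib
        import hmac
        import time

        SHARED_KEY = b"key-provisioned-before-deployment"   # hypothetical key material
        MAX_AGE_SECONDS = 30                                 # assumed replay window

        def sign_disable_command(issued_at: float) -> bytes:
            message = f"DISABLE|{issued_at}".encode()
            return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

        def accept_disable_command(issued_at: float, tag: bytes) -> bool:
            """Disable only if the command carries a valid tag and is recent enough
            that a captured message cannot simply be replayed later."""
            expected = sign_disable_command(issued_at)
            if not hmac.compare_digest(expected, tag):
                return False          # forged or corrupted command
            if time.time() - issued_at > MAX_AGE_SECONDS:
                return False          # stale command: possible replay
            return True

      The point for policymakers is that the marginal hardware cost of such a switch is trivial; key management, testing, and assurance that the enemy cannot exploit it are where the real expense lies.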

      Finally, a commander is responsible for investigating violations of the Law of Armed Conflict of which she is or should have been aware. (30) Under the "should have known" standard, a commander need not have specific knowledge that a crime has been committed and can be held liable for ignoring violations of the Law of Armed Conflict by her troops. (31) Therefore, any commander who deploys AFVs must have some method of monitoring their actions and behavior to ensure that they are not violating the Law of Armed Conflict, and a means of verification should any general information arise suggesting misconduct. Accordingly, AFV designers must include a recording mechanism that is available for inspection. Current United Kingdom military doctrine, for example, recognizes a duty to include recording and information-transmission systems in AFVs that operate autonomously for extended periods, so that commanders can monitor the AFVs' activity. (32)
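
      The recording obligation can likewise be stated in engineering terms: the log must be available for inspection and must resist after-the-fact alteration. As a purely illustrative sketch in Python (the field names and chaining scheme are assumptions, not a description of any national system), a hash-chained log makes any edited or deleted entry detectable on review:

        # Illustrative sketch only: a hash-chained engagement log in which
        # altering or removing any entry breaks every later digest.
        import hashlib
        import json

        def append_entry(log: list, event: dict) -> None:
            previous_digest = log[-1]["digest"] if log else "0" * 64
            payload = json.dumps(event, sort_keys=True)
            digest = hashlib.sha256((previous_digest + payload).encode()).hexdigest()
            log.append({"event": event, "digest": digest})

        def verify_log(log: list) -> bool:
            """Recompute the chain; a mismatch anywhere means the record was altered."""
            previous_digest = "0" * 64
            for entry in log:
                payload = json.dumps(entry["event"], sort_keys=True)
                expected = hashlib.sha256((previous_digest + payload).encode()).hexdigest()
                if entry["digest"] != expected:
                    return False
                previous_digest = expected
            return True

      Such a record serves the commander's duty to monitor and, should general information of misconduct arise, supplies the means of verification discussed above.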

      The governing principles described above are immutable. The conflict we discuss here is how these principles fare in the face of a range of economic realities governing states' conduct. Indeed, even the richest state must be able...
