Autonomous Weapons: How Existing Law Can Regulate Future Weapons

Vol. 34, No. 2 (2020)



Charles P. Trumbull IV*


Introduction

Swarms of miniature drones assassinate U.S. Senators,1 Terminator-like robots go rogue, and "the Singularity" finally materializes.2 These could be scenes in a science fiction movie, but they have also been the subject of serious discussion among States, academics, and non-governmental organizations (NGOs). In particular, the potential development and use of autonomous weapons systems—which, broadly defined, are weapons systems that can select and engage targets without further human intervention after activation—have generated significant debate over the past decade.3

Inter-governmental discussions of autonomous weapons have occurred primarily at the Convention on Certain Conventional Weapons (CCW). Following the release of a provocative report by Human Rights Watch in 2012,4 the High Contracting Parties to the CCW convened an informal group of experts to discuss various issues related to "lethal autonomous weapon systems" (LAWS).5 After three years of meetings, in 2016 the CCW made these discussions more formal with the establishment of a Group of Governmental Experts (GGE).6

The GGE is deeply divided on how to address autonomous weapons.7 A group of NGOs under the umbrella of the Campaign to Stop Killer Robots, and a growing number of States, have called for a preemptive ban on LAWS or, at a minimum, international regulations on their development and use.8 This opposition to autonomous weapons is hardly surprising. These weapons raise a perfect storm of concerns regarding civilian casualties, accountability gaps, destabilizing arms races, and the ethics of allowing machines to make life-and-death decisions. The U.N. Secretary-General, António Guterres, stated before the General Assembly that the prospect of autonomous weapons "raises multiple alarms" and is "morally repugnant."9

A number of leaders in the field of robotics and artificial intelligence (AI) have also voiced concern about autonomous weapons. An open letter to the CCW, signed by dozens of industry experts, claims that these weapons "threaten to become the third revolution in warfare."10 These weapons, they assert, "will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend."11 Corporations are also entering this debate. In 2018, Google announced that it would not "design or deploy AI in ... [w]eapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people."12

Most major military powers oppose new international regulations on the development or use of autonomous weapons.13 They argue that existing international humanitarian law (IHL)—the legal framework applicable during armed conflicts—is sufficient to appropriately regulate new weapons with emerging technologies.14 States also point to the potential humanitarian benefits of greater autonomy and artificial intelligence in weapons systems. Such weapons may be more discriminate than existing weapons, thus reducing the risk to civilians and friendly forces.15 Finally, it would be exceedingly difficult, and likely counterproductive, to attempt to regulate military use of a technology that is rapidly evolving. Even if States did agree to prohibit such weapons, much of the underlying technology is dual use and being developed by the private sector. A ban would be difficult to verify or enforce.16

The ongoing debate on whether to ban autonomous weapons is unlikely to deter many States from pursuing weapons with increasing degrees of autonomy.17 The United States, Russia, China, and other military powers are investing heavily in AI18 because they believe it will provide competitive advantages.19 Vladimir Putin predicted that whichever country leads the field of AI "will be the ruler of the world."20 In June 2018, the then-Deputy Secretary of Defense wrote, "to preserve and expand our military advantage ... we must pursue AI applications with boldness and alacrity while ensuring strong commitment to military ethics and AI safety."21 In future wars, victory may depend on "the quality of each side's algorithm" rather than on the skill or bravery of a State's armed forces.22 Just as technology has profoundly affected the role of humans in other professions—such as medicine, finance, and transportation23—there is little doubt that advances in autonomy and AI will transform the nature of, and humans' role in, warfare.24

The rules and principles of IHL have thus far been able to adapt to the use of increasingly sophisticated weapons in warfare, such as unmanned aerial vehicles (UAVs).25 The pace of technological advancement and its effect on the conduct of hostilities, however, are rapidly outpacing the more glacial evolution of IHL.26 Advances in AI and machine learning will pose new and more difficult challenges to current interpretations and applications of IHL.27 In particular, the ability of machines to make certain decisions that have traditionally been made exclusively by humans will force States to reconcile IHL's focus on human decision-making with this new technology.

This Article provides the first comprehensive analysis of how fundamental principles of IHL can and should be interpreted and applied in light of rapidly changing advances in warfare and the correlated humanitarian risks. Although humans' role in warfare may be increasingly removed from the physical battlefield, human judgment and decision-making over the use of force remain the focus of IHL. IHL imposes obligations on humans, and these obligations cannot be delegated to machines.28 This Article will accordingly focus on the challenge of using weapons with significant degrees of autonomy consistent with IHL, as well as the challenges for IHL in regulating the use of this new technology. It does not address the more technical and speculative question of whether future weapons could independently make the same judgments that are required of humans in order to comply with IHL.29

This Article proceeds as follows. Part II describes the different levels of autonomy in weapons systems and seeks to clarify some of the ongoing confusion regarding the term "lethal autonomous weapons systems." Part III discusses the military benefits of autonomy, the foreseeable applications of this technology in warfare, and the humanitarian concerns that it poses. Part IV explains why autonomous weapons are not categorically prohibited by IHL. Part V analyzes why autonomy will challenge core understandings of how IHL regulates human decision-making over uses of force. Part VI then turns to how core provisions of IHL—including those relating to weapons development, targeting, and accountability—can be applied to the use of autonomous weapons.

II. What Are Autonomous Weapons Systems?

A significant impediment to substantive discussions of autonomous weapons among States and in the academic literature is the lack of universally agreed terms of reference. The term "lethal autonomous weapons systems" was coined as the title of the GGE and has since been adopted more broadly.30 The term, however, is an artificial and somewhat misleading construct. Weapon systems may incorporate varying degrees of autonomy distributed among a number of different functions. For this reason, it is "helpful to think of autonomy as a spectrum or series of spectrums."31 Paul Scharre, a leading expert on autonomy in warfare, refers to three "dimensions" of autonomy: the human-machine relationship, the type of task to be performed by a machine, and the sophistication of the machine's decision-making capabilities.32
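
These three dimensions can be made concrete with a minimal sketch in Python. The sketch is purely illustrative: the classes, task names, and numeric values are hypothetical assumptions introduced here, not descriptions of any actual weapons system. Its point is only that a single system may combine different degrees of autonomy, for different tasks, under different human-machine relationships.

    # Purely hypothetical sketch: models autonomy as a set of spectrums,
    # loosely following the three "dimensions" described above. No class,
    # task name, or value here describes an actual weapons system.
    from dataclasses import dataclass
    from enum import Enum

    class HumanRole(Enum):
        """Dimension 1: the human-machine relationship."""
        IN_THE_LOOP = "human must approve each engagement"
        ON_THE_LOOP = "human supervises and may intervene"
        OUT_OF_THE_LOOP = "no human intervention after activation"

    @dataclass
    class FunctionAutonomy:
        task: str                # dimension 2: the task the machine performs
        sophistication: float    # dimension 3: decision-making capability (0 to 1)
        human_role: HumanRole

    # One system can distribute different degrees of autonomy across its
    # functions, which is why a single "autonomous or not" label misleads.
    hypothetical_system = [
        FunctionAutonomy("navigate", 0.9, HumanRole.OUT_OF_THE_LOOP),
        FunctionAutonomy("identify_and_track", 0.7, HumanRole.ON_THE_LOOP),
        FunctionAutonomy("engage", 0.2, HumanRole.IN_THE_LOOP),
    ]

On this model, asking whether the system as a whole "is autonomous" obscures the more useful question of which functions are autonomous, to what degree, and under what human supervision.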

This Part discusses the spectrum of autonomy33 in weapons systems, including the defining characteristics of semi-autonomous and autonomous weapons. This Article generally disfavors the term "LAWS" that is used in the CCW discussions, given the confusion regarding what it encompasses and the political baggage that it carries. The term, and the ongoing discussions in Geneva, also tend to oversimplify the complex issues presented by autonomy in weapons by suggesting that it is possible to draw a bright line along this spectrum of autonomy, both in terms of degree and function. After Part II, this Article will generally use the term "autonomous weapons," which may include weapons with varying degrees of autonomy across different functions.34

A. Semi-Autonomous

Semi-autonomous weapons have autonomous functions to identify, track, and maneuver to targets.35 These weapons, however, can "only engage individual targets or specific target groups that have been selected by a human operator."36 Semi-autonomous weapons are often described as having a human "in the loop"—a reference to the Observe, Orient, Decide, and Act (OODA) loop.37 A semi-autonomous weapon cannot complete this loop absent a human decision.38 Semi-autonomous weapons can be used to attack known targets or to identify unknown targets, but they cannot attack unknown targets.39 For example, homing munitions may be used to search for and engage a target within a confined geographic area.40 The homing munition uses autonomous capabilities to identify and track the intended target, but it cannot select unintended targets.41 A semi-autonomous weapon may also be used to identify and track (but not engage) unknown targets.42 For example, the South Korean SGR-A1 sentry gun is a border defense system that uses sensors to identify and track potential targets along the Demilitarized Zone with North Korea.43 Although this weapon system can independently locate potential targets, it is considered semi-autonomous because a human must complete the OODA loop by initiating the use of lethal force.44
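
The human "in the loop" gate can be illustrated with a short, purely hypothetical Python sketch. Every function and data structure below is an assumption invented for illustration, not a real system's interface; the point is only that the engagement step is unreachable unless a human performs the Decide step of the loop.

    # Purely hypothetical sketch of a semi-autonomous control loop. All
    # functions and data are placeholders invented for illustration. The
    # machine may observe, orient, identify, and track on its own, but
    # only a human can complete the OODA loop.

    def observe(sensor_feeds):
        """Collect raw readings from hypothetical sensor feeds."""
        return [feed() for feed in sensor_feeds]

    def orient(readings):
        """Filter readings down to candidate targets (placeholder logic)."""
        return [r for r in readings if r["classified"] == "potential_target"]

    def human_decides(candidate):
        """The human-in-the-loop gate: only an operator may authorize force."""
        answer = input(f"Engage {candidate['id']}? [y/N] ")
        return answer.strip().lower() == "y"

    def act(candidate):
        """Engagement step, reachable only after human authorization."""
        print(f"Engaging {candidate['id']}")

    def semi_autonomous_loop(sensor_feeds):
        for candidate in orient(observe(sensor_feeds)):
            if human_decides(candidate):  # the Decide step is human-only
                act(candidate)

    # Example run with a fake sensor feed:
    semi_autonomous_loop([lambda: {"id": "track-01", "classified": "potential_target"}])

By contrast, a weapon in which machine logic replaced the human_decides gate could complete the loop, and thus select and engage targets, without further human intervention after activation.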


B....
