The Case for Regulating Fully Autonomous Weapons

Author: Lewis, John
Position: Comment

On April 22, 2013, organizations across the world banded together to launch the Campaign to Stop Killer Robots. Advocates called for a ban on fully autonomous weapons (FAWs), robotic systems that can "choose and fire on targets on their own, without any human intervention." (1) Though no such weapon has been fully developed, (2) the campaign has gained momentum and attracted the support of international bodies, (3) activists, (4) and scientists. (5) In May 2014, just a year after the campaign began, the United Nations Convention on Certain Conventional Weapons met to debate whether a ban on FAWs is warranted. (6) In pressing their case, activists cited the Ottawa Treaty, which banned virtually all anti-personnel landmines, (7) as a model for full-scale prohibition. (8)

This Comment takes a different lesson from landmines. Drawing on the 1996 Amended Protocol on the use of landmines (9) and other case studies in international law, I argue that regulation, rather than an outright ban, would likely be more effective in ensuring that FAWs comply with international law. This argument begins from the premise that the best approach to FAWs is the one most likely to reduce human suffering. I contend that FAWs are amenable to regulation and that, as a practical matter, regulation is more likely than a ban to induce compliance from countries such as the United States, China, and Russia. Ultimately, I argue that in regulating these weapons systems, nations may well be able to create an administrable legal regime for a new technology of war.

The Comment proceeds in three Parts. Part I defines FAWs and introduces the legal and ethical issues surrounding these weapons. Part II draws on the history of attempts to regulate weapons systems, including landmines, to explain why regulation is the correct response to FAWs. Part III develops a framework based on the Amended Protocol to guide the use of FAWs. Though it is difficult to develop standards for such novel weapons, the momentum around a preemptive ban makes it important to consider whether regulation might instead be an effective response to FAWs. By demonstrating that existing frameworks are capable of regulating FAWs, this Comment aims to integrate FAWs into current debates in international law and to dispel the notion that these weapons raise wholly unique legal challenges.

  I. DEFINING AND CRITIQUING FULLY AUTONOMOUS WEAPONS

    The definition of a FAW hinges on the distinction between automated and autonomous weapons. Automated weapons have certain automated features, such as an autopilot function, and are common in many militaries. The U.S. Navy, for example, can land unpiloted drones on moving aircraft carriers, (10) while both South Korea and Israel have deployed automated sentry guns capable of selecting targets and alerting a human operator, who makes the decision to fire. (11) Autonomous weapons, in contrast, "can select and engage targets without further intervention by a human operator." (12) Autonomous weapons, in other words, require no human input--once activated, they possess the power to fire on their own. (13) Any weapon that possesses this essential characteristic should be considered a FAW.

    Like any weapons system, FAWs must be designed and deployed in accordance with international law. Under Article 36 of Additional Protocol I to the Geneva Conventions, (14) nations must review new weapons systems to ensure that they are not indiscriminate by nature or likely to cause unnecessary injury. (15) Furthermore, when using a particular weapon, a combatant must apply the familiar rules of international humanitarian law: she must distinguish between combatants and civilians (16) and avoid collateral damage disproportionate to the military objective. (17) Finally, weapons must be consistent with the Martens Clause, which is found, among other places, in Article 1 of Additional Protocol I to the Geneva Conventions, and is understood to represent customary international law. (18) The Martens Clause stipulates that weapons must comply with the "dictates of public conscience," a nebulous requirement that, read broadly, may render weapons that offend public opinion impermissible. (19)

    It is difficult to evaluate the legality of FAWs under these frameworks because such systems are in their infancy. At present, there are two main strands of criticism levied against FAWs. First, there is a legal critique, which questions whether FAWs can comply with the principles of distinction and proportionality, and whether anyone can be held responsible when FAWs violate international law. For example, Human Rights Watch (HRW) argues that FAWs will never be able to distinguish between combatants and civilians as well as a human soldier can. (20) Groups like HRW contend that machines cannot be designed with human qualities, such as emotion and ethical judgment, which are important to the decision to take a life. (21) Opponents also argue that if a FAW violates international law, it is unclear whom to hold responsible--the machine, the commander who authorized its use, or the manufacturer who designed it. Critics charge that, without someone to sanction, international law's deterrent function is weakened, making violations more likely. (22)

    Second, opponents level an ethical, deontological criticism of FAWs: it is simply wrong to remove humans from the process of killing. According to this line of reasoning, the decision to kill must be reserved for a human decision maker, whatever the ultimate consequences of widespread FAW usage. A related line of critique emphasizes that FAWs, like drones, put distance between human decision makers and the reality of war, thereby desensitizing decision makers to the consequences of their actions and making future wars more likely by reducing their human cost. (23)

    Opponents of FAWs invoke each of these critiques to argue in favor of an absolute, preemptive ban on FAWs. (24) In their view, once nations have developed FAW technology, it will be difficult to get them to stop. (25) The time to act is now, before the military utility of FAWs has been demonstrated and before other countries develop FAW technology in response. These arguments resonate with apocalyptic images, replete in pop culture, of killer robots run amok--think Terminator. (26) Yet they also exaggerate the danger posed by FAWs and underappreciate regulation's potential to respond to the threats that FAWs do pose.

  II. THE ARGUMENT FOR REGULATION, RATHER THAN PROHIBITION

    This Comment begins from the proposition that the purpose of international humanitarian law is to minimize harm, understood in terms of suffering--primarily to civilians, but also to combatants. (27) Many have argued that public policy should be guided by consequentialist aims, given, among other things, the differences between individuals and states (28) and the inevitability of trade-offs in policymaking. (29) Gabriella Blum contends that the argument for consequentialism is particularly strong in the case of armed conflict, which "is about committing evils and choosing between evils." (30) Following Blum's logic, this Part brackets the deontological critique of FAWs--understood as the view that the use of FAWs is wrong independently of its consequences--and focuses on the possibility of regulatory regimes that minimize suffering in practice. While the deontological critique of FAWs presents a serious challenge, it loses much of its force if the responsible use of FAWs can reduce harm. (31)

    A consequentialist approach focused on minimizing harm also makes less compelling the objection that the use of FAWs reduces accountability. While the "autonomous" nature of FAWs appears to distance decision makers from the harms they inflict, commanders remain responsible for the initial use of FAWs. A commander must give the order to deploy a FAW and set parameters for its use--for example, by instructing that a FAW has X mission and must operate within Y area. In this sense, there is no such thing as a fully autonomous weapon. Any weapon will require human intervention at some point, if only to activate it. The commander is ultimately responsible for using a FAW within its programming and within legal limits. If humans must remain an integral part of the decision to take a life in order for a weapon to fulfill the condition of accountability, then FAWs satisfy this requirement.

    I focus, therefore, on the question whether, as a legal matter, FAWs can be regulated in ways that minimize the suffering that they cause. This Comment argues that they can, for two reasons. First, FAWs are highly amenable to regulation. As quasi- but never fully autonomous systems, FAWs are ripe for a regulatory scheme that provides standards for permissible usage and holds commanders accountable. Second, given the potential military utility of FAWs, states are more likely to comply with regulations than with an absolute prohibition. This point matters because, even if the critics are correct that FAWs will always violate international law, they are wrong to think that prohibition will avert these harms.

    A. FAWs May Be Used Lawfully

      Whether FAWs can be deployed lawfully depends in part on whether they can be used in a manner that avoids civilian casualties. Given the pace of technological development, it is too early to say that a FAW could never make the contextual and difficult decisions that soldiers must make when distinguishing between combatants and civilians. (32) As George R. Lucas, Jr. argues, the critical question is not whether FAWs can "be ethical," but whether they can perform at the level of a human soldier. (33) Human soldiers are not perfect. (34) Indeed, robots may have a number of advantages over humans, including superior sensory and computational capabilities, a lack of such emotions as fear and anger, and the ability to monitor and report unethical behavior on the battlefield. (35) A robot might also have access to greater...
