Automated Warfare

Author: Lucas, George R., Jr.

In this Article, I review the military and security uses of robotics and "unmanned" or "uninhabited" (and sometimes "remotely piloted") vehicles in a number of relevant conflict environments that, in turn, raise issues of law and ethics that bear significantly on both foreign and domestic policy initiatives. My treatment applies to the use of autonomous unmanned platforms in combat and low-intensity international conflict, but also offers guidance for the increased domestic uses of both remotely controlled and fully autonomous unmanned aerial, maritime, and ground systems for immigration control, border surveillance, drug interdiction, and domestic law enforcement. I outline the emerging debate concerning "robot morality" and computational models of moral cognition and examine the implications of this debate for the future reliability, safety, and effectiveness of autonomous systems (whether weaponized or unarmed) that might come to be deployed in both domestic and international conflict situations. Likewise, I discuss attempts by the International Committee for Robot Arms Control (ICRAC) to outlaw or ban the use of autonomous systems that are lethally armed, as well as an alternative proposal by the eminent Yale University ethicist, Wendell Wallach, to have lethally armed autonomous systems that might be capable of making targeting decisions independent of any human oversight specifically designated "mala in se" under international law. Following the approach of Marchant et al., however, I summarize the lessons learned and the areas of provisional consensus reached thus far in this debate in the form of "soft-law" precepts that reflect emergent norms and a growing international consensus regarding the proper use and governance of such weapons.

  I. CONCEPTUAL FOUNDATIONS OF ETHICS & LAW FOR UNMANNED SYSTEMS
  II. DEVELOPING APPROPRIATE HYPOTHETICAL CASE STUDIES
  III. UNDERLYING PHILOSOPHICAL CONSIDERATIONS
  IV. MORAL AND LEGAL IMPLICATIONS OF A LESS COMPLEX RESEARCH AGENDA
  V. ETHICAL PRINCIPLES FOR UNMANNED SYSTEMS RESEARCH AND DEPLOYMENT POLICY
  CONCLUSION

I. CONCEPTUAL FOUNDATIONS OF ETHICS & LAW FOR UNMANNED SYSTEMS

    The period from 2007 to 2013 witnessed an enormous outpouring of work devoted to the ethical and (far less frequently) the legal implications of military robotics. The inspiration for these studies stemmed from both the tremendous advances in the technologies themselves and the consequent dramatic increase in their roles in the conduct of military conflicts in many regions of the world.

    These studies encompass Australian philosopher Robert Sparrow's inaugural essay, Killer Robots, and a subsequent, similarly titled book by Armin Krishnan. (1) A detailed and path-breaking survey of the ethical dilemmas posed by the increased use of such technologies prepared for the U.S. Office of Naval Research (ONR) by renowned computer scientist and roboticist, George Bekey, and his philosophy colleagues Patrick Lin and Keith Abney at California Polytechnic State University (2) heralded, in turn, the widely read and enormously influential treatment of the emerging ethical challenges and foreign policy implications of military robotics, Wired for War, by Brookings Institution senior fellow Peter W. Singer. (3)

    The vast majority of these works focus on the ethical ramifications attendant upon the increased military uses of robotic technologies, reflecting the relative lack of attention by legal scholars to the status of robotics in international law. The current status of domestic and international law governing robotics, however, together with a range of proposals for effective future governance of these technologies, was recently articulated by Arizona State University Law Professor, Gary Marchant, and several colleagues in the Consortium on Emerging Technologies, Military Operations, and National Security (CETMONS). (4) The legal and moral implications of military robotics constituted the main focus of a special issue of the Journal of Military Ethics (5) and of subsequent anthologies edited by Lin, Abney, and Bekey and, most recently, Bradley J. Strawser, along with a book-length treatment of aerial robotics ("drones") from an operational perspective by Colonel M. Shane Riza, USAF. (6)

    It is worth pausing to reflect on what we have learned about the legal, moral, and policy implications of these trends as a result of these numerous and substantial efforts. First, while the technologies themselves are designed to operate in the domains of air, land, and sea, as well as in space, the majority of the discussion has centered on "unmanned" or remotely piloted, lethally armed aerial platforms (such as Predators and Reapers). That in turn stems from the highly effective use of these aerial platforms in surveillance operations, sometimes resulting in "targeted killing" of selected high-value adversaries by the United States and its allies. Indeed, it is often difficult to disentangle the discussions of aerial robotic technologies either from their controversial tactical deployment in such operations, or from the long-term strategic consequences of America's willingness to engage in such tactics. The tactical uses and strategic consequences of these policies involving unmanned systems, however, are quite distinct from the moral dilemmas posed by the vastly wider development and use of these systems themselves.

    It is particularly unfortunate that the otherwise important policy discussions and moral debates surrounding the longstanding practice of targeted killing tend to obscure the fact that some of the most effective and justifiable uses of military robotics have been in unarmed ground operations, ranging from exploration of booby-trapped caves in Tora Bora, to battlefield rescue and casualty extraction, to dismantling IEDs or assisting in humanitarian relief operations. (7) Meanwhile, some of the most promising future developments in military robotics will likely be realized in the maritime and underwater environment (in surface combat or anti-submarine warfare, for example), as well as when some of these systems return home from the warfront and are employed in a variety of domestic or regional security operations (border security, immigration control, countering drug and human trafficking or kidnapping, or disaster response and relief following hurricanes, floods, earthquakes, and massive wildfires) to which scant attention has thus far been paid (apart from implied threats to individual privacy).

    During the Cold War, for example, it was often the case that submarines from rival superpowers engaged in intelligence, surveillance, and reconnaissance (ISR) missions in contested waters, keeping an eye on the adversary's movements and interests in a particular sector, and perhaps playing games of "cat and mouse" and even "chicken," to assess everything from noise detection to the maneuverability and other technical capabilities of different models of submarines as well as to test the effectiveness of coastal defenses. With the demise of the superpower rivalry and the Cold War, however, it has been some time since any naval force could routinely expend the resources necessary to continue such Tom Clancy-like, macho underwater scenarios. They are simply too risky and resource intensive.

    Our strategic focus, meanwhile, has shifted from the Atlantic to the Pacific, and from Russia to China, as our treaty partners like Japan, South Korea, and the Philippines contend with one another and with the Chinese mainland for control of resource-rich areas of the South China Sea. Here a more typical scenario would involve an underwater ISR mission near the Diaoyu/Senkaku Islands, carried out by the United States in support of the interests of one of our principal allies, like Japan or South Korea. That operation today can be more efficiently and cost-effectively undertaken by deploying a single manned vessel as an ISR command center, equipped with a variety of unmanned underwater vehicles (UUVs), each programmed to operate semi-autonomously in carrying out a series of task-oriented maneuvers in much the same way, and even following much the same command or decision-tree script, that human commanders would have followed in an earlier era.

    In an underwater "runtime environment," for example, robots behave, or are programmed to behave, with about the same degree of autonomy as a human commander of an individual manned ISR platform: that is, the operational or mission orders are for either type of vehicle to search, find, report, and either continue the mission or return to the command center (or to another specified rendezvous point). (8) This kind of mission can prove to be dull, dirty, routinely dangerous (for the manned platform), and certainly boring, until, that is, an adversary's submarine is observed carrying out exploratory mining surveys on the ocean floor. In a plausible "war-game" scenario, we might posit that the adversary then attempts to evade detection by fleeing into a prohibited marine sanctuary under the administrative control of yet another party to the dispute (e.g., the Philippines). The hypothetical semi-autonomous UUV would then face exactly the same legal and moral dilemma as would confront the human commander of a conventional manned submarine under otherwise-identical circumstances: namely, to continue the military mission of tracking the enemy or to refuse to violate international law and norms by declining to enter these prohibited "no-go" waters. Standard operating procedures and "standing orders" defining the mission would require the human commander to "contact operational headquarters" for clearance, or else discontinue the mission. The UUV can relatively easily be programmed to do likewise, thereby incorporating the constraints of law and morality within the parameters of the rules of engagement governing this well-defined (and what I have elsewhere termed "highly scripted") mission. (9)
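    The decision logic of such a "highly scripted" mission can be illustrated with a brief sketch. What follows is a hypothetical illustration only, not any actual UUV control software: the state names, the geofence check, and the clearance-request step are assumptions invented for the example. It simply encodes the search-find-report script and the rules-of-engagement branch described above.

```python
# Hypothetical sketch of a "highly scripted" UUV mission decision tree.
# All names (MissionState, is_prohibited_zone, zone.contains, contact.position)
# are invented for illustration and assume simple position/geofence objects.

from enum import Enum, auto


class MissionState(Enum):
    SEARCH = auto()
    TRACK = auto()
    AWAIT_CLEARANCE = auto()
    RETURN_TO_COMMAND = auto()


def is_prohibited_zone(position, sanctuary_boundaries):
    """Assumed geofence check: True if the position lies inside any
    designated no-go area (e.g., a protected marine sanctuary)."""
    return any(zone.contains(position) for zone in sanctuary_boundaries)


def next_action(state, contact, sanctuary_boundaries, clearance_granted=None):
    """One step of the scripted decision tree: search, find, report, and
    either continue the mission or return to the command center."""
    if state is MissionState.SEARCH:
        if contact is None:
            return MissionState.SEARCH, "continue_search_pattern"
        # Contact found: report to the command center and begin tracking.
        return MissionState.TRACK, "report_contact"

    if state is MissionState.TRACK:
        if contact is None:
            return MissionState.SEARCH, "contact_lost_resume_search"
        if is_prohibited_zone(contact.position, sanctuary_boundaries):
            # Rules-of-engagement constraint: do not enter no-go waters on the
            # platform's own authority; request clearance from headquarters.
            return MissionState.AWAIT_CLEARANCE, "request_clearance"
        return MissionState.TRACK, "continue_tracking"

    if state is MissionState.AWAIT_CLEARANCE:
        if clearance_granted is True:
            return MissionState.TRACK, "resume_tracking_with_clearance"
        if clearance_granted is False:
            return MissionState.RETURN_TO_COMMAND, "discontinue_mission"
        return MissionState.AWAIT_CLEARANCE, "hold_at_boundary"

    return MissionState.RETURN_TO_COMMAND, "proceed_to_rendezvous"
```

    On this sketch, the constraint of law and morality enters not as machine "judgment" but as an explicit branch in the mission script, mirroring the standing orders that would bind the human commander of a manned platform in the same circumstances.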

    This seems a clear and...
