Autonomous weapons and human responsibilities.

Author: Jack M. Beard
I. INTRODUCTION
II. THE TECHNOLOGY OF AUTONOMOUS WEAPONS AND THE DIMINISHING ROLE OF HUMANS
    A. Autonomy, Artificial Intelligence, and the Inexorable March of "Smart" Machines
    B. A Continuum of Autonomy in Military Technologies
        1. "Fire-and-Forget" Weapons
        2. Autonomous Defense Systems
        3. The Approaching Horizon: Autonomous Combatant Systems
III. THE SEARCH FOR A FRAMEWORK TO GOVERN AUTONOMOUS WEAPON SYSTEMS
IV. THE SEARCH FOR INDIVIDUAL RESPONSIBILITY FOR THE ACTIONS OF AUTONOMOUS WEAPONS
    A. The International Criminal Law Framework
    B. Autonomous Weapon Systems and the Search for Criminal Culpability
        1. Manufacturers, Designers, Engineers, and Programmers
        2. Military Personnel and the Employment of Complex Machines
        3. The Question of Command Responsibility
        4. Blame Everyone? Blame the Machine?
V. STATE RESPONSIBILITY FOR AUTONOMOUS MACHINES AND THE PROBLEMS OF DISTINCTION AND PROPORTIONALITY
    A. Human Judgment Along the Continuum of Autonomy
    B. Higher End of the Continuum: The Uncharted Legal Territory of the Future
    C. Human Judgment, Practical Controls, and Legal Accountability
VI. CONCLUSION

I. INTRODUCTION

    Although remote-controlled robots flying over Afghanistan, Pakistan, Yemen, and other countries may now dominate reports on new military technologies, robots that are capable of hunting and killing enemies on their own are quietly but steadily moving from the theoretical to the practical. Early versions of these weapon systems are already widely deployed by military forces around the world, and more advanced ones are on their way. U.S. military officials view such machines as a crucial part of their future fighting forces. (1) Well-funded efforts are thus underway in the United States and other countries to build a wide variety of robots that are designed to "think, see and react increasingly like humans." (2) The results of these efforts are new generations of weapon systems that display greater and greater levels of autonomy.

    The advent of autonomous war-fighting machines has raised various concerns in the international community and now increasingly generates objections from international and non-governmental organizations. In a Report to the U.N. Human Rights Council, the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Mr. Christof Heyns, argued that the deployment of lethal autonomous robots (LARs) "may be unacceptable because no adequate system of legal accountability can be devised, and because robots should not have the power of life and death over human beings." (3)

    Fearing a future in which "killer robots" threaten humanity, several non-governmental organizations have issued calls for autonomous weapon systems to be outlawed, with one group's leader stating that "[o]ur concern is that humans, not sensors, should make targeting decisions." (4) However, many existing weapon systems already have sensors that "make targeting decisions." As discussed below, robots under the sea already detect, identify, and fire torpedoes at enemy vessels on their own; on the surface of the sea and on the land, various sophisticated weapon systems autonomously engage missiles, aircraft, ships, and an increasing variety of other targets; and in the skies above, unmanned combat aerial vehicles are being tested with the goal of replacing manned fighter-bomber aircraft and ultimately conducting missions in an autonomous mode.

    For clarity in assessing the implications of existing, emerging, and future military technologies, this Article focuses on lethal military machines and weapon systems that may be described as "autonomous" to the extent that they have the ability (in varying degrees) to sense, control, and act without external human intervention or control. (5) The official definition used by the U.S. Department of Defense (DoD) refines this concept a bit further, stating that an autonomous weapon system is one that, "once activated, can select and engage targets without further intervention by a human operator." (6)

    For a wide variety of reasons, autonomous weapon systems are the next logical and seemingly inevitable step in the continuing evolution of military technologies. (7) New and ever-more sophisticated versions of autonomous military machines are, in fact, being so rapidly developed and deployed that they risk outpacing efforts to evaluate interrelated legal, ethical, and societal concerns. Foremost among these concerns are questions about who or what will be accountable for the damaging actions of these weapons in armed conflicts, particularly when they harm civilians. This Article assesses these problems of accountability in the context of state responsibility and individual culpability for actions of autonomous weapons, particularly under the International Humanitarian Law (IHL) framework, also referred to as the Law of Armed Conflict and the Law of War.

    Many of the risks, dangers, and challenges of future autonomous weapon systems are already present in existing, widely deployed systems. Rather than a rampage by rogue robots in some futuristic Hollywood production, the real threat presented by these systems comes in the form of a slow, creeping, and continuous movement toward autonomous war-fighting capabilities in increasingly complex technological conflicts. Machines and computers continue to take on ever more important roles in all aspects of these conflicts, and as they do, the precise level of human involvement in or control over them continues to diminish and grow more uncertain.

    This diminishing level of human control will continue to raise increasingly difficult questions about both state and individual accountability for the actions of autonomous weapon systems. While state and individual accountability involve different legal regimes, they sometimes share some key components, particularly in applying the IHL framework to determine whether states, through their military forces and commanders, have violated key obligations designed to protect civilians and civilian objects. These similar but different dimensions of the IHL framework are particularly important in giving full meaning to the fundamental obligations incumbent on states and military commanders to distinguish between targets by attacking only military objectives and to attack in such a way as to avoid excessive civilian casualties.

    Setting aside moral, ethical, and broad societal concerns to focus on the legal accountability of states and underlying human responsibilities for autonomous weapons, this Article argues that the central legal requirement relevant to determining such accountability for violating the most important IHL obligations protecting the civilian population (relating to discrimination and proportionality) is a meaningful connection to the effective exercise of human judgment. This piece further argues that the elusive search for individual culpability for the actions of autonomous weapons foreshadows fundamental problems in assigning responsibility to states for the actions of these machines, pointing inescapably to a legal requirement for human judgment in applying the most complex tests under the IHL framework. Lastly, it argues that access to human judgment already appears to be emerging as the deciding factor in establishing practical restrictions and framing legal concerns with respect to the deployment of the most advanced autonomous weapons.

    Part II of this Article provides a brief overview of autonomous weapons, their underlying technologies, the inevitable growth of these systems and their missions, and the ever-diminishing role of humans across a spectrum or continuum of autonomy. Part III assesses the search for a framework to govern autonomous weapon systems and examines key issues related to the application of IHL obligations to these systems. Part IV explores the extraordinarily challenging search for individual (and human) responsibility for the actions of autonomous weapons. As noted, while state and individual human responsibilities involve some distinctly different issues, the search for criminal responsibility nonetheless provides important perspectives on key challenges associated with state responsibility for the actions of autonomous machines.

    Part V explores the problem of state responsibility for ensuring that autonomous weapons comply with IHL obligations, focusing particularly on the observance of the cardinal IHL principle of distinction (or discrimination) and the related requirement of proportionality. This assessment illustrates how the legal requirement of human judgment--as distinct from ethical or moral requirements--underlies both individual and state responsibilities under the IHL framework for compliance with its most important principles. In making this assessment, Part V draws on the examination of existing and future autonomous military technologies discussed in Part II in order to identify the legal connection, or lack thereof, between human judgment and lethal machines along a continuum of autonomy. Part VI concludes with an examination of the role that human judgment may continue to play in establishing practical, and ultimately legal, barriers to the deployment of autonomous weapons as they incrementally proceed to higher and higher levels of autonomy.


II. THE TECHNOLOGY OF AUTONOMOUS WEAPONS AND THE DIMINISHING ROLE OF HUMANS

    A. Autonomy, Artificial Intelligence, and the Inexorable March of "Smart" Machines

      To describe a machine as truly autonomous raises serious philosophical questions about the nature of humans and machines that lie beyond the scope of this work. The most developed definitions of autonomy encompass concepts such as complete self-governance and the ability to make decisions as a free and independent moral agent. In the narrower context of existing and emerging military weapon systems, autonomy can be said to describe "the capacity of a machine to operate in the real-world environment without any form of external (human)...
