170 MILITARY LAW REVIEW [Vol. 218
AUTONOMOUS WEAPONS AND THE LAW OF ARMED CONFLICT
CADET ALLYSON HAUPTMAN*
Control. Human beings have an innate, insatiable desire to control
the world around them. Much of this desire comes from a sense of self-
preservation embedded in the human subconscious. Thus, it is counterintuitive that humans are also obsessed with automation. We want our
gadgets to cook, clean, read, dictate, count, and solve problems for us.
Now, we must decide if we want them to fight for us as well. While
most international prohibitions on weapons specifically prohibit what
weapons do, the issue of automation raises a fundamentally different
concern. The issue is not what effect a weapon can achieve but, rather,
how it achieves effects in a way that does not transgress the fundamental
principles of the Law of War (LoW).
The discussion about how LoW should address autonomous weapons
is overdue. These weapons already exist, at least to the point of being
mostly autonomous. The Department of Defense (DoD) defines an
autonomous weapon system as one that, “once activated, can select and
engage targets without further intervention by a human operator.”1 This
article uses the terminology “in-the-loop,” “on-the-loop,” and “out-of-
the-loop” to describe the human role in a system’s ability to acquire and
attack a target. Under this terminology, in-the-loop systems require a
human to actively engage a target; on-the-loop systems can engage a
target autonomously but can be stopped by a human operator; and out-of-the-loop systems act completely without human input.
Recent media stories have highlighted the viewpoints of anti-automation activists who maintain that banning autonomous weapons entirely
* Cadet, U.S. Military Academy at West Point; Rhodes Scholarship finalist (2013);
Rotary Scholar; Battalion Commander, U.S. Corps of Cadets; Policy & Doctrine, Fort
Meade, Maryland; Judge Assistant, Northampton District Court, New York.
1 U.S. DEP’T OF DEF., DIR. 3000.09, AUTONOMY IN WEAPON SYSTEMS 13 (21 Nov.
2012). This definition also includes human-supervised systems that allow operators to
override the system. This article refers to the “law of war” (LoW) and “law of armed
conflict” (LOAC) interchangeably.
would solve any possible problems.2 Yet, it is difficult to convince
innovators to abandon new and exciting technologies. Autonomous
technology and its attendant issues already exist, and the international
community must decide how to govern the technology’s development
and use as it progresses. This article will begin by outlining the
principles of the law of armed conflict (LOAC). It will then examine the
laws governing weapons. Next, it will review existing and developing
autonomous weapons technology, and finally, it will explore the moral
principles relevant to determining whether autonomous weapons can
comply with those laws. Ultimately, the article concludes that, until
technology advances enough to mirror human decision-making
processes, humans must remain a part of the “kill chain,” but that
autonomous weapons capable of following LOAC remain possible.
II. Legal Foundation
A. The Four Principles
The LOAC revolves around four core principles: distinction,
proportionality, military necessity, and unnecessary suffering. Because
distinction and proportionality are the most germane to reviewing the
capabilities of robotic systems, they will be discussed at length below.
At present, a human’s decision to employ robotic systems
would presumably account for the principles of military necessity and
unnecessary suffering, although there may come a time when these
higher-level decisions could be automated. Robots developed in the
foreseeable future would account for the military necessity principle
through the human decision on how to program and when to deploy a
robot, and the unnecessary suffering principle would be incorporated into
the human decision of how to arm the robot. Still, it is useful to
introduce these terms.
2 Brid-Aine Parnell, Killer Robots Could Be Banned by the UN Before 2016, FORBES (18
Nov. 2013), available at http://www.forbes.com/sites/bridaineparnell/2013/11/18/killer-robots-could-be-banned-by-the-un-before-2016/. Multiple advocacy groups, such as Article
36, have lobbied the United Nations (UN) to add autonomous weapons to next year’s
Convention on Certain Conventional Weapons agenda. Article 36 is one of forty
organizations involved in the Campaign to Stop Killer Robots, aimed at banning fully