Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified

Author: Marco Sassòli
Position: Professor of International Law at the University of Geneva, Switzerland, Director of the Department of Public International Law and International Organization, Associate Professor at the Université du Québec, Montreal, Canada and Commissioner within the International Commission of Jurists
Pages: 308–340
 
International Law Studies, Vol. 90 (2014)
Autonomous Weapons and International
Humanitarian Law: Advantages, Open Technical
Questions and Legal Issues to be Clarified
Marco Sassòli*
I. INTRODUCTION
Autonomous weapons are subject to great controversy. They have been
defined as weapon systems1 “that can learn or adapt [their] functioning in
response to changing circumstances in the environment in which [they are]
deployed.”2 Such systems, once developed, should, through sensors that
give them situational awareness, be able to identify both legitimate targets
and hopefully civilians/civilian objects that may potentially suffer incidental
* Professor of International Law at the University of Geneva, Switzerland, Director
of the Department of Public International Law and International Organization, Associate
Professor at the Université du Québec, Montreal, Canada and Commissioner within the
International Commission of Jurists. The author would like to thank Ms. Yvette Issar,
doctoral candidate at the University of Geneva, for her valuable research assistance, criti-
cal remarks and for having revised this text.
1. Hin-Yan Liu, Categorization and Legality of Autonomous and Remote Weapons Systems, 94
INTERNATIONAL REVIEW OF THE RED CROSS 627, 635–36 (2012).
2. International Committee of the Red Cross, International Humanitarian Law and the
Challenges of Contemporary Armed Conflict: Report Prepared for the 31st International Conference of the
Red Cross and Red Crescent 39 (2011), available at http://www.icrc.org/eng/resources/documents/report/31-international-conference-ihl-challenges-report-2011-10-31.htm.
effects of attack. Identification would then trigger corresponding action
through processors or artificial intelligence that would “decide . . . how to
respond . . . and effectors that carry out those ‘decisions.’”3 Ideally, auton-
omous weapons would select and engage targets without ongoing human
intervention in an open environment under circumstances which are un-
structured and dynamic. At present, no weapon system possesses such ca-
pabilities. The absence or presence of human intervention is, however, a
relative distinction, as is the distinction between humans “in,” “on” or “out
of the loop,” which is therefore not very helpful.4 Despite the system’s au-
tonomy, human beings will inevitably be involved, either in overseeing the
operation of the weapon, or at least in producing and programming the
weapon systems. There is agreement that, although these systems do not
yet exist, they could be developed within twenty years. Many demand that
they be preventively banned outright, specifically because their
use would not be consistent with international humanitarian law (IHL).5
The 117 States parties to the UN Conventional Weapons Convention6 have
agreed to hold a four-day intergovernmental meeting this year to explore
questions related to lethal autonomous weapon systems with a view to po-
tentially drafting a Protocol VI to the Convention.7 Even the United States,
which is among the most technologically advanced States in this field, cur-
rently requires that such weapon systems “be designed to allow command-
ers and operators to exercise appropriate levels of judgment over the use of
3. Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Report, ¶
39, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013) (by Christof Heyns) [hereinafter Heyns].
4. PETER W. SINGER, WIRED FOR WAR: THE ROBOTICS REVOLUTION AND CON-
FLICT IN THE 21ST CENTURY 124–27 (2009).
5. HUMAN RIGHTS WATCH, LOSING HUMANITY: THE CASE AGAINST KILLER RO-
BOTS (2012), available at http://www.hrw.org/reports/2012/11/19/losing-humanity; Eu-
ropean Parliament Resolution on the Use of Armed Drones ¶ H.2(d) (2014/2567(RSP))
(Feb. 25, 2014); Peter Asaro, On Banning Autonomous Weapon Systems: Human Rights, Automa-
tion, and the Dehumanization of Lethal Decision-Making, 94 INTERNATIONAL REVIEW OF THE
RED CROSS 687 (2012).
6. Convention on Prohibitions or Restrictions on the Use of Certain Conventional
Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate
Effects, Oct. 10, 1980, 1342 U.N.T.S. 137.
7. Meeting of the High Contracting Parties to the Convention on Prohibitions or Re-
strictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be
Excessively Injurious or to Have Indiscriminate Effects, Geneva, Switz., Nov. 14–15,
2013, Final Report, ¶¶ 18, 32, U.N. Doc. CCW.MSP/2013/10 (Dec. 16, 2013), available at
http://daccess-ods.un.org/TMP/4361535.01272202.html.
force,”8 which means that it is not admissible for producers to program
machines that make final decisions as to targets against which force shall be
used.
It is perhaps because I have been confronted in actual armed conflicts
with so many violations committed by human beings, but inevitably never
with atrocities by robots9 (although admittedly, they did not exist in the
armed conflicts I witnessed), that my first feeling is not skepticism, but
hope for better respect of IHL.10 Only human beings can be inhuman and
only human beings can deliberately choose not to comply with the rules
they were instructed to follow. To me, it seems more reasonable to expect
(and to ensure) a person who devises and constructs an autonomous
weapon in a peaceful workplace to comply with IHL than a soldier on the
battlefield or in a hostile environment. A robot cannot hate, cannot fear,
cannot be hungry or tired and has no survival instinct.11 “Robots do not
rape.”12 They can sense more information simultaneously and process it
faster than a human being.13 As the weapons actually delivering kinetic
force become increasingly quicker and more complex, it may be that hu-
mans become simply too overwhelmed by information and the decisions
that must be taken to direct them.14 Human beings often kill others to
avoid being killed themselves. The robot can delay the use of force until
the last, most appropriate moment, when it has been established that the
target and the attack are legitimate. Certainly, there may be technical fail-
ures, but all those who drive cars and every traffic policeman know that
8. U.S. Department of Defense, DoD Directive 3000.09, Autonomy in Weapon Sys-
tems 3(a) (2012).
9. The terms “atrocity” or “violation,” like many other terms which describe human
behavior used in this article, are admittedly not appropriate for a machine.
10. For a detailed, powerful, but not always entirely convincing plea, see RONALD C.
ARKIN, GOVERNING LETHAL BEHAVIOR IN AUTONOMOUS ROBOTS (2009). For a positive
assessment and call for regulation, see Kenneth Anderson & Matthew Waxman, Law and
Ethics for Autonomous Weapons Systems: Why a Ban Won’t Work and How the Laws of War Can,
HOOVER INSTITUTION (Apr. 9, 2013), http://www.hoover.org/publications/monographs
/144241.
11. Ronald C. Arkin, Ethical Robots in Warfare, GEORGIA INSTITUTE OF TECHNOLOGY
(Jan. 20, 2009), http://www.cc.gatech.edu/ai/robot-lab/online-publications/arkin-rev.pdf;
Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Interim Report, ¶ 20
(by Philip Alston), transmitted by Note of the Secretary-General, U.N. Doc. A/65/321 (Aug. 23,
2010) [hereinafter Alston].
12. Heyns, supra note 3, ¶ 54.
13. SINGER, supra note 4, at 127.
14. Arkin, supra note 11; SINGER, supra note 4, at 128.
