INTRODUCTION: AUTOMATION AND AUTONOMY
I. MACHINE FUNCTIONING AND THE DIFFERENCE BETWEEN AUTONOMY AND AUTOMATION
   A. Understanding "the Loop": OODA and How Machines Work
      FIGURE 1--Boyd's OODA Loop
   B. Toward a Distinction Between "Autonomy" and "Automation"
      1. The First Attribute of Autonomy: Frequency of Operator Interaction
      2. The Second Attribute of Autonomy: Tolerance for Environmental Uncertainty
      3. The Third Attribute of Autonomy: Level of Assertiveness
   C. The Autonomy Spectrum
II. DRONE WARFARE: YESTERDAY, TODAY, AND TOMORROW
   A. Yesterday's Drones
   B. Today's Drones
      1. Aerial Drones
      2. Land-Bound Drones
      3. An Automated, Non-Autonomous Fleet
   C. Tomorrow's Drone
III. LAW AND ETHICS FOR AUTONOMOUS SYSTEMS
   A. Using OODA to Regulate Drones
   B. The Moral Limits of Drone Technology
CONCLUSION

INTRODUCTION: AUTOMATION AND AUTONOMY
Drones have revolutionized warfare. They may soon transform civilian life too. America's military efforts abroad are spearheaded by unmanned aerial vehicles--Predators, Reapers, Global Hawks--with capabilities once only dreamed of by science fiction writers. Drones are simply "the only game in town" to fight hard-to-find terrorists in the tribal regions of Afghanistan and Pakistan, according to former CIA Director Leon Panetta. (1) And drones have already been introduced over our domestic skies, patrolling the U.S.-Mexico border (2) and assisting with law enforcement efforts. (3) Congress has voted to accelerate this trend, directing the Federal Aviation Administration to rethink restrictions on the domestic use of drones by 2015. (4)
As amazing as today's drones may seem, they are just the "Model T" of robot technology. (5) Most are souped-up, remote-controlled airplanes; they still have a human pilot, but he or she now sits at a military base rather than in the cockpit. Today's drones do not think, decide, or act on their own. In engineering speak, they are merely "automated."
The drones of tomorrow are expected to leap from automation to "autonomy." Tomorrow's sophisticated machines will have the ability to execute missions without guidance from a human operator. They will increasingly be used alongside--as well as in the air above--people. Drones will augment civilian life: Some countries are experimenting with robotic prison guards (6) and in-home caregivers. (7) Robotic warehouse workers, ambulance and taxi drivers, and medical assistants are in the works, too. (8) Today's automated drones raise difficult policy questions, but those questions will seem pedestrian compared to the issues created by tomorrow's autonomous systems.
Regulations for the drones of today should be crafted with an eye toward the technologies of tomorrow. Policymakers must better understand how the next generation of autonomous drones will differ from the merely automated machines of today. The distinction between automation and autonomy is vital. Today, humans are still very much "in the loop." (9) Humans generally decide when to launch a drone, where it should fly, and whether it should take action against a suspect. As drones develop greater autonomy, however, humans will increasingly be "out of the loop." Human operators will not be necessary to decide when a drone (or perhaps a swarm of microscopic drones) takes off, where it goes, how it acts, what it looks for, and with whom it shares what it finds.
Today's debate about humans, autonomy, and "the loop" relies on language too imprecise to draw out and analyze the relevant differences between drones and predecessor technologies. What does it mean for a human to be "inside," "outside," or "on" the loop? And why does it matter? Further confusing the issues, the debate over "drones" covers a tremendous range of technologies, including not only those deployed today but also their even more sophisticated progeny of tomorrow.
It is difficult to agree on a definition of "autonomy" even when the word is applied to humans. (10) Autonomy has been variously described as "a submission to laws which one has made for oneself," (11) as the state achieved when one "is acting from principles that we would consent to as free and equal rational beings," (12) and as a "second-order capacity of persons to reflect critically upon their first-order preferences, desires, wishes, and so forth." (13) Various forms of autonomy appear in the Supreme Court's decisions on controversial and deeply divisive issues, including sex, (14) contraception and abortion, (15) and the existence and scope of a right to die. (16) "About the only features held constant from one author to another," one thinker despaired, "are that autonomy is a feature of persons and that it is a desirable quality to have." (17) Small wonder that confusion pervades discussions about when an advanced technological system is "autonomous," and what the implications of autonomy might be. The discussion is fraught with terms that are both loaded and vague. The stakeholders in debates about the appropriate uses of drones are varied, (18) the technology is developing rapidly, (19) and the resolution of the questions it raises will reverberate on a global scale. (20) Without a basic understanding of the technology driving the key issues and a common vocabulary to engage those issues, domestic and international policymakers risk speaking past one another and causing frustration, if not hostility. It is difficult to structure a thoughtful debate and design an effective regulatory regime without first understanding drone technology, including how it functions and how it differs from yesterday's weapons of war. But we must understand this technology because, as drone expert Peter Singer has observed, drones' "intelligence and autonomy is growing.... The law's not ready for all this." (21)
Language useful to the policy and lawmaking process has already been developed in the same places as drones themselves--research and engineering laboratories across the country and around the globe. We introduce this vocabulary here to explain how tomorrow's drones will differ from today's, outline the policy issues posed by the drones of tomorrow, and suggest possible approaches to regulation.
Part I of this Article offers a detailed discussion of how drones make decisions and explains the distinction between automation and autonomy. Part II tracks the development of drones past, present, and future, and demonstrates through examples how drones are expected to evolve from automated to autonomous machines. Part III highlights the difficult policy questions posed by tomorrow's drones, and explains how policymakers can use this Article's models of autonomy and of drone decision-making to craft smart, targeted regulations that protect both our security and our privacy. The Article then concludes.
Autonomy is no longer solely a feature of humans. Whether it is a desirable quality for machines to have will be one of the most important public policy debates of the next generation.
I. MACHINE FUNCTIONING AND THE DIFFERENCE BETWEEN AUTONOMY AND AUTOMATION
To explain autonomy and how it differs from automation, we first explore how machine decision-making processes operate. The "OODA Loop" (22) is an especially effective tool for understanding complex systems, including aerial drones carrying lethal payloads, for it offers a language shared by engineers, the military, and the public. (23) When commentators debate whether drone warfare leaves humans "in the loop," "out of the loop," or simply "on the loop," it is the OODA Loop they are talking about. (24)
Understanding "the Loop": OODA and How Machines Work
Why did American F-86 fighter planes get the better of Soviet MiG-15 jets during the Korean War? Air Force pilot and military strategist John Boyd's answer to this question transformed the military's approach to victory in battle. (25) Boyd's insight was that in a dogfight the advantage lay with the fighter pilot who could make faster and more accurate decisions than his opponent, and who was able to throw his opponent's decision-making "loop" out of sync. (26)
Boyd distilled human decision-making into a four-step process: Observe, Orient, Decide, Act. (27) In Boyd's "OODA Loop," a person first observes the world around her, gathering data about her environment through the array of human senses. (28) Second, she orients herself, or interprets the information she has gathered. (29) Third, she weighs the potential courses of action based on the knowledge she has accumulated and decides how to act. (30) Fourth and finally, she acts, or executes the decision she has made. (31)
Boyd's elegant theory is still used by the military today. (32) It has gained purchase in other fields too, including business, sports, and engineering--basically "anywhere a competitor seeks an edge." (33) Engineers have borrowed the concept to illustrate the way machine systems operate, make decisions, and interact with the world. (34) For example, Thomas Sheridan, an engineer and a leading scholar of autonomy and robotics, suggests a four-stage information-processing model that tracks OODA: (1) Information Acquisition; (2) Information Analysis; (3) Decision Selection; and (4) Action Implementation. (35)
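For readers who think in code, the four-stage cycle can be rendered as a minimal sketch. This is an illustrative toy, not any real drone control system: the sensor reading, the ten-meter threshold, and the function names are all hypothetical assumptions chosen to show how one pass through Observe, Orient, Decide, and Act might be composed in software.

```python
# A minimal, hypothetical sketch of one pass through Boyd's OODA Loop,
# applied to a simple obstacle-avoidance scenario. All names and values
# (obstacle_distance, the 10-meter threshold) are illustrative assumptions.

def observe(environment: dict) -> dict:
    """Observe: gather raw data from the environment (a sensor reading)."""
    return {"obstacle_distance": environment.get("obstacle_distance", float("inf"))}

def orient(observation: dict) -> dict:
    """Orient: interpret the raw data in context."""
    return {"obstacle_near": observation["obstacle_distance"] < 10.0}

def decide(assessment: dict) -> str:
    """Decide: weigh courses of action and select one."""
    return "turn" if assessment["obstacle_near"] else "continue"

def act(decision: str) -> str:
    """Act: execute the selected course of action."""
    return f"executing: {decision}"

def ooda_step(environment: dict) -> str:
    """Run one full pass through the four stages."""
    return act(decide(orient(observe(environment))))

print(ooda_step({"obstacle_distance": 5.0}))   # obstacle near -> turn
print(ooda_step({"obstacle_distance": 50.0}))  # clear path -> continue
```

The composition `act(decide(orient(observe(...))))` mirrors the linear reading of the loop; as the next paragraph notes, real systems also feed information back between stages, which this sketch deliberately omits.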
The OODA Loop is not flawless. Even its proponents admit that the four-stage model is a "gross oversimplification" of both human and robot information processing, in part because the four stages overlap in time. (36) The "loop" is not a clean linear process because it includes constant feedback and integration among the different stages. (37) Still, the OODA Loop offers a useful lens for understanding system design. (38) It also allows a relatively straightforward comparison of systems based upon their technological capabilities. (39)
In all its complexity, the OODA Loop appears as follows:
[FIGURE 1 OMITTED]
To see how the OODA Loop works in practice, begin with a human being--call him Dave (41) --who is walking down a road and encounters a large boulder blocking...