The Automated Administrative State: A Crisis of Legitimacy

Ryan Calo*
Danielle Keats Citron**

Publication year: 2021

The legitimacy of the administrative state is premised on our faith in agency expertise. Despite their extra-constitutional structure, administrative agencies have long stood on firm footing in recognition of their critical role in governing a complex, evolving society. They are delegated enormous power because they respond expertly and nimbly to evolving conditions.

In recent decades, state and federal agencies have embraced a novel mode of operation: automation. Agencies rely more and more on software and algorithms in carrying out their delegated responsibilities. The automated administrative state, however, is demonstrably riddled with concerns. Legal challenges regarding the denial of benefits and rights—from travel to disability—have revealed a pernicious pattern of bizarre and unintelligible outcomes.

Scholarship to date has explored the pitfalls of automation with a particular frame, asking how we might ensure that automation honors existing legal commitments such as due process. Missing from the conversation are broader, structural critiques of the legitimacy of agencies that automate. Automation abdicates the expertise and nimbleness that justify the administrative state, undermining the very case for the existence and authority of agencies.

Yet the answer is not to deny agencies access to technology that other twenty-first century institutions rely upon. This Article points toward a positive vision of the administrative state that adopts tools only when they enhance, rather than undermine, the underpinnings of agency legitimacy.

Introduction
I. Replacing Values Compromised
II. Justifying the Administrative State
     A. Responding to Agency Skepticism: Governance in a Complex World
     B. Deference to Algorithms?
III. The Looming Legitimacy Crisis
     A. Lessons from Litigation
     B. Undermining Functionalism
IV. Toward a New Vision of the Administrative State
Conclusion

Introduction

In 2016, the Arkansas Department of Human Services decided to make a change.1 Rather than having a nurse visit disabled residents at home to assess their care needs, the agency hired a software company to build an algorithm that would automate the determination.2 The agency hoped to save money.3 Instead, administrators found themselves in federal court.4

Arkansas' new system proved cruel and illogical. The Kafkaesque system decreased the home care of an amputee because he had no "foot problems."5 As a result of the automated system's dysfunction, severely disabled Medicaid recipients were left alone without access to food, the toilet, and medicine for hours on end.6 Nearly half of Arkansas Medicaid recipients were negatively affected.7 Obtaining relief from the software-based outcome was all but impossible.8

A federal court enjoined the state agency from using the automated system after a damning narrative emerged. Agency officials admitted they did not know how the system worked.9 The authors of the algorithm and the software vendors were similarly unable, or unwilling, to provide an explanation.10 On cross-examination in open court, the agency and its partners admitted not only that they failed to detect the errors that the litigants uncovered, but also that in many instances they lacked the expertise necessary to do so.11

Administrative agencies are a constitutional anomaly. They are permitted to exist, we are told, because the world is complicated and requires expertise and discretion beyond the capacity of legislatures.12 And yet more and more agency officials are admitting—sometimes in open court—that they possess neither. Agencies are invested with governing authority (over the objections of many) due to their unique capabilities and knowledge, and now they are turning that authority over to machines.

Since the turn of the millennium, inadequately resourced federal and state agencies have turned to automation for a variety of reasons, most notably to contain costs.13 A little over a decade ago, the problems associated with automating public-benefits determinations came into view.14 In the public benefits arena, programmers embedded erroneous rules into the systems, more often by mistake or inattention than by malice or intent.15 Systems cut, denied, or terminated individuals' benefits without explanation in violation of due process guarantees.16

Challenging automated decisions was difficult because systems lacked audit trails that could help excavate the reason behind the decisions.17 Judicial review had limited value in light of the strong psychological tendency to defer to a computer's findings.18 These problems affected hundreds of thousands of people (often the most vulnerable), wasted hundreds of millions of dollars, and produced expensive litigation.19 Agencies spent millions to purchase automated systems.20 And they spent millions more to fix the problems those systems created.21

Despite these concerns, agencies have continued to adopt—often via third-party vendors—automated systems that defy explanation even by their creators. New York officials are still using the defective algorithm litigated in Arkansas despite its clear deficiencies.22 Idaho's health and welfare agency commissioned its own budget software tool to allocate the number of hours of home care for disabled Medicaid recipients.23 That algorithmic tool also drastically cut individuals' home care hours without meaningful explanation and faced challenge in court.24

The pattern is hardly limited to health administration. State agencies have deployed algorithms and software to evaluate public school teachers in Texas, to assess and terminate unemployment benefits in Michigan, and to evaluate the risks posed by criminal defendants in D.C., Wisconsin, and elsewhere.25

Nor is the pattern limited to the states. The Department of Homeland Security has long deployed an algorithmic system—the so-called No-Fly List—to try to prevent terrorists from traveling.26 This data-matching program has misidentified many individuals, in part because it relies on crude algorithms that cannot distinguish between similar names.27 Thousands of people have been caught in the dragnet, including government officials, military veterans, and toddlers.28 The U.S. government would not say whether a person was on the list and provided no explanation for no-fly decisions.29

An increasingly wide variety of federal agencies leverages algorithms and automation in carrying out their statutorily committed duties. The IRS, SEC, USPS, and myriad other federal agencies are using machines in one manner or another.30 A recent report shows that nearly half of all agencies use, or are investigating the use of, artificial intelligence.31 Just last year, an Executive Order directed all federal agencies to explore the potential efficiencies of artificial intelligence.32

Agencies are listening. A January 2019 request for proposals from the U.S. Department of Health and Human Services sought a contract to coordinate artificial intelligence procurement, describing the contract as "the next logical step to integrating [intelligence automation and artificial intelligence] into all phases of government operations."33

The turn toward automation in recent decades has not gone unchallenged. Scholars have repeatedly pushed back against governmental use of software and algorithms to arrive at decisions and goals previously carried out by people. "The human race's rapid development of computer technology," observed Paul Schwartz thirty years ago in a related context, "has not been matched by a requisite growth in the ability to control these new machines."34 In 2008, one of us (Citron) offered an extensive framework for evaluating and responding to agency reliance on technology.35 In recent years this discourse has burgeoned into a full-blown literature spanning multiple disciplines.36

Yet the challenges posed to the automated administrative state to date tend to proceed from a very specific frame: the problem of automation arises when a machine has taken over a task previously committed to a human such that guarantees of transparency, accountability, and due process fall away.37 This frame follows a tendency in law and technology generally to focus on how machines that substitute for humans undermine certain legally protected values or rights. The discussion of how best to restore due process in light of computer-driven decision-making is an example.38 The debate around liability for driverless cars is another.39

The 2017 article Accountable Algorithms is illustrative of the literature.40 "Important decisions that were historically made by people are now made by computer systems," the authors write, and "accountability mechanisms and legal standards that govern decision processes have not kept pace with technology."41 In other words, many consequential government decisions were once made by people, attended by accountability mechanisms suited to people. Now that machines make these decisions, law or technology must change to restore the rights and values afforded individuals under the previous arrangement. The authors suggest legal and technical...
