IMPOVERISHED ALGORITHMS: MISGUIDED GOVERNMENTS, FLAWED TECHNOLOGIES, AND SOCIAL CONTROL

Author: Valentine, Sarah

TABLE OF CONTENTS

Introduction
I. New Tools Built on Past Prejudice
  A. Government Malfeasance
  B. Problematic Analytics
  C. Inaccurate and Discriminatory Data
II. Forces Driving Government Adoption of Big Data Analytics
  A. Deference to Science and Technology
  B. Historical Success of Technology as Domestic Social Control Mechanism
  C. Private Sector Profitability
III. Toward Government Accountability
  A. Litigation
  B. Regulation
  C. Activism and Organizing
Conclusion

INTRODUCTION

Governments at all levels, from the local to the federal, are increasing their reliance on algorithmic decision-making technologies. (1) Helpful as algorithms may be, they inevitably target marginalized populations and exacerbate the social stratification and vast inequality that already exists in our society. Simply put, algorithms are mathematical processes for solving defined problems. (2) Algorithmic decision-making technologies encompass a wide variety of big-data analytic systems, (3) including predictive analytics (4) and machine learning. (5) As these technologies grow more and more sophisticated, human decision-making in the areas of criminal justice, public benefits, and child welfare is rapidly being replaced by technologies that few understand and many in positions of power mistakenly believe are infallible. (6) When deployed to control and contain vulnerable populations, these systems dehumanize the people they target and impoverish standards of due process and justice.

Public awareness of the potential dangers that arise from misuse of big data is increasing, (7) and legal academia has begun to grapple with how this technology upends civil rights and privacy. (8) However, current discussions tend to elide how these algorithmic technologies increasingly become tools of social control, used to maintain rigid and historical demarcations of class and race. For example, while police officers have long had the ability to use their own judgment to decide if there is the articulable suspicion necessary to stop and frisk someone, (9) today that articulable suspicion may be guided by an algorithm that neither the police officer nor anyone else in the police department understands or can explain. (10) Similarly, while caseworkers have always had to make decisions about whether or not a family qualifies for public benefits or whether there is sufficient risk of harm to remove a child, (11) now those decisions are guided and sometimes determined by opaque and inexplicable predictive analytics. (12)

Although individual decisions made by police officers or caseworkers can be biased or wrong, those decisions are traceable to an individual actor in particular circumstances. Individual decisions can be disputed in court, with those affected able to challenge the circumstances or evidence the police officer or caseworker relied on. Big data analytics is altering how these kinds of governmental decisions are made, and this, in turn, weakens the ability of those harmed to challenge those decisions effectively. Big data systems are often touted as more cost-efficient and objective methods of governmental decision-making concerning vulnerable populations. (13) However, this focus on efficiency glorifies savings over proper services, and the belief that hyper-surveillance and predictive analytics can solve deep issues of bias and discrimination is misguided at best.

We live in a country that consigns a large part of its population to an underclass, a permanently marginalized group contained, controlled, and criminalized purportedly for the protection of everyone else. (14) Over the past several decades, government has coercively leveraged the welfare, foster care, prison, and deportation systems to control residents of neighborhoods devastated by the systemic withdrawal of public resources. (15) When these vulnerable populations seek assistance, the state they encounter not only often fails to support them, but it also actively targets them with punitive social control mechanisms. (16) The criminal justice and social welfare systems are now fused to better control and contain marginalized populations such as the poor, the disabled, and communities of color. (17)

What happens when government introduces algorithmic decision-making systems into an already repressive environment? It increases its capacity to dominate vulnerable communities by making it almost impossible to challenge system errors. (18) It reinforces historical discrimination by relying on inaccurate and biased data. (19) It further destroys our country's already meager social safety net by ceding more regulatory power to private companies whose focus is profit. (20) Most dangerously, it allows governments to hide these negative effects behind the veneer of technological infallibility.

Technology is not neutral, and governmental reliance on big data analytics has the capacity to further erode fundamental relationships between the governing and the governed. (21) Unchecked governmental use of algorithms as social control mechanisms (22) is dangerous to many of our core democratic beliefs about due process and equality, especially when the technology is used to target already marginalized populations. Governmental adoption of these technologies is inherently political, not only because it impacts the use of governmental resources, but also because it reinforces some of the worst aspects of our current justice system. Big data analytics provides the state a degree of control over marginalized populations that is unrivaled in American history. (23) To confront the increasingly authoritarian application of big data analytics, progressive lawyers, policymakers, and advocates must not only understand the technology and how it reinforces oppression, but also must engage with the socioeconomic forces that drive governments to adopt technological systems of social control.

The layering of algorithms on top of the already complex social structures underpinning our problematic justice system may seem like a significant shift in practice to those of us not steeped in technology. However, no matter how complex the mathematics, these systems are less revolutionary than they are the logical evolution of past strategies that governments have used to control marginalized people. (24) Thus, many of the tools used to challenge governmental overreach in the past provide the foundations with which to oppose big data analytics--algorithms of social control themselves, both impoverished and impoverishing.

This Article discusses various aspects of how and why governments use algorithmic decision-making systems as a mechanism of social control. It also explores potential avenues of resistance before government reliance on these systems becomes unassailable. Part I addresses issues of governmental malfeasance in implementing big data technologies and discusses how the systemic flaws in the deployment of seemingly "objective" tools can do harm to vulnerable populations. Part II discusses some of the prominent sociopolitical factors driving governmental adoption of big data technologies.

Finally, Part III considers how litigation, regulation, and political activism can be combined to address the harms caused by governmental deployment of these systems.

  I. NEW TOOLS BUILT ON PAST PREJUDICE

    Governments have always relied on the surveillance technology of the day. (25) Historically, the burdens of surveillance have fallen hardest on poor and marginalized populations. (26) Still, the wholesale adoption of big data analytics is unique, (27) even as it shares troubling historical roots with government use of other technologies. Unprecedented levels of public and private surveillance (28) have created what is commonly called "big data," (29) an almost unfathomable amount of searchable and sharable data on every individual and community in the country.

    The sheer amount of collected data from hyper-surveillance is stunning, but it is the analytics that weaponizes this information, allowing governments to "profile, police, and punish the poor." (30)

    Advances in computational science have created the ability to capture, collect, and combine everyone's digital trails and analyze them in ever-finer detail. (31) It is this merging of big data with advanced analytics that facilitates social control through algorithmic decision-making systems. As political scientist, data scholar, and activist Virginia Eubanks explains:

    Forty years ago, nearly all of the major decisions that shape our lives--whether or not we are offered employment, a mortgage, insurance, credit, or a government service--were made by human beings. They often used an actuarial process that made them think more like computers than people, but human discretion still ruled the day. Today, we have ceded much of that decision-making power to sophisticated machines. Automated eligibility systems, ranking algorithms, and predictive risk models control which neighborhoods get policed, which families attain needed resources, who is short-listed for employment, and who is investigated for fraud. (32)

    But algorithmic decision-making, built on imperfect science and implemented using terribly flawed data sets, is generally hidden, opaque, and unknowable--thus often unchallengeable, making it a stealth weapon of social control governments find hard to resist. (33) Limiting the harmful effects of big data analytics requires advocates to recognize the political implications of these systems. It also requires an understanding of just how much can, and does, go wrong with algorithmic decision-making technology. From government malfeasance in adopting and implementing the technology, to the problematic analytics and inaccurate data inherent in their design, the flaws are serious, systemic, and most often ignored.

    A. Government Malfeasance

      This section explores the ways in which government administrators...
