Layered Opacity: Criminal Legal Technology Exacerbates Disparate Impact Cycles and Prevents Trust

Ben Winters*

* Ben Winters is Counsel at the Electronic Privacy Information Center (EPIC). Research supporting this piece was largely supported by the Philip M. Stern Foundation through an Equal Justice Works Fellowship completed at EPIC from 2019-2021. This piece reflects developments in the law through 2020. © 2021, Ben Winters.
INTRODUCTION
I. THE TOOLS
   A. Predictive Policing
      1. Case Study: Chicago Police Department
   B. Risk Assessment Tools
      1. Case Study: Idaho Department of Corrections
   C. Issues Common to All of These Tools
      1. Bias
      2. Accuracy, Transparency & Training Concerns
II. THE OPACITY PROBLEMS COMMON TO THE TOOLS, AND WHY THEY ERODE TRUST
III. THE CYCLE: HOW THESE TWO MAIN TYPES OF TOOLS ARE LAYERED AND INTERCONNECTED, AND WHY THEY MUST BE TREATED THIS WAY
IV. AUTOMATED DECISION-MAKING AND GOVERNANCE OF THESE TOOLS
CONCLUSION
INTRODUCTION
Disparate impacts based on race, gender, socioeconomic status, and ethnicity from algorithmic Statistical Analysis Tools ("tools") manifest in, among other contexts, day-to-day law enforcement (police forces patrolling, stopping people, making arrests, and maintaining a presence) and the Risk Assessment Tools employed throughout the broader criminal legal cycle in the United States.

In law enforcement, tools such as Predictive Policing and ShotSpotter inform the allocation of resources in patterns that perpetuate rather than confront bias. Tools such as Facial Recognition, Automated License Plate Readers, and both public and private surveillance systems exacerbate those effects. As a result, individuals in over-policed areas continue to be stigmatized, targeted, and arrested at higher rates. Predictive policing tools obfuscate the distinction between arrest data and offense data and create a multi-faceted self-fulfilling prophecy, as the sketch below illustrates. The results of Risk Assessment Tools are inextricably linked to this trend: the inputs to these tools include neighborhood arrest data and socioeconomic figures, in addition to other proxies for protected classes, and their outputs can lead to harsher treatment for an accused individual at every stage.
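To make that self-fulfilling prophecy concrete, consider a deliberately simplified simulation. Everything in it is hypothetical: the two areas, the equal offense rates, and the greedy patrol-allocation rule are invented for illustration and do not model any deployed system. The point is only the dynamic the text describes: when patrols follow recorded arrests, and arrests can only be recorded where patrols go, an initial disparity never corrects, and the arrest data drifts arbitrarily far from the offense data.

```python
import random

random.seed(0)  # reproducible illustration

# Two areas with IDENTICAL true offense rates (purely hypothetical numbers).
offense_rate = {"A": 0.10, "B": 0.10}
# Historical arrest counts: area A starts out slightly over-policed.
arrests = {"A": 6, "B": 4}

for day in range(5000):
    # "Predictive" allocation: patrol wherever recorded arrests are highest.
    target = max(arrests, key=arrests.get)
    # An arrest can only be recorded where the patrol actually is, so the
    # data measures police presence at least as much as it measures offending.
    if random.random() < offense_rate[target]:
        arrests[target] += 1

print(arrests)  # e.g. {'A': 509, 'B': 4}: equal offending, wildly unequal records
```

Feeding the resulting counts back into the next round of allocation is what turns a two-arrest head start into a permanent, compounding disparity between what the data records and what actually occurred.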
The developers of these tools conceal the inner workings of their programs, often embracing over-broad trade secret protections and the culture of opacity in technology and government. The "black box" remains hidden not only from the public but often from the very agencies employing the tools.1 This opacity diminishes accountability, transparency, trust, and the exercise of a complete criminal defense, to the particular detriment of defendants in protected classes. It also embraces and encodes, rather than confronts, the reasons these biased effects exist. Advocates have worked, with some success, to achieve the most basic levels of transparency regarding the systems used in a given jurisdiction, but the burden should fall on the users and purchasers of these technologies to articulate and publish the purpose of the tools, make them transparent, and evaluate the propriety of their use.
The specific concerns, and the solutions to ameliorate their negative effects, vary from tool to tool, but both types of tools operate in, reflect, and support the same biased system. Both predictive policing tools and risk assessment tools effectively criminalize poverty as a result of the factors they use to make the determinations they aim to make.2
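A similarly stripped-down sketch shows how the factors just described can operate inside a risk score. The fields and weights below are invented for illustration; they do not reflect COMPAS or any actual instrument. The mechanism to notice is that neighborhood arrest rates (inflated by the patrol feedback loop sketched above) and employment status let the score penalize where someone lives and how poor they are, independent of their own conduct.

```python
from dataclasses import dataclass

@dataclass
class Defendant:
    prior_arrests: int
    neighborhood_arrest_rate: float  # arrests per 1,000 residents (hypothetical input)
    unemployed: bool

def risk_score(d: Defendant) -> float:
    """Toy linear score; all weights are invented for illustration."""
    score = 2.0 * d.prior_arrests
    # Living in an over-policed area raises the score regardless of conduct.
    score += 0.5 * d.neighborhood_arrest_rate
    # Socioeconomic status enters directly, effectively penalizing poverty.
    score += 3.0 if d.unemployed else 0.0
    return score

# Identical individual conduct, different neighborhoods and means:
over_policed = Defendant(prior_arrests=1, neighborhood_arrest_rate=40.0, unemployed=True)
elsewhere = Defendant(prior_arrests=1, neighborhood_arrest_rate=8.0, unemployed=False)
print(risk_score(over_policed), risk_score(elsewhere))  # 25.0 6.0
```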
As research using both types of tools continues to show these disparate impacts, scholarship on algorithmic accountability and governance has ballooned, forcing
1. See, e.g., State v. Loomis, 881 N.W.2d 749, 761 (Wis. 2016) ("Northpointe, Inc., the developer of COMPAS, considers COMPAS a proprietary instrument and a trade secret. Accordingly, it does not disclose how the risk scores are determined or how the factors are weighed. . . . Thus, to the extent that Loomis's risk assessment is based upon his answers to questions and publicly available data about his criminal history, Loomis had the opportunity to verify that the questions and answers listed on the COMPAS report were accurate. Additionally, this is not a situation in which portions of a PSI are considered by the circuit court, but not released to the defendant. The circuit court and Loomis had access to the same copy of the risk assessment.").
2. This piece uses predictive policing systems and risk assessment tools as two exemplars used by government actors to illustrate the degree of interconnectedness. Many of the same principles discussed here are transferable and applicable to other technologies that vary greatly in who uses them, who develops them, their purpose, and use procedures. See also TAWANA PETTY, MARIELLA SABA, TAMIKA LEWIS, SEETA PEÑA GANGADHARAN & VIRGINIA EUBANKS, OUR DATA BODIES: RECLAIMING OUR DATA INTERIM REPORT (2018). In Our Data Bodies, the authors feature excerpts from interviews describing the cycle of injustice:

The collection, storage, sharing, and analysis of data as part of a looping cycle of injustice that results in diversion from shared public resources, surveillance of families and communities, and violations of basic human rights. Connected to the experience of power and powerlessness, the theme of "set-up" concerns how data collection and data-driven systems often purport to help but neglect and fail Angelinos. Interviewees described these set-ups as "traps" or moments in their lives of being forced or cornered into making decisions where human rights and needs are on a chopping board. When using social services to meet basic needs or expecting that a 9-1-1 call in an emergency will bring health and/or safety support into their homes or communities, our interviewees spoke about systems that confuse, stigmatize, divert, repel, or harm. These systems—or the data they require—give people the impression of helping, but they achieve the opposite. They ask or collect, but rarely give, and that leads to mistrust, disengagement, or avoidance. Furthermore, systems perpetuate violent cycles when they are designed to harm, criminalize, maintain forced engagement. Id. at 20.