THE SOLUTION TO THE PERVASIVE BIAS AND
DISCRIMINATION IN THE CRIMINAL JUSTICE SYSTEM:
TRANSPARENT AND FAIR ARTIFICIAL INTELLIGENCE
Mirko Bagaric*, Jennifer Svilar**, Melissa Bull***, Dan Hunter****, and
Nigel Stobbs*****
ABSTRACT
Algorithms are increasingly used in the criminal justice system for a range of important matters, including determining the sentence that should be imposed on offenders, whether offenders should be released early from prison, and the locations where police should patrol. The use of algorithms in this domain has been severely criticized on a number of grounds, including that they are inaccurate and discriminate against minority groups. Yet algorithms are used widely in relation to many other social endeavors, including flying planes and assessing eligibility for loans and insurance. In fact, most people regularly use algorithms in their day-to-day lives: Google Maps is an algorithm, as are Siri, weather forecasts, and automatic pilots. The criminal justice system is one of the few human activities that has not substantially embraced their use. This Article explains why the criticisms leveled against the use of algorithms in the criminal justice domain are flawed. The manner in which algorithms operate is generally misunderstood. Algorithms are not autonomous machine applications or processes; they are developed and programmed by people, and their efficacy is determined by the quality of the design process. Intelligently designed algorithms can replicate human cognitive processing, but with a number of advantages, including the speed at which they process information. And because they do not have feelings, they are more objective and predictable than people in their decision-making. They are a core component of overcoming the pervasive bias and discrimination that exists in the criminal justice system.
* Dean of Law, Swinburne Law School, Melbourne. © 2021, Mirko Bagaric, Jennifer Svilar, Melissa Bull, Dan Hunter, and Nigel Stobbs.
** J.D., University of Tennessee College of Law. Jennifer is an attorney in the Commercial Litigation Group at Butler Snow LLP.
*** Professor, Law School, Queensland University of Technology.
**** Dean, Faculty of Law, Queensland University of Technology.
***** Senior Lecturer, Law School, Queensland University of Technology.

INTRODUCTION
I. CURRENT PROBLEMS WITH HUMAN DECISION-MAKING IN THE CRIMINAL JUSTICE SYSTEM
II. THE NATURE OF ARTIFICIAL INTELLIGENCE
   A. Overview of Algorithms and Artificial Intelligence
   B. Google Maps, Weather Forecasts, and Automatic Piloting
      1. Google Maps
      2. Weather Forecasting
      3. Automatic Piloting
III. CURRENT USE OF ALGORITHMS AND AI IN THE CRIMINAL JUSTICE SYSTEM
   A. Policing and Detection of Crime
      1. Predictive Policing
      2. Automated Visual Monitoring
   B. Bail
   C. Sentencing
   D. Parole
IV. CRITICISMS OF ALGORITHMS AND AI IN THE CRIMINAL JUSTICE SYSTEM AND RESPONSES TO THE CRITICISMS
   A. Policing and Detection of Crime
      1. Validity and Accuracy Concerns
      2. Privacy and Liberty Concerns
   B. Bail
   C. Sentencing
   D. Parole and Probation
V. REFORM RECOMMENDATIONS
   A. Transparency and Accountability
   B. Algorithmic Fairness
   C. Enhanced Predictability and Consistency
CONCLUSION
INTRODUCTION
Algorithms are increasingly being used in the criminal justice system. They are used by police to predict where the next crime will happen and by courts to determine whether a defendant will commit a future crime. Pretrial risk assessment algorithms used in bail proceedings evaluate whether a criminal defendant poses a threat to public safety or is likely to fail to appear in court.1 After adjudication, algorithms are relied on to predict whether a defendant will recidivate, an important consideration in sentencing determinations and parole decisions.2 Risk scores provided by these tools may also be used during incarceration to determine what level of security a prisoner requires.3 In all of these scenarios, risk assessment tools are used because they provide an objectivity and efficiency that humans cannot,4 and proponents believe that bringing objective regularity to the criminal justice system will lead to more accountability and transparency for decisions that impact people's lives.5
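To make concrete what such a tool computes, the following is a minimal sketch, in Python, of a points-based risk instrument. Every factor, weight, and cut-off here is a hypothetical illustration rather than the scheme of any deployed tool, but the structure (a fixed, inspectable formula mapping a defendant's attributes to a risk category) is representative of actuarial instruments of this kind.

```python
# Minimal sketch of a points-based pretrial risk score.
# All factors, weights, and cut-offs are hypothetical illustrations,
# not those of any deployed instrument.

from dataclasses import dataclass

@dataclass
class Defendant:
    age: int
    prior_convictions: int
    prior_failures_to_appear: int
    pending_charge: bool

def risk_points(d: Defendant) -> int:
    """Sum weighted factors into a raw risk score."""
    points = 0
    points += 2 if d.age < 23 else 0           # youth weighted as a risk factor
    points += min(d.prior_convictions, 3)      # capped so one factor cannot dominate
    points += 2 * d.prior_failures_to_appear   # past non-appearance weighted heavily
    points += 1 if d.pending_charge else 0
    return points

def risk_category(points: int) -> str:
    """Map the raw score onto the categories a decision-maker sees."""
    if points <= 2:
        return "low"
    return "moderate" if points <= 5 else "high"

d = Defendant(age=21, prior_convictions=1, prior_failures_to_appear=1, pending_charge=False)
print(risk_category(risk_points(d)))  # -> "moderate"
```

Because the formula is fixed and visible, every output can be traced back to its inputs, which is the property on which this Article's later transparency recommendations rest.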
Today, more than sixty kinds of risk assessment tools are being used in the criminal justice system,6 a product of the broader shift toward evidence-based practices.7 These systems, in theory, could lead to more uniformity in criminal sentencing, as "[m]odels and algorithms can process and review vast amounts of data in order to identify and trace these factors and signify their weight and force in causing criminal behavior."8 Some risk assessment tools rely on machine learning algorithms, which allow the machines to train themselves on how best to process the data.9
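What it means for a tool to "train itself" can be shown in a few lines. The sketch below is an assumption-laden illustration: it uses Python with numpy and scikit-learn, and the historical records are randomly generated placeholders rather than real case data. The point is structural: the model's weights are estimated from the data, not written by a programmer.

```python
# Sketch of a machine-learning risk tool: the weights are not
# programmed by hand but estimated from historical records.
# The data below are randomly generated placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical historical records: columns = [age, prior_convictions,
# prior_failures_to_appear]; label = 1 if the person was later rearrested.
X = rng.integers(low=[18, 0, 0], high=[70, 10, 5], size=(500, 3))
y = rng.integers(0, 2, size=500)  # placeholder outcomes

model = LogisticRegression().fit(X, y)  # the "training itself" step

# The fitted model now scores a new defendant using weights no one wrote down.
new_defendant = np.array([[21, 1, 1]])
print(model.predict_proba(new_defendant)[0, 1])  # estimated rearrest probability
```

Because the weights are estimated rather than specified, the designers control the inputs and the training objective but not the exact relationships the model ends up encoding, which is the source of the "black box" concern discussed next.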
The use of algorithms in the criminal justice domain is, however, heavily criticized. The fact that these systems can rewrite their own processing structure indicates that they might rely on assumptions about relationships between different categories of data that may remain hidden even to the systems' designers, creating a "black box."10 Another key criticism is that they discriminate against already disadvantaged groups. A new coalition has formed11 to argue that algorithms are "neither accurate nor objective, and the burden of these shortcomings is disproportionately borne by historically marginalized groups . . . ."12 A 2018 survey showed that nearly three in five Americans believe algorithms make bias inevitable.13 According to the same survey, 56% of Americans disapproved of using an algorithm in decisions regarding parole.14
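The discrimination criticism is, at bottom, a claim about proxies: even when a protected attribute is withheld from a model, correlated inputs can carry it back in. The sketch below makes that mechanism visible using deliberately fabricated data (no empirical claim is intended). The group label is never given to the model, yet because a neighborhood variable and historical arrest counts are constructed to correlate with it, the model's risk scores differ by group.

```python
# Illustration of the proxy problem: the protected attribute is
# withheld from the model, but correlated inputs leak it.
# All data are synthetic and biased by construction; nothing here
# is an empirical result.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

group = rng.integers(0, 2, size=n)            # protected attribute (never an input)
neighborhood = np.where(                      # proxy: matches group 80% of the time
    rng.random(n) < 0.8, group, 1 - group)
arrests = rng.binomial(3, 0.2 + 0.3 * group)  # skewed historical enforcement

# Outcome labels generated with a group skew, mimicking biased records.
y = (rng.random(n) < 0.2 + 0.25 * group).astype(int)

X = np.column_stack([neighborhood, arrests])  # model inputs: no group column
model = LogisticRegression().fit(X, y)

# Average risk score by the hidden group: the gap survives the omission.
scores = model.predict_proba(X)[:, 1]
print(scores[group == 0].mean(), scores[group == 1].mean())
```

An audit of this kind is possible precisely because an algorithm's inputs and scores can be inspected and compared across groups, a point the reform recommendations in Part V take up.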
This is a remarkable observation given that it has been established that Black Americans have for decades been discriminated against by human decision-makers in the form of police officers, prosecutors, and judges.15 Human decision-making
1. Sarah Brayne & Angèle Christin, Technologies of Crime Prediction: The Reception of Algorithms in Policing and Criminal Courts, 68 SOC. PROBS. 608, 611 (2021).
2. Id.
3. Id.
4. Id. at 615.
5. Matt Henry, Risk Assessment: Explained, APPEAL (Mar. 25, 2019), https://theappeal.org/risk-assessment-explained/.
6. Kia Rahnama, Science and Ethics of Algorithms in the Courtroom, 2019 J.L. TECH. & POL'Y 170, 174 (citing Alyssa M. Carlson, The Need for Transparency in the Age of Predictive Sentencing Algorithms, 103 IOWA L. REV. 303, 309 (2017)).
7. Id. at 173 (citing Carlson, supra note 6, at 305).
8. Id. (citing Carlson, supra note 6, at 309).
9. Id. at 174.
10. Id. (citing Brent Daniel Mittelstadt, Patrick Allo, Mariarosaria Taddeo, Sandra Wachter & Luciano Floridi, The Ethics of Algorithms: Mapping the Debate, BIG DATA & SOC'Y, July–Dec. 2016, at 1, 6).
11. According to Barabas, there is "an influential community of researchers from both academia and industry who have formed a new regulatory science under the rubric of 'fair, accountable, and transparent algorithms' ('FAccT algorithms')." Chelsea Barabas, Beyond Bias: Re-Imagining the Terms of Ethical AI in Criminal Law, 12 GEO. J.L. & MOD. CRITICAL RACE PERSP. 83, 96 (2020).
12. Id.
13. Jens Ludwig & Cass R. Sunstein, Discrimination in the Age of Algorithms, BOS. GLOBE (Sept. 24, 2019, 5:00 AM), https://www.bostonglobe.com/opinion/2019/09/24/discrimination-age-algorithms/mfWUxRH8Odm6IRo3PZRLdI/story.html (citing Aaron Smith, Public Attitudes Toward Computer Algorithms, PEW RSCH. CTR. (Nov. 16, 2018), https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2018/11/PI_2018.11.19_algorithms_FINAL.pdf).
14. Smith, supra note 13, at 3.
15. See infra Part I.