THE MISSING ALGORITHM: SAFEGUARDING BRADY AGAINST THE RISE OF TRADE SECRECY IN POLICING

Author: Deborah Won

TABLE OF CONTENTS

Introduction
I. Law Enforcement Algorithmic Systems and Trade Secrecy
   A. What Are Algorithmic Systems?
   B. What Are Trade Secrets and What Can They Shield from Disclosure?
II. When Due Process and Trade Secrets Clash
   A. The Brady Doctrine: Human Witnesses and Their Algorithmic Counterparts
      1. The Rise of Trade Secrecy in Policing
      2. The Need to Treat Algorithmic and Human Sources Alike
   B. Additional Forms of Resistance to Brady Disclosure of Algorithmic Systems
      1. Prosecutor's Knowledge or Possession
      2. Material Either to Guilt or to Punishment
III. The Missing Algorithm Instruction: An At-Trial Safeguard
   A. The Missing Witness Instruction
   B. The Missing Algorithm Instruction
Conclusion

INTRODUCTION

Algorithmic policing is on the rise. Investigative tools like facial recognition, DNA genotyping, and predictive policing systems are increasingly--and effectively--being marketed by private technology companies as the best way to police efficiently under tight budget constraints. (1) But because algorithmic systems are built by humans, they exhibit human fallibilities, including racial and gender bias, inconsistency, and error. (2) Despite their similarities, algorithmic systems and humans are treated differently when used against a criminal defendant in court. Humans are subject to adversarial scrutiny; algorithmic systems are not. (3) The difference is attributable to trade secrecy, a form of intellectual property protection designed to maintain "standards of commercial ethics" and "encourage[] invention." (4) When invoked in the criminal context, however, trade secrecy shields algorithmic systems from adversarial scrutiny--even when the Constitution mandates it. (5)

It is perplexing that algorithmic systems receive heightened protection as intellectual property given that defendants' due process rights do not change simply because the investigatory source is digital, not human. This is particularly troubling given the government's affirmative duty under Brady v. Maryland to disclose helpful evidence to the criminal defendant. (6) The Brady obligation, established by the Supreme Court to ensure that the defendant receives a full and fair trial under the Due Process Clause, requires that the prosecutor turn over any information "favorable" to the defendant that is in the prosecutor's constructive "possession," so long as that evidence is "material" to the defendant's case. (7) Favorable information includes both impeachment (8) and exculpatory (9) evidence. The prosecutor must actively search for favorable evidence not just in their own possession, but also in the possession of any member of the "prosecution team." (10) The prosecutor's duty to search and disclose is "ongoing." (11)

However, prosecutors need only disclose favorable evidence that is "material." The standard for determining materiality may differ depending on whether a prosecutor's Brady compliance is evaluated before or after the trial's conclusion. The post-trial materiality standard is well established. A defendant who raises a Brady challenge after trial must show prejudice--that there is a "reasonable probability" disclosure of the evidence would have resulted in a different verdict. (12) The pretrial materiality standard, however, lacks uniformity. Some courts have imported the post-trial prejudice requirement into the pretrial standard, while others have rejected it and require only that the evidence be favorable. (13) Those courts with a mirrored pretrial standard therefore allow prosecutors to determine whether evidence is Brady material based on their own predictions of the eventual trial's outcome. The pretrial prejudice requirement thus makes it easy for prosecutors to manipulate Brady's materiality threshold; prosecutors can cobble together any number of arguments that disclosing evidence would not affect the ultimate verdict. Thus, to maintain the central tenet of Brady--that defendants be treated fairly as they face the machinery of the state--defendants should not be required to show prejudice before trial to obtain favorable evidence. (14)

This Note argues that for purposes of Brady disclosures, courts should view law enforcement algorithms as analogous to human witnesses and should accordingly implement an at-trial "missing algorithm" remedy when trade secrecy is invoked. Part I provides the factual and legal background of law enforcement algorithms and trade secrecy protections to place the question in context. Part II analyzes Brady and its progeny and concludes that algorithmic information falls within a prosecutor's duty to disclose favorable material to the defendant. Part III proposes that courts adopt a missing algorithm rule, allowing juries to draw reasonable and limited inferences to safeguard defendants' due process rights when intellectual property protections limit their access to algorithmic evidence.

  I. LAW ENFORCEMENT ALGORITHMIC SYSTEMS AND TRADE SECRECY

    Machine-learning algorithms are increasingly executing government functions. One context in which algorithmic systems are proliferating is law enforcement and prosecution. (15) As public attention to automated decision-making has increased, terms like "algorithm," "machine learning," and "predictive policing" have become buzzwords, often used loosely and interchangeably. (16) Section I.A provides definitions of those terms and a high-level explanation of how an "algorithmic system" works. Section I.B summarizes U.S. trade secrecy law and explains which components of an algorithmic system can be shielded under trade secret protections.

    A. What Are Algorithmic Systems?

      To understand what a law enforcement algorithmic system is, it is necessary to first unpack what an algorithmic system is. An algorithmic system involves several separate technical components. As the term suggests, the foundation of the system is the "algorithm," a specified series of logical steps used to accomplish some task. (17) The algorithm is operationalized by source code. Source code is a series of letters, numbers, and punctuation that give the computer instructions on how to act in accordance with the algorithm. (18) Systems vary greatly in how many lines of source code they contain.
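      To make the algorithm/source code distinction concrete, consider a minimal sketch in Python (purely illustrative; the watchlist rule and every name below are invented for this example and are not drawn from any actual law enforcement system). The algorithm is the logical rule; the source code is one way of writing that rule so a computer can execute it:

        # Illustrative only: a trivial "algorithm" -- report whether a license
        # plate appears on a watchlist -- operationalized as Python source code.
        WATCHLIST = {"ABC1234", "XYZ9876"}  # invented example data

        def plate_is_flagged(plate: str) -> bool:
            """Return True if the normalized plate is on the watchlist."""
            return plate.strip().upper() in WATCHLIST

        print(plate_is_flagged("abc1234"))  # True
        print(plate_is_flagged("QQQ0000"))  # False

      Even this toy example shows that the written code, not the abstract rule, determines the system's actual behavior--here, for instance, the choice to normalize capitalization before matching.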

      A "machine-learning" algorithm is an algorithm that is "taught" on training data to perceive patterns and to subsequently become better at discerning new patterns when exposed to new information. (19) Training data is a collection of examples from which the algorithm is instructed to extract logical rules. (20) "Verification" and "test" data sets are then used to score and refine the performance of the algorithm. (21)

      In addition, an algorithmic system requires inputs to produce a desired output. The input is the information fed into the algorithm, and the output is the information created by applying the algorithm to the input data--for example, whether a person's picture has a match in a facial recognition database. Though every algorithmic system requires some kind of input and produces an output, inputs and outputs vary widely depending on the purpose or design of the system and can also vary over time. (22)

      These technical components--the algorithm, training data, input, and output--are limited by policy decisions implemented by the designer or the user. (23) Indeed, policies, which are the result of human choices, "govern how both technical and human components of th[e] system should behave." (24) These policy limits might take the form of certain prohibitions built into the algorithm, or they might instruct the user on how to act upon or interpret an algorithm's output. (25)
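      A hypothetical sketch of such a policy limit in code (the confidence threshold and the human-review requirement below are assumptions invented for illustration, not any agency's actual rule):

        # Hypothetical policy layer: suppress low-confidence matches and route
        # the remainder to a human analyst rather than acting on them directly.
        from dataclasses import dataclass

        @dataclass
        class Match:
            candidate_id: str
            confidence: float  # similarity score between 0.0 and 1.0

        MIN_CONFIDENCE = 0.90  # a policy choice made by humans, not the algorithm

        def matches_for_human_review(matches: list[Match]) -> list[Match]:
            """Return only matches strong enough to forward to an analyst."""
            return [m for m in matches if m.confidence >= MIN_CONFIDENCE]

      A wrapper of this kind illustrates that human policy choices, as much as the underlying mathematics, determine what the system ultimately does.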

      A "law enforcement algorithmic system," as used in this Note, is an algorithmic system that is used by government entities like police departments for surveillance, investigation, or prosecution purposes. (26) One increasingly common example of a law enforcement algorithmic system is facial recognition technology. (27) A facial recognition algorithm, broadly speaking, is trained to identify faces by analyzing images in a historical dataset. (28) A police officer or analyst can then input an image into the algorithm. (29) The output is a series of similar photos, usually with a probability ranking to denote the likelihood of a "match." (30) Depending on the policies in place, the police officer may or may not act upon the similar photos by finding and detaining any identified individuals.

      Each technical component of an algorithmic system involves human judgment. The sequential logical steps embodied by the algorithm and operationalized by the source code are written and designed by humans. (31) The training data is selected by humans. The input is chosen and the output is interpreted by humans. Thus, each component risks human error and human bias. (32)

      For example, the historical datasets on which facial recognition and other machine-learning algorithms are trained are often skewed by race and gender. (33) One dataset, deemed the "gold standard benchmark for face recognition," was found to be approximately 83.5 percent white and 77.5 percent male. (34) Unsurprisingly, researchers have found that algorithmic systems can be up to one hundred times more likely to misidentify African and Asian Americans than white individuals, (35) and are also more likely to misidentify women than men. (36) Illustrating this problem, one study tested Rekognition, Amazon's facial recognition tool, and found that it incorrectly identified twenty-eight members of Congress as individuals in a database of arrest photos. (37) Nonwhite members of Congress were disproportionately misidentified, accounting for about 40 percent of the false positives despite making up only about 20 percent of Congress. That false-positive rates are much higher for racial minorities is particularly concerning given that minorities are also more likely to be subjected to facial recognition searches for law enforcement purposes. (38)
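      The disproportion reported in that study can be checked with simple arithmetic (figures rounded from the reported results; the count of eleven nonwhite false matches is derived from the "about 40 percent" figure above):

        # Back-of-the-envelope check on the reported Rekognition results.
        false_positives = 28
        nonwhite_false_positives = 11        # roughly 40 percent of false matches
        nonwhite_share_of_congress = 0.20    # roughly 20 percent of members

        error_share = nonwhite_false_positives / false_positives
        print(f"{error_share:.0%} of false matches were nonwhite members")
        print(f"overrepresentation: {error_share / nonwhite_share_of_congress:.1f}x")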

      Training data...
