We Don't All Look the Same: Police Use of Facial Recognition and the Brady Rule.

Date: 01 April 2022
Author: Brown, Jaylla

TABLE OF CONTENTS

I. INTRODUCTION
II. BACKGROUND
   A. What Is Facial Recognition?
   B. Problems with Facial Recognition
      1. Operational Flaws
      2. Algorithmic Flaws
   C. Problems with Law Enforcement Use of Facial Recognition
   D. Facial Recognition in Courts: Lynch v. Florida
   E. What Is the Brady Rule?
III. FACIAL RECOGNITION EVIDENCE IS BRADY MATERIAL FOR A MISIDENTIFICATION DEFENSE
   A. Evidence of Poor Operating Choices Taken by Police Departments when Using Facial Recognition Qualifies as Brady Material
      1. Evidence Indicating Poor "Human Review"
      2. Evidence of Police Overreliance on Facial Recognition Technology
   B. Evidence of Poor Algorithmic Quality Constitutes Brady Material
      1. The Name of the Algorithm
      2. Other Matches Produced by the Algorithm
      3. The Confidence Scores of Other Matches Produced
      4. The Probe Photo Used to Conduct the Search
   C. Facial Recognition Ensures Fair Treatment: It Is Not a Governmental Burden
IV. CONCLUSION

I. INTRODUCTION

Julia and Rosie Williams, two sisters from Farmington Hills, Michigan, were two and five years old when they watched their father be wrongfully arrested in their front yard on January 9, 2020. The girls looked on in tears as their father pulled into the driveway and was immediately handcuffed by police. Thirty hours would pass before the Williams sisters saw their father again. When he returned, he told his family that he had been arrested because of a computer error.

Their father was brought downtown to the police station to be questioned by detectives in a small room. There, the detectives showed him two grainy stills taken from surveillance footage and a picture of his previous driver's license. When he told the detectives that the man in the surveillance photos was not him, a detective responded, "I guess the computer got it wrong too?" The father took a picture from the surveillance footage, held it next to his face, and said, "I hope you don't think all Black people look alike." Despite his protest, Mr. Williams was detained and later released on bail. Luckily, his case was dismissed at his arraignment hearing because a second witness had not identified Mr. Williams as the perpetrator.

Notwithstanding the dismissal of Mr. Williams' case, his daughters still live with the trauma of seeing their father arrested for a crime he did not commit based on flawed facial recognition technology. But what would have happened if their father had gone to trial? Would he have had access to the evidence he needed to defend himself in court? How would his lawyer have built a case without any knowledge of the system that misidentified her client? Would the prosecutor have been kind enough to inform the defense of the role facial recognition played in identifying him?

All of these questions arise when police cannot identify who they saw perpetrating a crime, so they rely on facial recognition to help them identify an unknown face. (1) While investigating a crime, the police can photograph a suspect and then use facial recognition to search that image against a database of mugshots and driver's licenses to help them identify that suspect by name. (2)

The fallible nature of facial recognition makes it particularly dangerous when used by law enforcement. Police sometimes use this technology in a manner that can be likened to a "virtual line-up." (3) However, in this line-up, a human does not point to the suspect; an algorithm does. (4)

Many factors can influence the accuracy of this line-up. Most algorithms require human operation, so the operator's competence and lack of bias are crucial. (5) Additionally, there are factors that affect the accuracy of the algorithm itself. Facial recognition algorithms have higher rates of misidentification for Native Americans, African Americans, and Asian Americans. (6) They also have higher error rates for identifying women in comparison to men. (7) The highest error rates are most commonly seen in subjects who are female, Black, and eighteen to thirty years old. (8) Facial recognition technology performs worst on darker-skinned females, with an error rate as high as 34.7%. (9) The darker the skin, the more errors occur, and accuracy drops further where skin tone and gender intersect. (10)

Given the substantial risk of misidentification for women and Black people by facial recognition, defendants should be able to challenge these factors in order to argue that they have been falsely matched based on their race or gender. If the operation of a system or the algorithm itself is flawed, then the identification decision is flawed. If a defendant can produce evidence that exposes a faulty identification, they can argue that the system identified the wrong suspect. This is impossible if the defendant does not have access to that evidence. If the prosecution is aware of any materially exculpatory evidence for the accused, there is a constitutional obligation to disclose it. (11) But, if the prosecution fails to do so, the defense is handicapped. (12)

In Brady v. Maryland, the Supreme Court held that nondisclosure of exculpatory evidence to the defendant violates the Due Process Clause of the Fourteenth Amendment, which entitles defendants to the right to a fair trial. (13) Scholars have suggested that the Brady rule offers a doctrinal solution for access to facial recognition evidence. (14) However, this Note focuses specifically on Brady as a solution for defendants who have been misidentified by the technology based on their race or gender. These defendants are most likely to be misidentified by facial recognition and pursued as suspects by law enforcement. (15) The purpose of this Note is to demonstrate how evidence of racial or gender disparities impacting the accuracy of facial recognition technology qualifies as Brady material that the prosecution is obligated to disclose.

Despite defendants' need to access evidence about whether facial recognition was used in order to challenge its accuracy and to prevail on a misidentification defense, the Florida First District Court of Appeal ruled that defendants are not even entitled to view photos of other potential suspects identified by a facial recognition search that led to their arrest. (16) The court reasoned that because there is no reasonable probability the result of a trial would change if this evidence were disclosed to a defendant, the defendant has no right to disclosure under Brady. (17) This opinion comes from Lynch v. State, in which a Black man was ultimately sentenced to eight years in prison for selling cocaine in 2016. (18) Lynch planned to use other photos that the facial recognition software produced alongside his to prove that he had been misidentified. (19) He argued that since the other matches were also potential suspects returned by the system, they would cast doubt on his identification as the defendant. (20) The court rejected Lynch's argument, and he was never able to see the other photos produced by the system. (21)

The facial recognition system that identified Lynch, along with the pictures of four other potential suspects he was never able to see, is called the Face Analysis Comparison and Examination System (FACES). (22) The Pinellas County Sheriff's Office in Florida launched FACES in 2001, and it has since become one of the most advanced statewide facial recognition systems in the country. (23) In 2020, the agency indicated that it had no plans to discontinue FACES despite the recent criticism that police use of facial recognition technology has received. (24)

This Note will explain why police use of facial recognition technology for criminal identification should be defined as exculpatory evidence that prosecutors have a duty to disclose under Brady. Part II, Section A will explain what facial recognition is and how it works. Section B will outline the racially discriminatory implications underlying facial recognition systems. Section C will discuss how law enforcement uses facial recognition. Section D will detail the Lynch case, which illustrates how a Florida court has treated facial recognition as evidence in criminal court. Section E will explain what the Brady rule is. Part III will argue that facial recognition evidence qualifies as Brady material for minorities and women of color. Part III, Section A will explain why police misuse of facial recognition qualifies as Brady material for these defendants. Finally, Section B will explain why evidence of poor algorithm quality qualifies as Brady material.


II. BACKGROUND

A. What Is Facial Recognition?

      Facial recognition is a form of biometrics that was created in the mid-1960s. (25) Biometrics is a technical term for body measurements and calculations such as DNA and fingerprints. (26) Biometrics is used to compare one piece of information to a dataset in order to determine someone's identity. (27) Where biometrics could involve a fingerprint analysis--comparing one fingerprint against a database of fingerprints to find a match--facial recognition aims to verify a person's identity by comparing a face against a dataset of other faces to produce a match. (28) The face that is compared to the dataset is called a probe image, which can be sourced from a photograph or video. (29)
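The one-to-many comparison described above can be sketched in simplified form. In the toy example below, each "facial signature" is reduced to a short vector of numbers, and the system ranks every enrolled face by its distance to the probe. All names, vectors, and the match threshold are invented for illustration; real systems use high-dimensional learned representations, but the ranked-candidate-list behavior is the same.

```python
import math

def euclidean(a, b):
    """Distance between two facial-signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, gallery, threshold=0.25):
    """Rank every enrolled face by distance to the probe signature.

    Returns candidate matches (closest first) whose distance falls
    under the threshold -- analogous to the list of potential
    suspects a facial recognition system returns to an operator.
    """
    ranked = sorted(
        ((name, euclidean(probe, sig)) for name, sig in gallery.items()),
        key=lambda pair: pair[1],
    )
    return [(name, dist) for name, dist in ranked if dist < threshold]

# Toy gallery of enrolled signatures (entirely made-up numbers).
gallery = {
    "subject_a": [0.30, 0.52, 0.41],
    "subject_b": [0.35, 0.48, 0.46],
    "subject_c": [0.70, 0.20, 0.90],
}

probe = [0.32, 0.51, 0.42]  # signature extracted from the probe image
candidates = identify(probe, gallery)
```

Note that even in this sketch, the system does not return one certain answer: it returns a ranked list of candidates, which is why the other matches a search produces can matter to the defense.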

      Before the software can match someone's face to others in a given database, an algorithm is used to find the person's face within the reference image. (30) Then, the system reads the geometry of the face to determine key characteristics such as the distance between the eyes and the distance from the forehead to the chin. (31) Those characteristics make up a "facial signature" which is a mathematical formula that the system can understand. (32) After the facial signature is created, the system "normalizes" the face by scaling, rotating and aligning it to...
