CODING OVER THE CRACKS: PREDICTIVE ANALYTICS AND CHILD PROTECTION

Author: Glaberson, Stephanie K.

TABLE OF CONTENTS

Introduction
I. Child Protective Decision-Making
   A. Decision Points in a Child Protective Case
   B. Flaws in Human Decision-Making
      1. The Social Science of Agency Decision-Making
      2. Disproportionality
      3. Foster Care "Panics"
   C. Decision-Making Models Employed by Child Welfare Systems
II. Predictive Analytics in Child Welfare
   A. Predictive Analytics: An Overview
   B. The Current State of Affairs
      1. The Allegheny Family Screening Tool
      2. Eckerd Rapid Safety Feedback
III. Assessing the Models
   A. Accuracy
      1. Flaws in the Data: Garbage In, Garbage Out
      2. Error in the Model: Not All Errors Are Created Equal
      3. Zombie Predictions
   B. Fairness
      1. Bias In, Bias Out
      2. Privacy Concerns
      3. Transparency
   C. Misuse
      1. Acknowledging Limitations
      2. The Human Role
IV. Preliminary Recommendations
   A. Prioritize Transparency
   B. Create Enforceable Legal Constraints and Guidance
   C. Confront Bias
   D. Focus on Prevention
Conclusion

INTRODUCTION

Child welfare authorities across the nation are engaged in a high-stakes project every day: they must predict which children might be at risk of harm in their homes, and whether and when authorities can, or should, intervene. Erroneous predictions carry grave consequences. Agency failures to intervene that result in harm to children at the hands of their caregivers are highly publicized. (1) Less publicized, but equally grave and much more common, are harms that result from unnecessary agency interventions, both those that separate children from their families and those that do not, each of which risks inflicting lifelong trauma. (2) This risk-prediction project relies heavily on human decision-makers, who often handle crushing caseloads under high stress, with little training, limited time, and imperfect information. Bias inevitably creeps into these vital human-powered decisions, resulting disproportionately in the breakup of poor families and families of color. (3)

In this age of automation and artificial intelligence, a tempting new prospect has emerged: using the "magic" of predictive analytics to forecast whether and when children need the government to intervene. Algorithmic prediction systems already are impacting many areas of life, from employment, to college admissions, to tax audits, to the criminal justice system. (4) State and local governments are using algorithmic models to attempt to predict future dangerousness of alleged offenders in setting terms of release or bail, to forecast likely recidivism through sentencing prediction tools, and to project likely probation violations. (5) Additionally, law enforcement agencies nationwide are attempting to use "predictive policing" to "stop[] crime before it happens," modeling not only where and how to allocate resources, but increasingly whom to police. (6)

Child protective authorities are now dipping their toes into these waters as well. Agencies are building and deploying tools that pull together vast quantities of data stored by various government entities, and return a "risk score" purporting to predict the likelihood of child maltreatment. (7) When constructing these predictive tools, developers must make myriad complex, value-laden, and ultimately human decisions that touch on some of the most fundamental--and unanswered--questions in child welfare policy. (8) These questions range from how to define child maltreatment, to how much risk we are willing to tolerate, to how we value different types of error, to how we address bias and disproportionality in child welfare practices. As Cathy O'Neil, author of Weapons of Math Destruction, wrote, "models are opinions embedded in mathematics." (9)
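To make the mechanics concrete, the sketch below shows, in Python, one simplified way such a risk-scoring tool could be built: a statistical model trained on administrative records that converts a predicted probability into a screening score. The feature names, the synthetic data, the choice of logistic regression, and the 1-to-20 score scale are all illustrative assumptions; they do not describe any deployed system.

```python
# Illustrative sketch only: a toy "risk score" model of the kind described above.
# The features, synthetic data, and 1-20 scale are hypothetical assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical administrative features pulled together from agency records,
# e.g., count of prior hotline referrals, public-benefit history, prior placement.
X = np.column_stack([
    rng.poisson(1.0, n),       # prior_referrals
    rng.integers(0, 2, n),     # household_received_public_benefits (0/1)
    rng.integers(0, 2, n),     # prior_out_of_home_placement (0/1)
])
# Hypothetical training label: whether a re-referral occurred within two years.
y = rng.integers(0, 2, n)

model = LogisticRegression().fit(X, y)

def risk_score(features):
    """Convert the model's predicted probability of the labeled outcome
    into a 1-20 screening score (an assumed scale)."""
    prob = model.predict_proba(np.asarray(features).reshape(1, -1))[0, 1]
    return max(1, int(np.ceil(prob * 20)))

# A call screener would see only the resulting score, e.g.:
print(risk_score([3, 1, 1]))
```

Even in this toy version, every choice is a human one: which records count as features, what outcome serves as the "maltreatment" label, and how probabilities are binned into a score are precisely the kinds of value-laden decisions described above.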

This Article argues that the advent of predictive analytics risks simply coding over the cracks in the foundation of our child welfare system. Unless careful attention is paid to the assumptions, biases, and realities of our child welfare system at this critical juncture, algorithmic decision-making risks perpetuating and magnifying existing problems. The Article proceeds in four parts. Part I describes the "decision points" that a child welfare agency encounters as a child or family moves through the system. It considers the evidence showing that this human-based decision-making system is deeply flawed. Part II defines predictive analytics and explains the fundamentally human process of developing a machine learning algorithm. This Part also identifies those jurisdictions that are implementing or considering instituting predictive risk tools, highlighting those developed by Allegheny County, Pennsylvania and the Florida non-profit, Eckerd Kids. Part III analyzes the risks posed by predictive analytics as they are introduced into child protective decision-making. It addresses the ways in which predictive analytics can introduce error into decisions, further embed old prejudices and biases into new systems, and generate new risks for children and families. Finally, acknowledging that predictive analytics already are taking hold, Part IV provides some preliminary recommendations for ways in which advocates, scholars, and officials can attempt to ensure that any predictive tools employed where they work and live are developed responsibly and in line with community values.

I. CHILD PROTECTIVE DECISION-MAKING

    To assess the role predictive analytics may play in child welfare, and the risks and benefits such tools offer, it is important to review the decisions that child welfare authorities are called upon to make in the course of a child protective case. This Part first will provide an overview of the various "decision points" that occur throughout a child protective case. It then will discuss the methods agencies generally use to arrive at decisions and conclude by reviewing the evidence that the current system of decision-making is flawed and must be improved. (10)

A. Decision Points in a Child Protective Case

      As children and their families move through the child protective system, moments arise when individuals, agencies, and courts are called upon to make vital decisions. Oft-discussed examples include whether to report suspected maltreatment or whether a child should be removed from or returned to his or her home, but there are a multitude of other decisions made along the way. This Article refers to each of these moments as "decision points." (11) At each of these points, the quality of the decision may have grave consequences for the life of the child or family.

      In most instances, child welfare authorities begin their involvement in the life of a family when someone makes a phone call to report suspected abuse or neglect to state authorities. (12) The decision whether to report, therefore, is the first decision point in the process. Many states permit community reporters to remain anonymous. (13) State law often mandates that professionals who work closely with children and families, like teachers, social workers, and doctors, must report child abuse or neglect if they become aware of it. (14) In many places, the laws obligating mandated reporters to take action are vague and quite broad. (15)

      Regardless of whether a mandated reporter or some other individual places the call, it goes to the same place: a centralized call center, (16) where a hotline worker screens the call. (17) This brings us to the second decision point: the call screener must decide whether the allegations of abuse or neglect warrant an investigation. Call screeners usually have latitude to "screen in" the call, meaning to forward it to a local child protective office for investigation, to simply take note of the call and add it to a central file kept on the child or family, or to reject the call altogether. (18)

      If the worker "screens in" the call, he or she refers the family to a local agency office for an investigation. (19) Generally, the agency has a specific amount of time in which to complete its investigation and make a determination as to whether the allegations of abuse or neglect are supported or warrant further intervention. (20) This is the third decision point. Different jurisdictions use different terms for the determinations that can result at this stage, some marking cases "indicated" or "unfounded," others marking them "substantiated" or "unsubstantiated." (21) Regardless of the terms used, the legal standard of proof to justify government intervention generally is low. (22) In many states, if the investigation uncovers "some credible evidence" that the child is at risk of harm, the case worker may "substantiate" the report. (23) If the investigating team determines that the report is "unfounded," the case is closed, and--at least in theory--the allegations are put to rest. (24)

      If a report is "substantiated," the agency has some discretion in how it chooses to move forward. (25) Federal law requires that state agencies provide "reasonable efforts" to "prevent or eliminate the need for removing the child from the child's home." (26) States put this requirement into action in a variety of ways, including providing "preventive services" to families, where appropriate. For some families, the investigating worker might meet with the parents, offer support such as home cleaning, counseling, or other services, and conclude the interaction. For others, the agency will remain involved in the family's life for months or even years through an informal service relationship. (27) In either case, the availability of preventive services that meet the needs of the family will be important to ensuring positive outcomes. In many cities and states, however, preventive services that meet the needs of the community may not be readily available, or there may be long waitlists for enrollment. (28) In these services-only cases, the agency will encounter one more decision point: whether and when to close the case and cease its...
