Code of Silence: How private companies hide flaws in the software that governments use to decide who goes to prison and who gets out.

By Rebecca Wexler

One day in early January, a letter appeared on my desk marked DIN92A5501, an inmate's identification number from the Eastern Correctional Facility in upstate New York. The author, Glenn Rodriguez, had drafted it in upright, even letters, perfectly aligned. Here, in broad strokes, is the story he told:

Rodriguez was just sixteen at the time of his arrest, and was convicted of second-degree murder for his role in an armed robbery of a car dealership that left an employee dead. Now, twenty-six years later, he was a model of rehabilitation. He had requested a transfer to Eastern, a maximum-security prison, in order to take college classes. He had spent four and a half years training service dogs for wounded veterans and eleven volunteering for a youth program. A job and a place to stay were waiting for him outside. And he had not had a single disciplinary infraction for the past decade.

Yet, last July, the parole board hit him with a denial. It might have turned out differently, the board explained, had a computer system called COMPAS not ranked him "high risk." Neither he nor the board had any idea how this risk score was calculated; Northpointe, the for-profit company that sells COMPAS, considers that information to be a trade secret. But Rodriguez may have been stuck in prison because of it.

Proprietary algorithms are flooding the criminal justice system. Machine learning systems deploy police officers to "hot spot" neighborhoods. Crime labs use probabilistic software programs to analyze forensic evidence. And judges rely on automated "risk assessment instruments" to decide who should make bail, or even what sentence to impose.

Supporters claim that these tools help correct bias in human decisionmaking and can reduce incarceration without risking public safety by identifying prisoners who are unlikely to commit future crimes if released. But critics argue that the tools disproportionately harm minorities and entrench existing inequalities in criminal justice data under a veneer of scientific objectivity.

Even as this debate plays out, the tools come with a problem that is slipping into the system unnoticed: ownership. With rare exceptions, the government doesn't develop its own criminal justice software; the private sector does. The developers of these new technologies often claim that the details about how they work are "proprietary" trade secrets and, as a result, cannot be disclosed in criminal cases. In other words, private companies increasingly purport to own the means by which the government decides what neighborhoods to police, whom to incarcerate, and for how long. And they refuse to reveal how these decisions are made--even to those whose life or liberty depends on them.

The issue has been percolating through criminal proceedings for years. I work for the Legal Aid Society of New York City defending criminal cases that involve computer-derived evidence. I regularly see defendants denied information that they could use to cross-examine the evidence against them because it's a trade secret.

Right now, in Loomis v. Wisconsin, the U.S. Supreme Court is deciding whether to review the use of COMPAS in sentencing proceedings. Eric Loomis pleaded guilty to fleeing a traffic officer and driving a car without the owner's permission. When COMPAS ranked him "high risk," he was sentenced to six years in prison. He argued that using the system to sentence him violated his constitutional rights because it penalized him for being male. But Northpointe refuses to reveal how its model weights sex in calculating the score.

We do know certain things about how COMPAS works. It relies in part on a standardized survey, where some answers are self-reported and others are filled in by an evaluator. Those responses are fed into a computer system that produces a numerical score. But Northpointe considers the weight of each input, and the predictive model used to calculate the risk score, to be trade secrets. That makes it hard to challenge a COMPAS result. Loomis might have been penalized because of his sex, and that penalty might have been unconstitutional. But as long as the details are secret, his challenge can't be heard.
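To make that concrete, here is a minimal sketch of how a survey-based risk score of this general kind might work. Every question name, weight, and cutoff below is invented for illustration; Northpointe's actual inputs, weights, and model remain secret, which is exactly the problem.

```python
# A hypothetical survey-based risk scorer, loosely in the spirit of tools
# like COMPAS. All questions, weights, and cutoffs here are invented for
# illustration; the real model is a trade secret.

# Survey answers: 1 = "yes", 0 = "no" (some self-reported, some entered
# by an evaluator).
answers = {
    "prior_violations": 0,
    "unstable_housing": 0,
    "substance_abuse": 1,
    "negative_peer_associations": 0,
}

# Hidden weights: how much each answer moves the score. Because these are
# secret, a defendant cannot tell which answers drive the result.
weights = {
    "prior_violations": 3.0,
    "unstable_housing": 1.5,
    "substance_abuse": 1.0,
    "negative_peer_associations": 4.0,
}

def risk_score(answers, weights):
    """Weighted sum of yes/no survey answers; higher means 'riskier'."""
    return sum(weights[q] * a for q, a in answers.items())

def risk_band(score, high_cutoff=4.0):
    """Map the numeric score to the coarse label the decision-maker sees."""
    return "high risk" if score >= high_cutoff else "low/medium risk"

score = risk_score(answers, weights)
print(score, risk_band(score))  # 1.0 low/medium risk
```

Even in this toy version, whether a single "yes" tips the label from low to high depends entirely on a weight and a cutoff that a defendant is never allowed to see.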

What surprised me about the letter from Eastern was that its author could prove something had gone very wrong with his COMPAS assessment. The "offender rehabilitation coordinator" who ran the assessment had checked "yes" on one of the survey questions when he should have checked "no." Ordinarily, without knowing the input weights and predictive model, it would be impossible to tell whether that error had affected the final score. The mistake could be a red herring, not worth the time to review and correct.
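If the weights and cutoff were disclosed, checking whether a single mis-entered answer mattered would take one comparison: rescore with the answer corrected and see whether the risk band changes. A hedged sketch, continuing the invented model above:

```python
def answer_matters(answers, weights, question, high_cutoff=4.0):
    """Return True if flipping one yes/no answer changes the risk band.
    Doable only when the weights and cutoff are actually known."""
    corrected = dict(answers)
    corrected[question] = 1 - corrected[question]  # flip 1 ("yes") <-> 0 ("no")
    return (risk_band(risk_score(answers, weights), high_cutoff)
            != risk_band(risk_score(corrected, weights), high_cutoff))
```

Without the model, no such check is possible, and the only recourse is the kind of black-box comparison Rodriguez describes next: find assessments that match yours on every answer but one and see whether the scores diverge.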

Glenn Rodriguez had managed to work around this problem and show not only the presence of the error, but also its significance. He had been in prison so long, he later explained to me, that he knew inmates with similar backgrounds who were willing to let him see their COMPAS results. "This one guy, everything was the same except question 19," he said. "I thought, this one answer is changing everything for me." Then another inmate with a "yes" for that question was...
