The Racist Algorithm?

By Anupam Chander
Book Review

THE BLACK BOX SOCIETY: THE SECRET ALGORITHMS THAT CONTROL MONEY AND INFORMATION. By Frank Pasquale. Cambridge and London: Harvard University Press. 2015. Pp. 218. $35.

INTRODUCTION

A pie chart satirizing Google's research and development expenditures imagines a largely tripartite division: omniscience, omnipresence, and omnipotence. (1) Uber offers its staff what it calls "God View," a real-time view of where all its users are going in a city. (2) In his new book, The Black Box Society: The Secret Algorithms That Control Money and Information, Frank Pasquale (3) worries about the efforts of Silicon Valley companies to create a god in the machine. In Pasquale's forceful telling, the pie chart is not satire, but rather audacious ambition; he quotes Google's cofounder Sergey Brin, "[T]he perfect search engine would be like the mind of God" (p. 187).

We are increasingly living in a Matrix that most of us do not perceive. Pasquale is our Neo, compelling us to see the invisible digital overlords surrounding us. Pasquale does not allege the near coming of some dystopian fantasy of a networked Borg entity keen to enslave the galaxy or a Skynet bent on terminating humanity. Rather, he worries that decisionmaking on everything, from credit to employment to investments to even dating, is passing from humans to computers. (4) And these computers are remote and invisible, their algorithms protected from scrutiny by trade secret law, invisibly and relentlessly manipulating us for the benefit of corporate profit or worse (pp. 6-14). Pasquale shows that corporations often rebuff efforts to examine the algorithms they employ, and the law abets corporations in this task (Chapter Five).

Pasquale is part of a line of recent scholarship attacking the increasing role of automated algorithms in our lives--indeed, legal scholars are increasingly sounding the alarm on this unfettered algorithmic control. Jonathan Zittrain worries that a company like Facebook could even decide an election without anyone ever finding out. (5) Ryan Calo warns that companies may be manipulating us through advertising. (6) Call this the problem of algorithmic manipulation. (7)

I will argue that despite his careful and important account, Pasquale's "black box society" frame lends itself to a misdiagnosis of the discrimination problem likely to lie in algorithmic decisionmaking. This misdiagnosis leads to the wrong prescription--namely, an often-quixotic search for algorithmic transparency. (8) Furthermore, the transparency that Pasquale's argument can be read to support is the wrong sort: a transparency in the design of the algorithm. (I should make clear that Pasquale himself is more nuanced, calling for a discussion of the kinds of transparency we should demand; he asks: "How much does the black box firm have to reveal? To whom must it reveal it? And how fast ...?" (p. 142).) Even a transparent, facially neutral algorithm can still produce discriminatory results. (9) What we need instead is a transparency of inputs and results, which allows us to see that the algorithm is generating discriminatory impact. If we know that the results of an algorithm are systematically discriminatory, then we know enough to seek to redesign the algorithm or to distrust its results. The distinction is similar to the evidentiary difference between demonstrating disparate treatment and demonstrating disparate impact. (10) My central claim is this: if we believe that the real-world facts, on which algorithms are trained and operate, are deeply suffused with invidious discrimination, then our prescription for the problem of racist or sexist (11) algorithms is algorithmic affirmative action. Thus, the problem is not the black box, which is often more neutral than the human decisionmaker it replaces, but the real world on which it operates. We must design our algorithms for a world permeated with the legacy of discriminations past and the reality of discriminations present.
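
To make the distinction concrete, the sketch below is my own illustration, not drawn from the review: it shows what a "transparency of inputs and results" might look like in practice, where an auditor compares an algorithm's outcomes across groups in the spirit of a disparate-impact analysis rather than inspecting its code. The Python function names, the toy data, and the 0.8 threshold (borrowed from the familiar four-fifths rule of thumb) are illustrative assumptions only.

    # Illustrative only: audit an algorithm's results, not its design.
    # 1 = approved, 0 = denied; group labels "A" and "B" are hypothetical.

    def approval_rate(decisions, groups, group):
        """Share of applicants in `group` whom the algorithm approved."""
        in_group = [d for d, g in zip(decisions, groups) if g == group]
        return sum(in_group) / len(in_group) if in_group else 0.0

    def disparate_impact_ratio(decisions, groups, protected, reference):
        """Protected group's approval rate divided by the reference group's.
        A ratio well below 0.8 is a common red flag for disparate impact."""
        return (approval_rate(decisions, groups, protected) /
                approval_rate(decisions, groups, reference))

    # Hypothetical outputs of a facially neutral model.
    decisions = [1, 1, 1, 1, 0, 0, 1, 0, 1, 1]
    groups = ["A", "A", "B", "B", "A", "A", "B", "A", "B", "B"]

    ratio = disparate_impact_ratio(decisions, groups, protected="A", reference="B")
    print(f"Disparate impact ratio: {ratio:.2f}")  # 0.40 here, a signal of disparate impact

The point of the sketch is only that such an audit requires access to the algorithm's inputs and outputs, not to its source code.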

The importance of getting this right is clear. Facebook now owns a patent on a process by which a user can be denied a loan because of the creditworthiness of his or her friends. (12) IBM purports to offer an algorithm that can distinguish refugee from terrorist, "the sheep from the wolves." (13) Retailers can increasingly target certain shoppers for discounts. (14) Law enforcement officers are using "predictive policing" algorithms to identify "hot people" who might have a greater propensity to commit crime. (15) Judges are employing algorithms in sentencing. (16) As Pasquale describes, Google's search and Facebook's presentation algorithms determine what information we see (p. 82). The possibilities of discriminatory manipulation are legion. (17) Pasquale worries that the rise of algorithmic decisionmaking will make racism and other discrimination even more difficult to ferret out, hidden behind subtle manipulations that are nearly impossible to discern for ordinary citizens not privy to the internal computer code (p. 38). "It [c]reates [i]nvisible [p]owers," he warns (p. 193).

Pasquale's warning comes at a time when the #BlackLivesMatter campaign and other recent events have made the reality of racial- and gender-based discrimination in our society painfully clear. (18) In one famous experiment, job applicants with white-sounding names, such as Emily, received 50 percent more callbacks than those with African American-sounding names, such as Lakisha. (19) A study of emails sent to mortgage loan originators asking for loans found that African American--sounding names effectively reduced an applicant's credit score by 71 points (on a scale going up to 750). (20) A 2012 federal government report found that both African Americans and Asian Americans were shown 17.7 percent fewer homes than equally qualified white Americans. (21) A 2015 federal government study found yet other invidious discrimination in housing: housing providers tell deaf or hard-of-hearing homeseekers about fewer units than similar homeseekers who are not deaf. (22) Homeseekers who use wheelchairs are more likely to be denied an appointment to view rental housing in buildings with accessible units, and when given an appointment, are less likely to be shown suitable housing units than homeseekers who are ambulatory. (23) In a society where discrimination affects opportunities in innumerable ways, we must worry about the migration of discrimination to decisionmaking by algorithm.

This Review proceeds as follows. Part I reviews Pasquale's argument that our emerging black box society will increase discriminatory manipulations. It argues that, contrary to Pasquale's argument, instead of seeing algorithms as likely to increase intentional discrimination, the law has turned to algorithms to reduce the invidious discriminations that result from human decisionmakers with unfettered discretion. Through the example of sentencing guidelines, this Part demonstrates that law has preferred highly specified algorithmic decisionmaking in order to reduce the discriminatory results of open-ended human judgment. Part II argues that because of the real-world discrimination upon which the algorithms learn and operate, discrimination is still likely to emerge from automated algorithms that are designed in racially or gender-neutral fashion. Part III introduces the remedy of algorithmic affirmative action to combat the problem of viral discrimination--designing algorithms in race- and gender-conscious ways to account for existing discrimination lurking in the data.

I. ALGORITHMIC MANIPULATION

Pasquale deploys two striking Platonic metaphors to illustrate his concerns. First, he sees the data industry as wearing a ring of invisibility: "Black box insiders are protected as if they are wearing a Ring of Gyges--which grants its wearers invisibility but, Plato warns us in The Republic, is also an open invitation to bad behavior" (p. 190). Second, Pasquale posits the rest of us ordinary people as prisoners in Plato's allegory of the cave, forced to stare at a stony wall showing only the "flickering shadows cast by a fire behind them" (p. 190). Pasquale concludes:

[We prisoners in the cave] cannot comprehend the actions, let alone the agenda, of those who create the images that are all [we] know of reality. Like those who are content to use black box technology without understanding it, [we] can see mesmerizing results, but [we] have no way to protect [ourselves] from manipulation or exploitation (p. 190).

Given the persistence of widespread racial and gender discrimination in the twenty-first century, should we not expect algorithms, often programmed by racist and sexist programmers, to manipulate us toward accepting racist and sexist decisions? Are programmers likely to manipulate algorithms to exacerbate existing discrimination in society? For a half-dozen reasons, I believe the answer is no. (Pasquale, I should note, does not suggest either rogue programmers or malign bosses, but the concern about algorithmic manipulation might be interpreted that way.)

First, because much of societal discrimination is subconscious or unconscious, it is less likely to be encoded into automated algorithms than into the human decisionmakers that the algorithms replace. (24) Much recent research into racial bias has moved toward exposing its existence without focusing on whether it is conscious or not. Implicit association testing has revealed the prevalence of bias across the community. (25) As Jerry Kang writes, "[W]e may all be infected in ways we cannot admit, even to ourselves." (26) Research focused on implicit bias often posits that the bias is unconscious. (27) The Supreme Court in 2015 recognized that "unconscious prejudices" can motivate discrimination. (28) Unconscious discrimination is far less likely to manifest itself through the process of programming than through the process of decisionmaking. Programming requires a step-by-step writing process that depends on a conscious...
