CAN ALGORITHMS RUN THINGS BETTER THAN HUMANS? WELCOME TO THE RISE OF THE ALGOCRACY.

By Ronald Bailey

POLICE IN ORLANDO, Florida, are using a powerful new tool to identify and track folks in real time. Video streams from four cameras located at police headquarters, three in the city's downtown area, and one outside of a recreation center will be processed through Amazon's Rekognition technology, which has been developed through deep learning algorithms trained using millions of images to identify and sort faces. The tool is astoundingly cheap: Orlando Police spent only $30.99 to process 30,989 images, according to the American Civil Liberties Union (ACLU). For now the test involves only police officers who have volunteered for the trial.
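For a sense of how little engineering such a system requires, the comparison step Rekognition performs can be invoked with a few lines of code. The sketch below is purely illustrative and is not Orlando's actual pipeline: it assumes AWS credentials are already configured, uses two hypothetical local image files, and calls the CompareFaces operation in Amazon's boto3 SDK, which returns candidate matches with similarity scores.

    # Illustrative sketch only -- not Orlando's deployment. Assumes AWS
    # credentials are configured and the two JPEG files below exist locally.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("probe_face.jpg", "rb") as probe, open("watchlist_face.jpg", "rb") as ref:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": probe.read()},   # face pulled from a video frame
            TargetImage={"Bytes": ref.read()},     # face the system is looking for
            SimilarityThreshold=80.0,              # drop matches below 80% similarity
        )

    # Each match carries a similarity score; where this threshold is set
    # determines how often innocent lookalikes get flagged.
    for match in response["FaceMatches"]:
        print(f"Possible match, similarity: {match['Similarity']:.1f}%")

The choice of threshold matters: set it low and the system flags more innocent people; set it high and it misses more of the people it is looking for.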

But the company has big plans for the program. In a June meeting with Immigration and Customs Enforcement (ICE), Amazon Web Services pitched the tech as part of a system of mass surveillance that could identify and track unauthorized immigrants, their families, and their friends, according to records obtained by the Project on Government Oversight.

Once ICE develops the infrastructure for video surveillance and real-time biometric monitoring, other agencies, such as the FBI, the Drug Enforcement Administration, and local police, will no doubt argue that they should be able to access mass surveillance technologies too.

Amazon boasts the tool is already helping with everything from minimizing package theft to tracking down sex traffickers, and the company points to its terms of use, which prohibit illegal violations of privacy, to assuage fears.

As impressive as Rekognition is, it's not perfect. The same ACLU report found that a test of the technology erroneously matched 28 members of Congress with criminal mugshots. Being falsely identified as a suspect by facial recognition technology, prompting police to detain you on your stroll down a street while minding your own business, would annoy anybody. Being mistakenly identified as a felon who may be armed would put you in danger of aggressive, perhaps fatal, police intervention.

Are you willing to trust your life and liberty to emerging algorithmic governance technologies such as Rekognition? The activities and motives of a police officer or bureaucrat can be scrutinized and understood by citizens. But decisions made by ever-more-complex algorithms trained on vast data sets likely will become increasingly opaque and thus insulated from public oversight. Even if the outcomes seem fair and beneficial, will people really accept important decisions about their lives being made this way--and, as important, should they?

ENTER THE WITNESS

IN NICK HARKAWAY'S gnarly near-future science fiction novel Gnomon, Britain is protected by "the perfect police force"--a pervasive yet apparently benign total surveillance state called the Witness. "Over five hundred million cameras, microphones and other sensors taking information from everywhere, not one instant of it accessed initially by any human being," explains the narrator. "Instead, the impartial, self-teaching algorithms of the Witness review and classify [the inputs] and do nothing unless public safety requires it....It sees, it understands, and very occasionally it acts, but otherwise it is resolutely invisible."

When it comes to crime, the Witness identifies incipient telltale signs of future illegal behavior and then intervenes to prevent it. The system "does not take refuge behind the lace curtain of noninterference in personal business....Everyone is equally seen." The result is that it delivers "security of the self to citizens at a level unprecedented in history," and "all citizens understand its worth."

The Witness is a fictional example of what National University of Ireland Galway law lecturer John Danaher calls algocracy--algorithmic governance that uses data mining and predictive/descriptive analytics to constrain and control human behavior. (Broadly speaking, an algorithm is a step-by-step procedure for solving a problem or accomplishing a goal. A mundane example is a recipe for baking a cake.)

The exponential growth of digital sensors, computational devices, and communication technology is flooding the world with data. To make sense of all this new information, Danaher observes, humans are turning to the impressive capabilities of machine-learning algorithms to facilitate data-driven decision making. "The potential here is vast," he writes. "Algorithmic governance systems could, according to some researchers, be faster, more efficient and less biased than traditional human-led decision-making systems."

Danaher analogizes algocracy to epistocracy--that is, rule by the wise. And epistocracy is not too dissimilar from the early 20th-century Progressive idea that corruptly partisan democratic governance should be "rationalized," or controlled by efficient bureaucracies staffed with objective and knowledgeable experts.

If rule by experts is good, wouldn't rule by impartial, infallible computers be better? "Bureaucracies are in effect algorithms created by technocrats that systematize governance," argues James Hughes, executive director of the Institute for Ethics and Emerging Technologies. "Their automation simply removes bureaucrats and paper."

Of course, what makes the Witness potent is that when its ever-watchful algorithms spot untoward behavior, they can...
