Privacy's trust gap: a review.

Author: Richards, Neil
Position: Book review

Obfuscation: A User's Guide for Privacy and Protest

By Finn Brunton and Helen Nissenbaum. Cambridge and London: The MIT Press, 2015.

BOOK REVIEW

CONTENTS

INTRODUCTION
I. OBFUSCATION AND THE INDIVIDUALISTIC STORY OF PRIVACY
   A. The Appeal of Obfuscation
      1. Obfuscation's Argument
      2. A Pragmatic Rebuttal to Overly Simplistic Notions of Privacy
      3. The Need for Privacy Outside Trustworthy Relationships
   B. Privacy Islands
II. OBFUSCATION REQUIRES TRUSTWORTHY ALLIES
   A. Obfuscation's Call for a Lonely Revolution
   B. Obfuscation Requires Reliance on Designers
   C. Obfuscation Requires Cooperation from Confederates
III. OBFUSCATION AS SECOND-BEST PRIVACY
   A. Obfuscation Promotes Distrust
   B. Legal Reform Is Not Hopeless
IV. THE POTENTIAL OF TRUST
   A. A Theory of Privacy and Trust
   B. Privacy Problems from a Trust Perspective
   C. Promoting Trust in a Digital Society
CONCLUSION

INTRODUCTION

It can be easy to get depressed about the state of privacy these days. In an age of networked digital information, many of us feel disempowered by the various governments, companies, and criminals trying to peer into our lives to collect our digital data trails. (1) When so much is in flux, the way we think about an issue matters a great deal. Yet while new technologies abound, our ideas and thinking--as well as our laws--have lagged in grappling with the new problems raised by the digital revolution.

Reading between the lines in the debate over surveillance and data collection, it is easy to think that protecting privacy is all on you. Most privacy discussion is framed in individualistic terms. For example, we talk about an individual's "right to privacy" and whether that individual right has any meaning anymore. Policymakers fight for a person's "individual control" over personal information. Companies promise to give consumers "personal choice" to empower their personal preferences about how their information is collected, used, and shared. It is as though we are all islands, each waiting to exercise our individual ability to protect our privacy against those who would surveil us, whether they are private companies or government agents.

Thinking of privacy as an issue of personal choice, preferences, and responsibility has powerful appeal. It resonates with American ideals of individualism, democracy, and consumerism. It flatters our sense of autonomy and accommodates our diverse notions of privacy and preferences for disclosure. For instance, you might not want to broadcast the details of your life on Instagram or Snapchat, but others might. Individualistic notions of privacy lead us to favor solutions that help us choose and put us in control of our own unique lives.

Yet there is a problem with this view of the digital world, and it is a problem of power. In the digital economy, the real power is not held by individual consumers and citizens using their smartphones and laptops to navigate the twists and turns of their lives, but by the large government and corporate entities who monitor them. The digital consumer is not like the classic American myth of the cowboy, a rugged and resilient island of autonomy set against the backdrop of the digital frontier. On the contrary, she is increasingly disempowered, marginalized, and subject to monitoring and sorting by powerful institutions about whose existence she may not know, and whose activities she may not be able to resist. In the digital world, we may heap responsibility on individual users of technology, but they lack options for protecting themselves. (2) This is another form of the "digital divide"--it is not merely that some people have access to technology while others do not, but that most people are vastly less powerful than the government and corporate institutions that create and control digital technologies and the personal data on which those technologies run.

If the monitored are responsible for protecting themselves, one possible strategy is to obscure their tracks, thereby turning the digital realm into a big game of hide and seek. In their book Obfuscation: A User's Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum put forward a manifesto for the digitally weak and powerless, whether ordinary consumers or traditionally marginalized groups who lack the knowledge or means to effectively protect their digital lives from monitoring. (3) They tell us at the outset that "[w]e mean to start a revolution with this book. But not a big revolution--at least, not at first." (4) Brunton and Nissenbaum develop the idea of obfuscation, which they define as "the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection." (5) This can take many forms, but for consumers, it might include swapping their phone SIM cards with those of their friends or using software that buries genuine search engine queries in a crowd of irrelevant ones. (6) Brunton and Nissenbaum argue that obfuscation is necessary to counteract information power imbalances that occur "when data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand." (7)
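The search-query tactic the authors describe can be sketched in miniature. The following Python fragment is a hedged illustration only, not the implementation of any real tool (extensions such as TrackMeNot work far more elaborately, drawing decoys from live feeds); the decoy list and function name here are hypothetical. It shows the core obfuscation move: hiding one genuine query in a shuffled crowd of irrelevant ones, so an observer logging the batch cannot tell which query reflects the user's real interest.

```python
import random

# Hypothetical decoy vocabulary. A real obfuscation tool would draw
# decoys dynamically (e.g., from news feeds or popular-query lists)
# so they blend in with ordinary traffic.
DECOY_TERMS = [
    "weather tomorrow", "pasta recipes", "used bicycles",
    "movie showtimes", "history of jazz", "garden tools",
    "local news", "stock prices", "hiking trails", "chess openings",
]

def obfuscated_batch(real_query, n_decoys=5):
    """Return the genuine query buried in a shuffled batch of decoys.

    Every query in the batch is equally plausible to an outside
    observer; only the user knows which one was real.
    """
    decoys = random.sample(DECOY_TERMS, n_decoys)
    batch = decoys + [real_query]
    random.shuffle(batch)
    return batch

batch = obfuscated_batch("symptoms of flu")
print(len(batch))                   # 6 queries: 5 decoys plus the real one
print("symptoms of flu" in batch)   # True: the real query is in the crowd
```

The design point matches the book's argument: the user's real signal is not encrypted or withheld but drowned in ambiguous noise, raising the cost of surveillance rather than preventing it outright.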

Obfuscation is attractive because it offers to empower individuals. It is a chance for people to strike back against forces that have the ability and incentive to exploit informational power for surveillance and data collection, whether government law enforcement agencies or Internet tracking and marketing companies. It carves out spaces for privacy against the powerful--a digital treehouse, French Resistance hideout, or Dagobah swamp. Obfuscation is a "weapon of the weak," offering the romantic promise of restoring some of the digital age's power imbalances in favor of the plucky underdog. (8) Obfuscation is appealing, even seductive, but we must ultimately put it in context. Obfuscation is a powerful idea, but as Brunton and Nissenbaum are careful to admit, it is only part of the larger privacy puzzle. (9)

Even with this caveat, obfuscation seems ill suited to be the stuff of revolutions, because privacy built on obfuscation can be at most a second-best kind of privacy. Instead of a first-best privacy in which rules and design ensure safe, sustainable processing of personal data, and personal control is properly treated as a scarce resource, obfuscation offers only the kind of privacy that requires the disempowered to grab it for themselves. As such, it falls into the all-too-common trap of thinking about privacy in primarily individualistic terms, leveraging the weak power of individuals rather than the strong power of law and society. It reinforces the standard narrative of privacy that emphasizes control, choice, and privacy self-management above all else--a narrative that is likely doomed to failure if we continue to accept it. (10)

This reinforcement of the default story can be a serious problem. How we think about legal problems matters a great deal, especially in areas like privacy where technologies, economics, and social norms are in flux. The frames and metaphors we use to describe issues like privacy are essential because they allow us to understand or confuse issues, problems, and potential solutions. (11)

Brunton and Nissenbaum are careful to position obfuscation as a realistic, affordable, and reliably good enough tactic to protect our privacy. This is a real and important contribution. A healthy dose of pragmatism regarding how to preserve our privacy is welcome in the modern climate, in which the utopian dreams of some global regulators can sometimes create irrational and ineffective obligations regarding data. Consider, for example, the implications of a broad reading of the European "Right to Be Forgotten," which is sometimes described as creating an internet that could be edited like Wikipedia by individuals who do not like the facts reported about them in newspapers. (12) All too often, a privacy policy like this can make the perfect the enemy of the good by seeking to outright prevent or control data collection or surveillance, rather than to mitigate problems through regulations designed to serve human ends. More nuanced understandings of privacy are necessary to temper overambitious regulations that fetishize consent in ways that elevate form over function. Society's adoption of a more pragmatic approach to privacy can also ease the pressure on regulators to adopt draconian privacy rules such as data localization laws, which can provide cover for countries seeking to preserve their own economic interests while providing few real benefits for citizens. (13) Brunton and Nissenbaum show that sometimes a bit of pragmatic privacy can be enough to do what is needed.

More fundamentally, however, pragmatism will not be enough if the conceptual foundation for protecting our privacy is deficient. In talking about the foundation for a privacy revolution, we can do better than making incremental improvements to the standard story of a highly individualistic, atomistic privacy. We must think about privacy instead as the rules which govern personal information and take into account more complex social contexts, the increasing importance of information relationships in the digital age, and our need to rely on (and share information with) other people and institutions to live our lives. (14) Information relationships are relationships in which information is shared in trust and in which the rules governing the information sharing create value and deepen those relationships over time. If privacy is increasingly about these information relationships, it is also increasingly about the trust that is necessary for them to thrive, whether those relationships are with other humans, governments, or corporations. Trust is particularly important for the large tech companies with which we increasingly share vast amounts of often-intimate data. For instance, the battles that Apple and Microsoft...
