Untangling Privacy: Losses Versus Violations

Author: Jeffrey M. Skopek
Position: University of Cambridge Faculty of Law. J.D., Harvard Law School; Ph.D. (History and Philosophy of Science), University of Cambridge; A.B., Stanford University
Pages: 2169-2231
Untangling Privacy:
Losses Versus Violations
Jeffrey M. Skopek*
ABSTRACT: Increasingly powerful data mining and analysis technologies
are being used to learn information and make decisions about people across
all areas of life—ranging from employment and policing, to housing and
health insurance—and it is widely thought that the key problems with this are
privacy-related. There is also an emerging consensus in the literature that
privacy rights lack a unified core. This Article demonstrates that these are
both mistaken conclusions that derive from the conflation of privacy losses
and violations, and it develops a theory of privacy that untangles these
misunderstood concepts at the heart of privacy law. In clarifying the
outcome-based criteria for privacy losses and their relationship with the
path-based criteria for privacy violations, this theory provides value across
two domains.
First, regarding the coherence of the law, it demonstrates that a unified theory
of privacy rights is possible despite significant disagreement about their
content. Second, regarding the law’s content, it challenges orthodox views
about how the aggregation, use, and inference of personal information violate
privacy rights.
*
University of Cambridge Faculty of Law. J.D., Harvard Law School; Ph.D. (History and
Philosophy of Science), University of Cambridge; A.B., Stanford University. For helpful
conversations and comments at various stages of this project, I wish to thank Jennifer Anderson,
Lionel Bently, Glenn Cohen, David Erdos, Urs Gasser, Stephen John, Greg Keating, Kathy
Liddell, Tim Lewens, John Murphy, Nicholson Price, and everyone who provided comments
when I presented this in workshops at the Department of History and Philosophy of Science and
the Centre for Intellectual Property and Information Law at the University of Cambridge; the
Faculty of Law at Academia Sinica, Taiwan; and at the following conferences: The Methodology
and Ethics of Targeting (University of Cambridge); Privacy, Data Protection and Data-Sharing
(University of Hong Kong); Legal Dimensions of Big Data in the Health and Life Sciences
(University of Copenhagen); Policy-Making in the Big Data Era (University of Cambridge);
Biodata World Congress 2016 (Wellcome Genome Campus); and Big Data, Health Law, and
Bioethics (Harvard Law School).
2170 IOWA LAW REVIEW [Vol. 105:2169
I.INTRODUCTION ........................................................................... 2171
II.A TAXONOMY OF PRIVACY SCHOLARSHIP .................................... 2176
A.NORMATIVE ACCOUNTS ......................................................... 2176
1.The Interests that Privacy Protects ............................. 2176
2.The Rights that Arise from Privacy Interests ............. 2177
3.The Domain of Privacy Rights .................................... 2179
B.DESCRIPTIVE ACCOUNTS ........................................................ 2180
III.CONFLATION ERRORS ................................................................. 2181
A.MISTARGETED CRITIQUE ....................................................... 2182
B.MISGUIDED SKEPTICISM ......................................................... 2185
1.Denying Coherence .................................................... 2185
2.Regrounding Coherence ............................................ 2187
IV. PRIVACY LOSSES .......................................................................... 2189
A.ACCESS ................................................................................. 2189
1.The Accessibility Objection ........................................ 2189
2.The Control Objection ............................................... 2193
3.The Automation Objection ........................................ 2196
B.EPISTEMIC MERIT ................................................................. 2199
1.Theories of Knowledge ............................................... 2200
2.Epistemic Warrant and Privacy .................................. 2203
C.TRUTH ................................................................................. 2206
V.PRIVACY VIOLATIONS .................................................................. 2209
A.THE PATH-BASED ELEMENT ................................................... 2210
B.DATA AGGREGATION AND USE ............................................... 2213
1.No Right Against Aggregation ................................... 2213
2.No Right Against Unconsented Use .......................... 2219
C.INFERENCES OF PERSONAL INFORMATION ................................ 2223
1.Fourth Amendment Confusion .................................. 2224
2.Problems with Restricting Inferences ........................ 2229
VI.CONCLUSION .............................................................................. 2231
I. INTRODUCTION
It is widely thought that the core problems posed by new technologies of
personal data mining and analysis, as well as their solutions, can be explained
in terms of privacy. Take, for example, the uses of personal data in these cases:
• an employer rejects a job applicant on the basis of a health trait
  inferred from non-health data in his application;
• a landlord screens out applicants on the basis of a proxy for
  religion;
• the police aggregate data about a person’s public movements,
  thereby discovering the person’s sexual and political orientations;
• an internet platform infers private facts about a person from his
  browsing history and uses this to tailor content;
• a government agency makes a decision about an individual
  entitlement on the basis of an algorithmic assessment that it
  cannot explain.
These and other related uses of personal data are widely seen as violating
privacy rights.1 This is often a mistaken diagnosis, however, which arises from
a failure to differentiate between privacy losses and privacy violations. To
understand and address the actual threats posed by new ways of accessing and
using personal data, it is necessary to step back and clarify what privacy
is—and what it is not.
For as long as privacy has been the subject of academic study, privacy
scholars have highlighted that it is an ill-defined concept,2 and for as long as
they have tried to clarify it, their definitions have been rejected by others as
being too broad, too narrow, or both.3 In light of this history, Dan Solove has
championed the growing view that we should abandon our attempt “to locate
1. See, e.g., Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to
Redress Predictive Privacy Harms, 55 B.C. L. REV. 93, 95–106 (2014); Daniel J. Solove, A Taxonomy
of Privacy, 154 U. PA. L. REV. 477, 505–16, 518–21 (2006) [hereinafter Solove, A Taxonomy of
Privacy].
2. See, e.g., DANIEL J. SOLOVE, UNDERSTANDING PRIVACY 1 (2008) (“Privacy . . . is a concept
in disarray. Nobody can articulate what it means.”); Tom Gerety, Redefining Privacy, 12 HARV.
C.R.-C.L. L. REV. 233, 233 (1977) (“Privacy is a legal wall badly in need of mending.”); Jeffery L.
Johnson, Privacy and the Judgment of Others, 23 J. VALUE INQUIRY 157, 157 (1989) (comparing the
concept of privacy to “a haystack in a hurricane”).
3. In 1978, David O’Brien concluded that the unitary definitions of privacy that had been
developed by others were either “imprecise, or too broad, or too narrow.” David M. O’Brien,
Privacy and the Right of Access: Purposes and Paradoxes of Information Control, 30 ADMIN. L. REV. 45,
62 (1978). Nearly 25 years later, Dan Solove reached the same conclusion: “The most prevalent
problem with the conceptions is that they are either too narrow or too broad. . . . Often, the same
conceptions can suffer from being both too narrow and too broad.” Daniel J. Solove,
Conceptualizing Privacy, 90 CALIF. L. REV. 1087, 1094 (2002) [hereinafter Solove, Conceptualizing
Privacy].
