PRIVACY LAW'S FALSE PROMISE.

Author: Waldman, Ari Ezra
ABSTRACT

Privacy laws have never seemed stronger. New international, national, state, and local laws have been passed with the promise of greater protection for consumers. Courts across the globe are reclaiming the law's power to limit collection of our data. And yet, our privacy seems more in danger now than ever, with frequent admissions of nefarious data use practices from social media, mobile apps, and e-commerce websites, among others. Why are privacy laws, seemingly more comprehensive than ever, not working to protect our privacy? This Article explains.

Based on original primary source research--interviews with engineers, privacy professionals, and vendor executives; product demonstrations; webinars, blogs, and industry literature; and more--this Article argues that privacy law is failing to deliver its promised protections because it is undergoing a process of legal endogeneity: mere symbols of compliance are standing in for real privacy protections. Toothless trainings, audits, and paper trails, among other symbols, are being mistaken for actual adherence to privacy law, which has the effect of undermining the promise of greater privacy protection for consumers.

INTRODUCTION
I. THE SOCIAL PRACTICE OF PRIVACY LAW
   A. The Legal Experts
   B. Shifting Privacy Responsibilities
II. UNDERMINING PRIVACY LAW
   A. Legal Endogeneity
   B. Legal Endogeneity in Privacy Law
      1. Ambiguity and Process in Privacy Law
      2. Framing Corporate Obligations Narrowly in Terms of Risk Avoidance
      3. Symbols of Compliance
      4. Managerialization of Privacy Law
      5. Managerialization and the Perception of Compliance
      6. Deference to Symbols in Privacy Law
   C. The Sociopolitical Narrative of Symbolic Privacy Compliance
III. RECLAIMING PRIVACY LAW'S PROMISE
   A. Law Reform
   B. Rule-Making and Guidance
   C. Changes at the FTC
   D. Empowering Individuals
   E. Compliance Professionals
CONCLUSION

INTRODUCTION

The people we trust with our data are putting our privacy at risk. Facebook has long been cavalier about protecting personal information from third parties. (1) Mobile app platforms routinely sweep in user data merely because they can. (2) Manufacturers of toasters, (3) toothbrushes, (4) and sex toys (5) are wiring up everything to the Internet of Things, tracking intimate behaviors while giving hackers countless opportunities for mischief. (6) Facial recognition technology proliferates despite its Orwellian dangers. (7) Even academic researchers are mining intimate data without our consent. (8) Our privacy is in danger. And the laws that are supposed to protect us do not seem to be working. This Article explains why.

Privacy law--a combination of statutes, constitutional norms, regulatory orders, and court decisions--has never seemed stronger. The European Union's General Data Protection Regulation (GDPR) (9) has been called "comprehensive" (10) and "one of the strictest privacy laws in the world." (11) California's Consumer Privacy Act (CCPA) (12) stakes out similar ground. (13) The Federal Trade Commission's (FTC's) broad regulatory arsenal is putting limits on the collection, use, and manipulation of personal information. (14) The U.S. Supreme Court has started to reclaim the Fourth Amendment's historical commitment to curtailing pervasive police surveillance by requiring warrants for cell-site location data. (15) And the E.U. Court of Justice has challenged the cross-border transfer of European citizens' data, signaling that American companies need to do far more to protect personal information. (16)

This seems remarkably comprehensive. But the law's veneer of protection is hiding the fact that it is built on a house of cards. Privacy law is failing to deliver its promised protections in part because the corporate practice of privacy reconceptualizes adherence to privacy law as a compliance, rather than a substantive, task. Corporate privacy practices today are, to use Julie Cohen's term, managerial. (17) They prioritize innovation over regulation, efficiency over social welfare, and paperwork over substance. They also rely on new technologies to automate legal decisions. This Article provides the first picture of this growing privacy compliance market. Based on original primary source research into the ecosystem of privacy compliance, I argue that privacy law is experiencing a process of legal endogeneity: mere symbols of compliance are standing in for real privacy protections.

This development is new for privacy, but not new for the law. Legal endogeneity, as theorized by the socio-legal scholar Lauren Edelman, (18) describes how the law, rather than constraining or guiding the behavior of regulated entities, is actually shaped by ideas emerging from the space the law seeks to regulate. (19) It occurs when compliance professionals on the ground have significant power to define what the law means in practice. When given that opportunity, compliance professionals often frame the law in accordance with managerial values like operational efficiency and reducing corporate risk rather than the substantive goals the law is meant to achieve, like consumer protection or equality. This opens the door for companies to create structures, policies, and protocols that comply with the law in name only. (20) As these symbolic structures become more common, judges and policymakers defer to them as paradigms of best practices or as evidence for an affirmative defense or safe harbor, mistaking mere symbols of compliance for adherence to legal mandates. (21) When this happens, law fails to achieve substantive goals because the compliance metric--the adoption of symbols, processes, procedures, and policies within a corporate environment--may be orthogonal to actual progress. Edelman discussed legal endogeneity in the context of race and sex discrimination in the workplace, where the equality goals of Title VII of the Civil Rights Act were being frustrated by the ineffectual trainings, toothless policies, checklists, and disempowered diversity offices that compliance professionals created on the ground. (22) As this Article shows, the problem is far more pervasive than even Edelman suggested.

I present original research showing that privacy standards are being co-opted into corporate compliance structures that provide little to no protection. Each of the stages of legal endogeneity that Edelman noted is evident. Some of privacy law's most important tools--including privacy by design, consent requirements, and FTC consent decrees--are so unclear that professionals on the ground have wide latitude to frame the law's requirements, kicking endogeneity into high gear. Where rules are clear, they are so process-oriented that technological tools can create paper trails that may take the place of actual adherence to the law. And because those determining privacy law's meaning often reflect corporate or managerial--rather than consumer--interests, consumers more often than not lose out.

Scholars have documented the role that chief privacy officers (CPOs) (23) and engineers (24) play in implementing privacy law. But they are not alone. (25) Compliance professionals, marketing officers, outside auditors, human resource experts, and in-house and firm lawyers, just to name a few, "managerialize" privacy law, bringing in and prioritizing neoliberal values of efficiency and innovation in the implementation of privacy law. (26) They help shift the locus of legal decision-making from the legislature to the C-suite, think about privacy in managerial terms, and often create symbolic structures of compliance--from paper trails to formalistic but insubstantial privacy checklists--with the goal of minimizing the risk of privacy litigation, investigation, and exogenous shocks, not of enhancing privacy protection for consumers.

There are, indeed, many privacy professionals working hard to protect consumer privacy. But they are pushing against a powerful tide. If left unabated, privacy's managerialization will have profound and troubling implications for privacy law, the technology industry, users, and society. As more technology companies paint creative pictures of their legal compliance, lawyers and judges become more likely to defer to the toothless structures companies create by either accepting them as evidence of substantive adherence to the law (27) or actually incorporating them into statutes, thereby undermining the capacity for law to achieve more robust privacy protections for users. (28) This does real damage to our quest for more privacy.

It also undermines the rule of law. The rise of merely symbolic structures neuters the ability of legislation to enact social policy: why pass a law to achieve positive social change if its goals are going to be frustrated in practice? Moreover, as the locus of legal decision-making shifts further away from policymakers to corporate managers, the substantive and procedural protections in the laws on the books may dissipate. (29)

This is a critical moment in the fight against the false hegemony of symbolic compliance in privacy law. Laws like the GDPR and the CCPA are still new, the FTC's agile approach to consumer protection can be redirected away from blind deference to corporate structures, and consumers have a chance to make their collective voices heard. We can still reverse course. There are roles for compliance and compliance professionals to play, but the use of process to undermine substance is not one of them. All levels of the consumer privacy ecosystem--from lawmakers to civil society to academics--can aid in this effort. Crafting an approach to privacy law that advances both corporate and consumer interests will require understanding how we got here; how lawyers, consultants, and technology vendors can do better; and how the social process of law can honestly translate privacy's laws on the books to real privacy protections on the ground. These are the goals of this Article.

Part I pushes back against the temptation to talk...
