Author: Fowler, Leah R.
Subject: Health apps regulations

CONTENTS

INTRODUCTION
I. PROTECTING PRIVACY IN DIGITAL HEALTH TECH
   A. Health Apps
      1. Examples of Health Apps
         a. Femtech Apps
         b. Mental Health Apps
         c. Genetics Apps
      2. Benefits of Health Apps
   B. Market for Privacy in Digital Health Tech
      1. Advertisements as Proxies for Consumer Preferences
      2. Unilateral Amendment Clauses as Market Failure
   C. Ubiquity of Unilateral Amendment Clauses
II. REGULATING DIGITAL HEALTH TECH
   A. Current Law
      1. Health Law and Regulation
         a. HIPAA Privacy Rule
         b. FDA Oversight
      2. Contract Law
         a. Illusory Promises
         b. Unconscionability
         c. The Preexisting Duty Rule
         d. Promissory Fraud
      3. Consumer Law
         a. FTC Oversight
         b. State Data Protection Legislation
   B. Critiques of Unilateral Amendments
      1. Generally
         a. Suboptimal Consumer Decisionmaking
         b. Incomplete Risk Information
         c. Switching Costs
         d. Contract Distancing and Lack of Notice
      2. In Digital Health Tech
         a. Incorrect Assumptions About Medical Privacy
         b. Heightened Switching Costs
   C. A Brief Defense of Unilateral Amendments
III. IMPROVING DIGITAL HEALTH TECH
   A. Legislative Solutions
      1. Federal Data Protection Legislation
      2. Objections and Responses
   B. Regulatory Solutions
      1. Increased Federal Trade Commission Oversight
      2. Objections and Responses
   C. Judicial Solutions
      1. Enhanced Duty of Good Faith
      2. Objections and Responses
CONCLUSION
APPENDIX
   A. Methodology
   B. List of Apps Surveyed by Type
      1. Femtech Apps
      2. Mental Health Apps
      3. Genetics Apps

INTRODUCTION

In January 2021, the Federal Trade Commission (FTC) settled a complaint against the period- and fertility-tracking app Flo. (1) Flo, like many similar technologies in the booming digital health tech industry, (2) collects and analyzes its users' data to provide them with information and recommendations about their personal health. (3) To gain consumers' trust, Flo assured its users that it would keep their highly intimate data--information about menstruation, mood, sex drive, and pregnancy symptoms--safe, away from the prying eyes of third parties. (4) Yet an exposé in The Wall Street Journal revealed that the company had failed to keep its promises to consumers. (5) Flo was sharing its users' personal and identifiable data with numerous analytical and marketing firms without consumers' knowledge or consent. For example, Facebook received information that individual consumers were either having their periods or trying to get pregnant, allowing the social media platform to target its advertising to those users. (6) Consumers reported feeling "outraged," "victimized," and "violated." (7)

It is well known that the average consumer will rarely read the terms of service (ToS) or privacy policies when selecting a product. (8) By contrast, in the context of digital health tech--where the data at stake is often sensitive--users may be more inclined to actually shop for privacy and to choose a product based on a company's purported terms. Health app providers spend significant advertising dollars proclaiming their products' commitments to privacy and data security to attract consumers. (9) These advertisements create a market for privacy in digital health tech, with users selecting apps based at least in part on the companies' promises regarding privacy and data security. The potential reliance on a company's privacy terms made Flo's transgression even more troubling. As one commentator on the recent controversy explained: "It's become even more cynical than just 'buyer beware'.... You did your homework. You read this app's privacy policy. You thought you were putting your data in a trusted place. And turns out that the company didn't take its obligation seriously." (10) The point is simple: if companies don't keep their promises, the data of even informed, responsible users will not be safe.

The FTC's responsibilities include policing "unfair and deceptive acts or practices" that harm consumers. (11) It filed a complaint against Flo because of the company's repeated deceptive statements to its users about their privacy. (12) As noted, Flo ultimately settled with the FTC. (13) As part of the settlement, the company agreed to a review of its privacy practices and vowed to obtain users' consent before sharing their data in the future. (14) The FTC had jurisdiction over Flo's actions because the company had effectively lied to its users. It said one thing and did another. But what if a health app could go back on its promises to consumers without violating its ToS or privacy policies?

Remarkably, Flo could have done just that. If the company had simply changed its ToS or privacy policy, it could have shared its customers' data without lying to them at all. Flo is one of the many companies that include unilateral amendment clauses in their agreements with consumers. (15) Under these provisions, companies can alter terms, sometimes without even notifying users, let alone asking them for permission. And unilateral amendments are by and large legal. More often than not, courts are willing to enforce these one-sided changes. (16) If Flo had simply changed its terms, the company might have been able to avoid running afoul of the FTC's prohibitions on deceptive trade practices. (17) In fact, consumers currently have very little legal recourse for challenging harmful unilateral amendments. The result is that even users who actively read ToS and privacy policies when selecting a product remain vulnerable to changes that happen without their knowledge and could compromise their privacy. Thus, unilateral amendment clauses undermine the market for privacy that exists in digital health tech. (18)

Scholars have long argued that one-sided changes to contract terms are both inefficient and unfair. (19) While unilateral amendment provisions may be problematic in a variety of contexts, (20) we maintain that they are especially troubling in the context of health apps. Flo is hardly alone in reserving the right to unilaterally amend its agreements. For this Article, we surveyed the ToS and privacy policies of thirty digital health tech companies. Nearly all of the companies reserved the right to change their ToS, and all of the companies reserved the right to change their privacy policies. (21) While most apps promised to at least notify users when modifications occurred, some placed the responsibility for staying up to date on the individual consumers themselves. (22) And, because courts enforce unilateral amendments, the only choice for a savvy user who wishes to challenge a harmful unilateral amendment is to stop using the product. In the context of health apps, terminating use may mean abandoning weeks, months, or even years of potentially valuable personal data. Given the high stakes of digital health tech, consumers need stronger legal protections against potentially harmful one-sided changes.

This Article focuses exclusively on direct-to-consumer health apps. However, what we describe here provides only a snapshot of a much larger problem. In addition to buying products on the consumer market, individuals may also download health apps through their healthcare providers and their employers. These technologies have their own separate regulatory structures and raise their own unique sets of legal concerns. (23) Moreover, the issues that we identify are not confined to digital health tech. Consumers of other technologies are likewise at risk. Navigation apps, budgeting apps, and dating apps all collect sensitive, identifiable, personal data that many users would prefer to keep private. (24) Thus, while our focus is direct-to-consumer health apps, the legislative, regulatory, and judicial solutions that we propose could benefit other kinds of users subject to unwanted one-sided changes.

This Article proceeds in three parts. Part I offers an introduction to health apps and argues that the proliferation of unilateral amendment clauses in that industry leads to market failures. Part II then turns to the current law governing one-sided changes and the critiques of unilateral amendment provisions, both generally and in the context of digital health tech. We also note the limited benefits of one-sided changes. In Part III, we discuss legislative, regulatory, and judicial innovations to better protect all consumers, not just the users of digital health tech. We focus on how these various kinds of interventions can give companies the flexibility that they need while ensuring that consumers have the legal protections that they deserve.


I. PROTECTING PRIVACY IN DIGITAL HEALTH TECH

Privacy may be of particular concern to users of digital health tech. Part I begins with a brief introduction to health apps, identifies the privacy issues that they may raise, and explores their potential benefits for consumers. We then turn to our original research assessing the advertising, ToS, and privacy policies of thirty health apps. We conclude that health app consumers may choose a particular service based on a company's promises to protect user data. Despite this reliance by consumers, almost every health app that we surveyed reserves the right to change its ToS and privacy policy without consent and--in some cases--without clear notice. We assert that this potential for one-sided changes distorts the market for privacy in digital health tech, leaving consumers of health apps and their most private data vulnerable.

A. Health Apps

Health apps collect and warehouse large amounts of highly sensitive user data. Consumers of digital health tech therefore have a strong interest in keeping that information private. Despite the potential privacy risks, these technologies offer serious benefits for users, especially in the context of the United States' fragmented healthcare system. This Section introduces three different categories of health apps, considers their accompanying privacy...

