ENJOINING NON-LIABLE PLATFORMS.

Author: Perel, Maayan

TABLE OF CONTENTS

I. INTRODUCTION
II. THEORETICAL FRAMEWORK: PLATFORM LIABILITY
   A. Direct Liability
   B. Secondary Liability and Notice-and-Takedown
      1. The Limitations of Notice and Takedown
      2. Example I: Live Streaming
      3. Example II: Copyright Infringement by Foreign Websites
III. ENFORCEMENT-BASED SPEECH REGULATION BY PLATFORMS
   A. Mandatory Removals
      1. Court Orders Directed at Platforms as Third Parties
      2. Platforms' Enforcement Obligations Under U.S. Law
   B. Voluntary Removals
   C. Content Moderation
      1. Governmental Requests
IV. SPEECH REGULATION BY NON-LIABLE PLATFORMS #1: ENJOINING PLATFORMS AS NON-PARTIES IN CIVIL SUITS
   A. Procedural Due Process
      1. The Barriers
      2. Possible Solutions
   B. Prior Restraint on Speech
      1. The Barrier
      2. Possible Solution
   C. Platforms' Legitimate Economic Interests
      1. The Barrier
      2. Possible Solutions
V. SPEECH REGULATION BY NON-LIABLE PLATFORMS #2: ALLOWING IN-COURT GOVERNMENTAL REMOVAL REQUESTS
   A. Prior Restraint
   B. The Takings Clause
VI. BALANCING THE TRADEOFFS
   A. Judicial Oversight versus Innovation
   B. The Rule of Law versus Flexibility
   C. Public Safety versus The Free Flow of Information
VII. CONCLUSION

I. INTRODUCTION

The proliferation of online content involves dozens of platforms carrying material to countless recipients. (1) Platforms' role in spreading content raises serious concerns regarding their responsibility to restrict harmful speech. (2) Governments, civil society groups, and activists around the globe contend that platforms should do more to protect our online sphere from poisonous content, such as hate speech and copyright infringement. (3) Questions of platforms' responsibility and liability are at the center of this discourse. Specifically, the safe harbor accorded to platforms under [section] 230 of the Communications Decency Act ("CDA") is under fire. (4)

In the United States, platforms enjoy strong and rather stable immunities from liability for acts of infringement committed by their users. The CDA exempts Internet Service Providers ("ISPs") and some other online intermediaries from certain kinds of third-party liability by providing that they are not "the publisher or speaker of any information provided by another information content provider." (5) Similarly, the Digital Millennium Copyright Act of 1998 ("DMCA") bars indirect copyright liability for ISPs acting only as a conduit and limits liability for web hosting and other service providers if they follow a prescribed notice-and-takedown procedure. (6) Similar immunities were also enacted under internet gambling and online pharmacy laws. (7)

While some legal scholars are skeptical that making platforms liable for harmful activity on their services would be the right cure for poisonous content, others argue that the prevailing interpretation of [section] 230 immunity is too broad, leaving "victims of online abuse with no leverage against site operators whose business models facilitate abuse." (8)

Congress has also attempted to address harmful online content through the lens of platform liability. It recently narrowed [section] 230's immunity when, in 2018, it passed the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 ("FOSTA"), "designed to attack the online promotion of sex trafficking victims, in part, by reducing [section] 230's scope." (9) Additionally, in October 2019 the House Subcommittee on Communications and Technology and the Subcommittee on Consumer Protection and Commerce held a hearing titled "Fostering a Healthier Internet to Protect Consumers." The main focus of the hearing was [section] 230 and whether it should be amended in light of the scope of harmful online activity that platforms have failed to address. (10)

Noticeably missing from the current discourse over harmful online content in the U.S. is the possibility of harnessing platforms' potential enforcement capabilities, regardless of questions of liability, to reduce some of the harms caused by online speech. This is mainly because in the U.S., the Free Speech Clause of the First Amendment restricts government regulation of private speech. (11) Despite concerns that "the real threat to free speech today comes from private entities such as Internet service providers, not from the Government," interfering with the editorial discretion of platforms is seen as a violation of platforms' First Amendment rights. (12)

However, platforms are a natural point of control over the substance of online communications and hence are capable of preventing the dissemination of unlawful content. (13) They have the means "to intervene in the circulation of abhorrent content and at the moment of abhorrent behavior." (14) In Europe, the engagement of online intermediaries in enforcing the rights of individuals allegedly harmed by online speech has recently shifted to what Martin Husovec terms "accountability without liability." (15) That is, non-liable online platforms in the European Union ("EU") are increasingly forced to assist rightsholders and target speech or speakers that violate their rights, even though these platforms played no unlawful part in disseminating that speech. (16)

In the U.S., however, the role of platforms in addressing illegal content is still defined according to liability theories. Content removals by non-liable platforms are currently conducted mostly on a voluntary basis. (17) Government agents and other authorized reporters can file requests with various platforms to remove allegedly illicit content from their services. (18) Nevertheless, platforms are not legally bound by such removal requests and may decline them in whole or in part. (19) Furthermore, platforms also engage in voluntary content moderation. They may enable or disable access to content by removing or blocking controversial content, or by terminating the accounts of particular speakers. (20) In this respect, they follow their internal policies regarding objectionable content (e.g., community guidelines) to satisfy their users and ensure they spend as much time as possible on their services. (21)

Unless directly, vicariously, or contributorily liable for the harms caused by illicit content, platforms cannot be forced by courts to actively remove it. (22) This current state of affairs, however, completely ignores the natural position of platforms as doormen who govern the free flow of online information. (23) It fails to harness the tremendous power platforms could exercise as authorized law enforcers. As this paper sets forth, this failure is rooted in two main legal barriers. The first is procedural and concerns the restriction on enjoining non-liable third parties. For decades, bedrock rules of equity and due process have shielded non-liable third parties from being enjoined by courts because they are "strangers to the litigation." (24) In Blockowicz v. Williams, (25) the Seventh Circuit ruled that the fact that platforms are "technologically capable of removing" questionable content "does not render [their] failure to do so aiding and abetting," which is what Rule 65 of the Federal Rules of Civil Procedure requires in order to enjoin non-parties. (26) Nevertheless, it is hard to ignore that platforms, albeit not liable, are often effectively the only entities in a position to stop the accelerating harm caused by illegitimate content going viral. (27)

The second legal barrier to making speech regulation by platforms mandatory is the absence of a formal legal procedure that would allow law enforcement agents to seek court orders forcing platforms to remove illegal content. (28) Outside the area of speech regulation, legal procedures that demand action on the part of platforms do exist. The Stored Communications Act ("SCA"), which established a legal procedure for a governmental entity seeking action on the part of a platform, is one example. (29) Nevertheless, within the area of speech regulation, enforcement by non-liable platforms is largely based on out-of-court submissions made directly to the platforms and is therefore mostly voluntary. (30)

This paper advocates making speech regulation by platforms mandatory. It promotes scrutinized removal of illegal content by non-liable platforms, governed by ongoing judicial review. (31) Specifically, this paper focuses on platforms' ability to remove illegal content, rather than on their liability for the proliferation of such content. Accordingly, the paper proposes two legal fixes: first, allowing civil injunctions against non-liable platforms that enable the dissemination of tortious content, and second, establishing an open and transparent statutory procedure that would allow designated law enforcement agents to ask courts to order platforms to remove content proven illegal by clear and convincing evidence.

The discussion proceeds as follows: Part II presents the governing theory of platform liability, which is meant both to prevent and to address actions (or inactions) of platforms that unlawfully contribute to the dissemination of tortious content. Given the broad immunities accorded to platforms under [section] 230 of the CDA, this Part further explains the shortcomings of platform liability in addressing harmful online content. Part III introduces the current enforcement-based regime for speech regulation by platforms. It discusses content removals that are based on governmental removal requests submitted directly to platforms or on content moderation practices, and highlights their voluntary nature. Platforms are not obliged to engage in these regulatory efforts and therefore cannot be relied upon to sufficiently and legitimately address illegal content. This Part concludes that the removal of illicit content by platforms should become mandatory...
