Delete and Repeat: The Problem of Protecting Social Media Users' Free Speech from the Moderation Machine.

Author: Kosakowski, Jacob

"[W]hen men have realized that time has upset many fighting faiths, they may come to believe ... that the ultimate good desired is better reached by free trade in ideas--that the best test of truth is the power of the thought to get itself accepted in the competition of the market, and that truth is the only ground upon which their wishes safely can be carried out. That at any rate is the theory of our Constitution." (1)

  I. INTRODUCTION

    As of January 2021, more than four billion people across the world actively use social media, averaging two hours and twenty-five minutes each day on social media platforms. (2) Facebook, created in 2004 by Mark Zuckerberg, maintains the largest social media platform with about 1.66 billion active users. (3) Twitter, founded by Jack Dorsey in 2006, hosts roughly 152 million active users on its platform. (4) As the economic growth and societal influence of social media continue to flourish, many Americans increasingly rely on platforms like Facebook and Twitter for their daily news, social commentary, and individual expression. (5) With continued growth in user numbers and profits, commentators worry that an ever-increasing concentration of power and control in the hands of social media corporations may threaten free speech on the internet. (6)

    On the surface, social media platforms like Facebook and Twitter appear to promote free speech by allowing individual users to share content on a wide range of personal, social, and political topics. (7) While an individual user's content is often benign, some users will trigger the implicit and explicit content-moderation mechanisms that a particular platform deploys under its rules of private governance. (8) Within the past two years, public awareness of platforms' content-moderation practices has increased through notable censorship complaints, such as Parler's removal from Amazon's cloud-computing services, the congressional big-tech hearings, and Democratic Senator Elizabeth Warren's complaints against Facebook. (9) Looking critically at social media as a mechanism for free speech, it becomes apparent that social media platforms implement policies not to protect individual speech, but to monetize users' attention. (10) Onerous speech regulation contradicts the idea that social media promotes free speech or a "marketplace of ideas," leaving individual users vulnerable to violations of their freedom of speech. (11)

    In the wake of these free speech concerns on social media, many legal writers have written off the First Amendment as a viable means of confronting social media censorship. (12) This notion stems from the idea that private entities are not subject to the constitutional constraints of the First Amendment. (13) Throughout its history, however, the Supreme Court of the United States has applied the concept of a "public forum" to public, semipublic, and private entities to accommodate individuals' free speech rights based on the particular use of the property. (14) Likewise, the Supreme Court has repeatedly reimagined free speech rights in response to ever-changing media such as radio, television, and the internet. (15) While prior Supreme Court precedent oscillated on the First Amendment's application to private entities, the Court's decision in Packingham v. North Carolina (16) might redefine online speech platforms as public forums and revitalize the First Amendment's applicability to social media platforms. (17)

    This Note identifies the various problems surrounding the protection of individual users' First Amendment rights on the internet. (18) In Section II.A, the Note provides a broad overview of the internet's infrastructure and of the protections the Communications Decency Act of 1996 (CDA) affords social media platforms. (19) Section II.B then examines the history of the Supreme Court's public forum doctrine, as well as the relationship between the First Amendment and different types of media. (20) In Part III, the Note analyzes the legal issues that leave internet users' free speech rights unprotected from an internet platform's moderation practices. (21) Finally, Part IV of the Note argues for the development of legal tools and policies that protect individuals from the unprecedented power of social media platforms. (22)

  II. HISTORY

    A. The Internet and Content Moderation

      1. Two Layers, Two Purposes

        Broadly, the internet's infrastructure consists of two layers: a bottom layer where internet service providers operate and a top layer where internet applications run. (23) On the bottom layer, internet service providers operate on an end-to-end principle, carrying and routing users' data from one endpoint to another. (24) Internet Protocol (IP), the "open-standard networking protocol," allows local area networks from all over the world to connect with one another and gives each internet endpoint a unique IP address so that data can be accurately routed between any source and destination. (25) IP is the backbone of the internet's infrastructure because it allows online developers to design and deploy new services and applications without relying on internet service providers to build new physical or virtual infrastructure into the network. (26) On the top layer of the internet, online platforms, applications, and services run independently of the internet's underlying bottom layer. (27)
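
        To make the two-layer distinction concrete, consider the following minimal Python sketch: resolving a hostname to its unique IP address is a bottom-layer function, while the application that then exchanges data over the connection is a top-layer function that never needs to know how intermediate routers carry its packets. The hostname is a placeholder, and the sketch illustrates the end-to-end principle rather than describing any particular platform's architecture.

```python
import socket

# Bottom layer: resolve a hostname to the unique IP address that lets
# routers carry data between endpoints. "example.com" is a placeholder
# host used purely for illustration.
ip_address = socket.gethostbyname("example.com")
print(f"Endpoint IP address: {ip_address}")

# Top layer: the application exchanges data over the connection without
# knowing how the bottom layer routes its packets -- the end-to-end
# principle in miniature.
with socket.create_connection((ip_address, 80), timeout=5) as conn:
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    print(conn.recv(200).decode(errors="replace"))
```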

        Neutrality was long the main characteristic associated with the internet as an open information system. (28) In December 2017, however, the Federal Communications Commission (FCC) expressly repealed the net neutrality rules that required internet service providers to provide equal access to websites across the bottom layer of the internet. (29) Rather than carrying and treating all data equally, irrespective of content or purpose, internet service providers may now block, throttle, or prioritize access to specific platforms on the bottom layer. (30) Meanwhile, applications and services on the internet's top layer encourage content awareness, or the conscious interaction with intelligible words and images on online platforms. (31) Because data surfaces as content in the form of tangible words or images, top-layer applications moderate the content appearing on their platforms and services, both to protect themselves from legal consequences and to protect users from exposure to harmful material. (32) Top-layer applications thus inherently become gatekeepers of information through the moderation of content. (33) While the bottom layer of the internet generally does not interact with individualized content, the top layer is more susceptible to content moderation because internet applications face public blowback for hosting disagreeable content. (34)
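
        The practical difference between a neutral and a non-neutral bottom layer can be sketched in a few lines of Python. The policy table, domain names, and actions below are entirely hypothetical; the point is only that, after the 2017 repeal, nothing technical prevents a provider from treating traffic differently based on its destination.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    destination: str   # the site the user is trying to reach
    size_bytes: int

# Hypothetical, purely illustrative provider policy. Under the pre-2017
# neutrality rules this table would be empty and every packet would
# receive the neutral default below.
POLICY = {
    "partner-video.example": "prioritize",  # paid fast lane
    "rival-video.example": "throttle",      # competitor slowed down
    "disfavored.example": "block",          # unreachable entirely
}

def handle(packet: Packet) -> str:
    """Return the action a non-neutral provider applies to a packet."""
    return POLICY.get(packet.destination, "forward")  # neutral default

for host in ("partner-video.example", "rival-video.example",
             "disfavored.example", "anything-else.example"):
    print(f"{host}: {handle(Packet(host, 1500))}")
```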

      2. Liability Shield or Responsibility Shield?

        On top-layer applications, such as social media platforms, there is a basic need for some degree of content moderation. (35) Moderation consists of structural mechanisms that govern an online community to facilitate collaborative conversation, enhance user cooperation, and prevent abuse. (36) Because social media platforms are privately owned businesses, their owners decide who can moderate content, what content is subject to moderation, and how moderation will occur. (37) Broadly speaking, a social media platform's moderator can support or hide posts, encourage or denounce content, and promote or ban users. (38) A moderator's decision will determine what users see, influence what the online community values, and reinforce the platform's content norms. (39)
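
        The moderation choices described above can be modeled as a simple decision function. Everything in the sketch below -- the disallowed terms, the strike threshold, and the actions -- is invented for illustration; real platforms apply far more elaborate, and often opaque, rule sets.

```python
from enum import Enum

class Action(Enum):
    ALLOW = "allow"      # post is shown normally
    HIDE = "hide"        # post is suppressed but the account remains
    BAN = "ban user"     # account is removed from the platform

# Hypothetical rule set: the disallowed terms and the three-strike
# threshold are placeholders, not any real platform's policy.
DISALLOWED_TERMS = {"spamlink.example", "scam-offer"}

def moderate(post_text: str, prior_strikes: int) -> Action:
    """Apply a platform's (hypothetical) private-governance rules."""
    if any(term in post_text.lower() for term in DISALLOWED_TERMS):
        return Action.BAN if prior_strikes >= 2 else Action.HIDE
    return Action.ALLOW

print(moderate("Check out spamlink.example!", prior_strikes=0))  # Action.HIDE
print(moderate("Check out spamlink.example!", prior_strikes=2))  # Action.BAN
print(moderate("Happy birthday!", prior_strikes=2))              # Action.ALLOW
```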

        In the United States, the CDA protects social media platforms' decisions about what content to moderate. (40) Congress initially added section 230 to the CDA to protect children from viewing inappropriate online material. (41) To that end, section 230 protects internet applications from being treated as the speakers or publishers of user-generated content hosted on their platforms and immunizes them from liability for such content. (42) Likewise, section 230's "Good Samaritan" provision further protects internet applications from liability for the good faith moderation of "otherwise objectionable" content hosted on their services. (43) By immunizing platforms both from claims that they are publishers and from claims arising out of their good faith moderation efforts, Congress eliminated the risk of liability for "mismoderation" and thereby encouraged internet applications to moderate content. (44)

        The CDA provides top-layer applications with a liability shield, which in turn allows those applications wide discretion to moderate online content. (45) As internet applications become increasingly prevalent fixtures of daily life, these companies assume an ever-wider set of responsibilities toward society, ranging from the economic to the moral. (46) With respect to content moderation, however, the CDA shields internet platforms from responsibility no matter how inconsistent or ineffective their moderation practices may be. (47) Thus, even as internet platforms' societal responsibilities grow, laws such as the CDA will continue to privilege liability protection and to shield platforms from accountability for inconsistent or ineffective content moderation. (48)

      3. The Invisible Police Force

        Content moderation is a definitional and indispensable aspect of what internet platforms do. (49) With ever-increasing activity and content online, internet platforms have expanded their use of private algorithmic tools as a means of moderating content. (50) In the context of online content moderation, an algorithmic tool is computer code that operates automatically beneath an internet platform to implicitly influence the types of possible...
