HOW CONTENT MODERATION MAY EXPOSE SOCIAL MEDIA COMPANIES TO GREATER DEFAMATION LIABILITY.

Author: Bone, Tanner

INTRODUCTION

On July 25, 2018, Vice News published an online article discussing allegations from various Twitter users that Twitter was "shadow-banning." (1) The users accused Twitter of secretly removing or hiding the accounts of several prominent, politically conservative users of the platform. (2) Twitter has chalked up the perceived shadow-banning to an algorithmic error, (3) but these allegations and others like them have led to broader conversations about content moderation on some of the world's largest social media platforms. Since Vice's article, accusations of politically motivated content curation have increased in both volume and frequency, including accusations from the President of the United States. (4) Companies like Twitter, Facebook, and YouTube have evolved into popular sources for breaking news and act as a primary means of communication for millions of people. (5) The expanding influence of social media has led some to call these platforms the new "town square." (6) Business executives are using social media to engage with consumers. (7) Political leaders are taking to social media to discuss new initiatives and policies. (8) Athletes and celebrities are using these platforms to provide social commentary and to galvanize supporters and fans. (9) In sum, social media has spread rapidly throughout society and is now present in most Americans' lives in innumerable ways. (10)

While the evidence of shadow-banning in this instance is speculative and contested, (11) these allegations and others have spurred many conversations about social media, content curation, and free speech. More specifically, scholars and politicians are asking how much censorship power is appropriate for these social media companies. (12) In this era of "fake news," "hot takes," and political polarization, in which society seems increasingly reliant on social media, we must reevaluate how these companies regulate the content on their platforms. In particular, this means revisiting Section 230 of the Communications Decency Act (CDA) in order to ensure that these powerful companies foster a vibrant and open marketplace of ideas.

This Note will explain the critical distinction between "publishers" and "platforms," why social media entities are currently considered "platforms," and why the legal system should reevaluate the liability of social media entities based on how they moderate and regulate content. Part I of this Note will discuss the history of the common-law liability of content providers prior to the invention of the internet. It will also explore the history and rationale for enacting Section 230 of the CDA. Part II of this Note will explain the distinction between "publishers" and "platforms" as it relates to defamation liability. Further, it will discuss the rapid growth of social media during the internet age and its impact on communication and the spread of information. It will also discuss the cryptic and often vague algorithmic process that social media companies use to decide which content is visible to users. Part III of this Note will analyze the current liability of social media companies as a "platform" and will discuss the argument that social media is the twenty-first century's "town square." Part IV will explain three key pieces of recently proposed legislation that may affect Section 230 of the CDA. Part V of this Note will explain specific changes that social media companies must make to avoid the enhanced defamation liability of moving from the "platform" category to the "publisher" category. Part VI will discuss a few legislative and executive solutions to allow Section 230 of the CDA to reflect the current internet landscape by focusing on pushing social media companies toward transparent content-moderation practices.

  I. DEFAMATION AND THE CREATION OF THE COMMUNICATIONS DECENCY ACT

    To fully grasp the conversations and debate surrounding the legal implications of content-moderation processes such as "shadow-banning," it is necessary to understand defamation liability as it developed under the common law. It is also critical to understand how this liability developed and changed in order to foster the growth of the internet. This section will discuss defamation liability under the common law through the enactment of the Communications Decency Act.

    A. Defamation Under the Common Law

      Defamation liability arises when three elements are present: (1) there is an unprivileged publication; (2) the publication is false and defamatory; and (3) the publication is actionable irrespective of special harm or is the legal cause of special harm to the other. (13) Any person or entity that publishes a defamatory statement assumes liability equivalent to having been the initial purveyor of the statement. (14) For entities that provide information or content, the assumed liability is categorized into two principal categories: publishers and distributors. (15)

      Publishers, including book publishers and newspapers such as the New York Times, are liable for any content that appears in their publications. (16) The basis for this liability is that these publishers review and curate their content, so they have full knowledge of the material they release to the public. (17) Thus, common-law publishers are held to a higher legal standard because of their subjective editorialization of content. (18) Conversely, distributors (non-digital "platforms") such as libraries, bookstores, and newspaper stands are not held liable for the content they disperse. (19) The rationale is that it is neither feasible nor practical to expect libraries and newspaper stands to know all of the content that they possess or provide to the public because the content is voluminous and comes from a variety of sources. (20) Accordingly, distributors are not subject to defamation liability for the content of the sources they provide. Put in today's terms with real-world examples:

      Typically, publishers are considered to have editorial judgment, while platforms lack it. From this perspective, the Harvard Business Review, The Atlantic, and The New York Times are classic "publishers"--they present highly-curated content, and their editors invest a lot of time in its creation. Google, Facebook, and Twitter are classic "platforms"--they distribute other peoples' content without as much editorial oversight. But these differences are largely cultural. It's not technologically difficult for publishers to add platform-like elements, and vice versa. (21) Under the common law, websites that do not or cannot review all of their content operate as "platforms" or "distributors." (22)

    B. The Rise of the Internet and the Origin of Section 230 of the CDA

      Before 1996, the aforementioned common-law distinctions provided the basis for liability for internet content providers. This common-law standard--as applied to the internet--was called into question after Stratton Oakmont, Inc. v. Prodigy Services Co. (23) Prodigy was an internet company that hosted online bulletin boards with over two million subscribers and sixty thousand posts per day. (24) Prodigy oversaw the content of the bulletin boards and occasionally removed posts that were "offensive" or "in bad taste." (25) The court held that Prodigy's practice of moderating some of the content on its platform required that it assume liability for all of the content on its platform. (26) In other words, to be a platform and thus avoid liability under the common-law standard of defamation, there could be no content moderation whatsoever.

      The Stratton Oakmont decision in 1995 initiated a swift legislative response. In an attempt to "regulate obscenity and indecency online," Congress passed the Communications Decency Act just a year later. (27) In response to Stratton Oakmont, Congress addressed the question of internet liability through a proposed amendment to the CDA. (28)

      The amendment was intended to encourage free speech online by shielding interactive computer services from most common-law defamation liability. (29) The amendment was formalized as Section 230 of the CDA, which effectively grants internet service providers immunity for information provided by a third party, thus treating them differently than publishers in print. (30) Specifically, the language of Section 230(c)(1) stipulates that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (31) The Fourth Circuit highlighted this intent in Zeran v. America Online, concluding that Congress intended Section 230 to treat "distributors" differently than "publishers" under the law. (32) Other circuits have consistently agreed with the Zeran analysis, refusing to hold internet entities liable for certain content published by third parties. (33)

      Congress enacted Section 230 at the dawn of the internet age, (34) and the statute has had a profound effect on the development of free speech on the internet over the past quarter of a century. (35) Section 230 has been referred to as one of the most critical pieces of legislation impacting the "freedom of expression." (36) This is unsurprising as its primary intent was to foster creativity and a competitive market in the internet space. (37) Section 230 defines both interactive computer services (38) and information content providers. (39) The difference between these terms plays a meaningful role in assessing defamation liability. (40)

      An interactive computer service is "any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions." (41) Conversely, information content providers may be, and often are, third-party users. (42) Section 230 effectively grants...
