Section 230 of the Communications Decency Act of 1996: The Antiquated Law in Need of Reform.

Author: Mediavilla, Katherine
  I. INTRODUCTION

    Imagine a world without the information-sharing giants Facebook, YouTube, Instagram, TikTok, Twitter, and Reddit. (1) It is nearly impossible to go a day without some exposure to content created, published, and shared on such platforms. (2) A world without these platforms would bear a striking resemblance to the world in 1996, when Congress passed the Communications Decency Act ("CDA"). (3) Section 230 of the CDA, often referred to as "the twenty-six words that created the internet," states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (4) In short, [section] 230 is a federal law that prevents websites, blogs, forums, and other sources of online information from being held liable for their users' speech. (5) The legal protections established in [section] 230 of the CDA are unique to the United States--European nations, Canada, Japan, and the vast majority of other countries do not provide such safeguards to internet companies. (6) Yet despite high levels of internet access in those countries, the largest and most prominent online services are located in the United States. (7) Section 230 makes the United States a safe haven for internet providers that wish to give controversial or politicized speech a legal platform, in an environment favorable to free expression. (8)

    While proponents argue that [section] 230 fosters free speech, this privilege comes with serious consequences and gives internet platforms the power to cause social devastation. (9) Since its inception, [section] 230's protections have provided an outlet for hate speech, election interference, "fake news," and even content that is otherwise illegal in the United States--namely terrorism and sex crimes. (10)

    The internet's prevalence in everyday life has forced lawmakers to confront previously unfathomable questions about accountability on internet platforms. (11) Social media channels have become avenues for users to share and receive daily news. (12) Today, in the midst of a pandemic and warfare, the spread of misinformation from ill-informed sources may endanger lives, incite hate, and perpetuate violence. (13) As individuals increasingly obtain news from online sources of unknown or nonexistent credibility, the heightened risk of public disinformation becomes especially problematic, particularly when individuals filter their social media feeds according to their prior beliefs. (14)

    At what point should internet platforms that are protected by an antiquated law be held to a higher standard? Should Facebook be held accountable, to any extent, for the Capitol riots, much of the planning for which occurred on its platform? (15) To what extent should Twitter face liability for enabling the recruitment of terrorists? (16) Should adult sites such as Pornhub be held responsible for facilitating the sexual exploitation of children? (17) Questions like these, which were completely unanticipated at the time of [section] 230's adoption, are now the focus of many major internet-related discussions among lawmakers. Part II of this Note addresses the legal background and origins of the CDA. Part III highlights recent developments involving the CDA, including the law's legal protections, its proponents, and proposed methods of CDA reform. Finally, Part IV discusses potential avenues for change.

  II. LEGAL BACKGROUND

    In 1996, as part of the Telecommunications Act, Congress enacted the CDA to address whether internet service providers should be treated as publishers or distributors of the content created by their users. (18) Congress's goal in enacting the CDA was to regulate access to obscenity and indecency online, particularly by children. (19) The CDA made it illegal to "knowingly send to or show minors obscene or indecent content online." (20) Amid concerns about free speech and the availability of online platforms, Congress amended the CDA to add [section] 230. (21)

    Since its addition, [section] 230 has been amended twice: (1) to require that interactive computer services notify customers about parental control protections, and (2) to except from its application certain criminal and civil cases related to sex trafficking or prostitution. (22) In its current form, [section] 230 provides that interactive computer service providers cannot be held liable for any action taken in good faith to restrict access to material that the provider considers to be "obscene, lewd, . . . excessively violent, . . . or otherwise objectionable." (23) Section 230 encourages providers to self-regulate the dissemination of offensive material over their services and allows providers to establish standards of decency without risking liability. (24)

    Under [section] 230, Congress found:

    (1) The rapidly developing array of Internet . . . represent[s] an extraordinary advance in the availability of educational and informational resources to our citizens;

    (2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops;

    (3) The Internet . . . offer[s] a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity;

    (4) The Internet . . . [has] flourished, to the benefit of all Americans, with a minimum of government regulation;

    (5) . . . Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services. (25)

    Federal courts have interpreted [section] 230 to create expansive immunity against claims based on third-party online content. (26) As a result, internet platforms and their users frequently turn to [section] 230 and its protections to avoid liability in both federal and state litigation. (27) However, commentators and jurists have recently expressed concern that the broad immunity is beyond the intended scope of the law. (28) At the heart of [section] 230 is subsection (c), also known as the "Good Samaritan" provision, which established the elements for immunity under the CDA. (29) This section's language indicates that Congress did not intend to restrict [section] 230's immunity privileges only to defamation claims, but rather to extend immunity to civil claims of all kinds. (30) Subsection (c), which addresses the protections for "Good Samaritan" blocking and screening of offensive material, states:

    (1) TREATMENT OF PUBLISHER OR SPEAKER.--No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

    (2) CIVIL LIABILITY.--No provider or user of an interactive computer service shall be held liable on account of--

    (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

    (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [subparagraph (A)]. (31)

    In Barnes v. Yahoo!, Inc., the Ninth Circuit created a three-prong test to determine [section] 230 immunity. (32) Under this test, "[i]mmunity from liability exists for '(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.'" (33) Thus, to get around the immunity clause, a plaintiff must allege sufficient facts to show that the defendant was in fact the publisher or speaker of the content. (34)

    Many courts grant broad immunity when a plaintiff seeks to hold an online entity liable for content posted by a third party. (35) Subsection (b) of [section] 230 declares that it is the policy of the United States to: (1) promote the continued development of the Internet, (2) preserve the Internet's competitive free market, (3) encourage the development of technologies that maximize user control, (4) empower parents to restrict their children's access to inappropriate material, and (5) ensure vigorous enforcement of federal criminal laws. (36)

  III. RECENT DEVELOPMENTS

    Online platforms are treated differently from traditional modes of media because of the immense quantity of content they host. (37) A popular social media platform, such as Twitter or Facebook, may see thousands of posts per second. (38) Many online platforms also allow users to create individual pages, company pages, government pages, clubs, support groups, and many other kinds of pages. (39) The opportunities are essentially endless. (40) In contrast, the Letters to the Editor page of The New York Times may contain only five or ten letters per day. (41) Section 230 attempts to account for the exceptional nature of the internet and seeks to encourage a marketplace of ideas. (42) Now, a quarter-century after [section] 230's adoption, the "marketplace of ideas" has seen both important advancements and dangerous threats to...
