Whose Lie Is It Anyway? Holding Social Media Sites Liable for Procedural Election Disinformation.

Author: Jadyn Marks

TABLE OF CONTENTS

I. INTRODUCTION
II. BACKGROUND
   A. Defining Disinformation
   B. Two Legal Theories: The Marketplace of Ideas and Protecting Democracy
   C. Exploring Facebook and Twitter's Approaches to Regulating Political Advertising and Why They Should Be Held Accountable
      1. Twitter's Policies
      2. Facebook's Policies
      3. Parler's Policies
      4. Comparisons
   D. The Federal Trade Commission Has Jurisdiction in this Area Because There Is Precedent for Regulation of Information and the Information at Issue Here Is Not Political Speech
III. ANALYSIS
   A. Potential Legal Challenges
      1. Delegation to the FTC Will Help Insulate Regulations from Judicial Scrutiny Under the Chevron Doctrine
      2. First Amendment Jurisprudence in this Area Is Unclear
   B. Federal Regulation Is Superior to State Regulation Because It Provides Uniformity
   C. Section 230 of the Communications Decency Act Has Opened the Door for Regulations in this Area
   D. Creating a New Agency Is an Inefficient and Inferior Solution
   E. Public Policy Calls for Regulations in this Area
   F. How the FTC Should Proceed with Regulations
IV. CONCLUSION

I. INTRODUCTION

The 2020 presidential election was a rollercoaster for the American people. From Facebook providing an election information center notification on posts pertaining to the election, (1) to Twitter flagging tweets from then-President Donald Trump, (2) social media sites have developed and enacted different policies to prevent the spread of political misinformation and disinformation. (3) These sites have taken encouraging steps toward protecting foundational principles of American democracy, but standards that vary site-by-site are insufficient to curb the onslaught of misinformation and disinformation that users are exposed to on a daily basis. Exposure to false information about procedural aspects of elections is especially worrisome for American democracy. To help prevent the spread of procedural election disinformation, Congress should authorize the Federal Trade Commission to promulgate regulations to prevent paid procedural election disinformation from circulating on social media sites.

In 1996, Congress passed the Communications Decency Act. This Act includes section 230, which has been frequently discussed by politicians, federal representatives, and the media throughout 2020 and 2021. (4) Section 230(c)(1) provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (5) This provision essentially insulates service providers--including social media sites like Facebook and Twitter--from liability for third-party content, with some exceptions relating to criminal acts. (6)

While section 230 was more easily applicable in 1996 when the Internet was just beginning to develop, technological developments have now likely exceeded the bounds of what legislators imagined in 1996. Unfortunately, the legislation has not kept pace with the times, and as such, providers of interactive computer services--including social media companies--continue to be insulated from liability in questionable circumstances. One such circumstance is that many social media companies fail to meaningfully regulate procedural election advertising on their websites. This failure to regulate leads to the spread of disinformation and could have long-lasting effects on American democracy by disenfranchising eligible voters.

Permitting the unchecked spread of procedural election disinformation prompts significant concerns both with the First Amendment and with notions of a free democracy. John Stuart Mill's theory of the free marketplace of ideas contemplates that an open forum for speech will allow individuals to exchange information and ideas, and that, over time, society will filter out inaccurate information from this exchange. (7) While the "marketplace of ideas" theory applied easily in a time when people were openly exposed to a variety of information and ideas, the theory is strained by the modern marketplace of social media. Social media users tend to consume content they find interesting and agree with, creating an "echo chamber" in which users may only be exposed to the ideas with which they already agree. (8) This problem is further exacerbated by recommendation algorithms, which suggest content based on other content users have consumed. (9)

Procedural election disinformation also affects America's notion of a free democracy by suppressing voters and rendering them misinformed. (10) Inaccurate information about polling places, about where and how to register to vote or check one's voter registration status, and about other procedural aspects of participating in elections amounts to voter suppression. (11) Further, citizens may cast their votes based on inaccurate information about candidates and their platforms. (12) For example, a study surrounding the 2016 presidential election found that undecided voters were more likely to vote for Donald Trump after being exposed to fake news stories about Hillary Clinton. (13)

Social media sites have taken varied and admirable steps to curb the spread of political and election-related misinformation and disinformation. However, because the procedures and policies vary from company to company, and sometimes from state to state, (14) there is no uniform approach. This patchwork could allow disinformation to slip into users' feeds when they use multiple social media applications with different policies. For example, a user could take a screenshot of a political advertisement on Facebook and share it on Twitter.

To combat the difficulties and worrying consequences of allowing unregulated paid procedural election disinformation to circulate on social media sites, Congress should pass narrowly tailored and specific legislation authorizing the Federal Trade Commission (FTC) to promulgate rules regulating this area.

This statutory authorization must be narrowly and specifically written to include only regulation in the area of paid advertising regarding procedural aspects of elections. Once the FTC receives congressional authorization, it will be able to promulgate regulations as it sees fit. However, it may want to hold hearings to garner information about the existing procedures and approaches of different social media sites to determine the framework for its regulations. These regulations would be centered around the social media sites and would determine substantive guidelines and regulations for displaying ads concerning procedural election information, rather than focusing on the entities purchasing the ad space.

This Note will first define disinformation in Part II, Section A, and will explore the legal theories that provide a framework for regulation in this area in Section B. In Section C, this Note considers the current regulatory frameworks of two popular social media sites, Facebook and Twitter, compares their approaches, and explains why regulation of social media sites as "middlemen" is appropriate. Section C will also contrast Facebook and Twitter's approaches with those of Parler. Section D will then establish the FTC's jurisdiction in this area. In Part III, Section A, the Note will consider why delegation to the FTC is superior to Congress regulating the area itself through legislation; Section B will explain why regulation at the federal level is superior to regulation at the state level; Section C offers considerations concerning how debate over section 230 has made this area ripe for change; and Sections D and E consider alternative solutions and public policy. Finally, Section F explores how the FTC should proceed with regulating this space.

II. BACKGROUND

A. Defining Disinformation

Political misinformation and disinformation are popular topics, but each has a distinct meaning. Both misinformation and disinformation involve information that is false or out of context and is presented as factual. (15) However, disinformation is distinct in that it involves an intent to deceive. (16) Misinformation, by contrast, does not require an intent to deceive. (17)

B. Two Legal Theories: The Marketplace of Ideas and Protecting Democracy

There are two legal theories in First Amendment jurisprudence that support federal agency regulation of procedural information about elections. The first is John Stuart Mill's theory of the free marketplace of ideas. (18) Mill applied an economic analysis to speech and ideas, positing that information and ideas exist in a marketplace the same way that commercial products exist in a marketplace. (19) The competition of information and ideas in this marketplace naturally determines what ideas are true and acceptable, as the popular and widely accepted ideas will prevail over inaccurate ones. (20) Mill particularly believed that truth is better derived through this competitive marketplace than through any form of government censorship. (21)

The Supreme Court has come to favor Mill's marketplace of ideas theory in its First Amendment jurisprudence. In particular, the Court favors counterspeech as the most effective solution to harmful speech. Justice Oliver Wendell Holmes first brought Mill's theory to light in his dissent in Abrams v. United States. In Abrams, the defendants published and distributed pamphlets supporting Russia and criticizing capitalism. (22) Notably, this was not at a time when the United States was at war with Russia. (23) The defendants were convicted on counts of conspiracy to incite, provoke, or encourage resistance against the United States and conspiracy to curtail production of war materials. (24) The Supreme Court affirmed the defendants' convictions and rejected their defense that the convictions violated the First Amendment. (25) In a now-famous dissent, Justice Holmes criticized the majority approach, emphasizing that the government interest in restricting...
