Social Media and Democracy After the Capitol Riot, Or, a Cautionary Tale of the Giant Goldfish

Jurisdiction: United States, Federal
Publication year: 2022
Citation: Vol. 73 No. 2

Seth Oranburg*

Lately, people have been finding giant pet goldfish in lakes across America.1 You may see these tiny fish swimming in bowls at the county fair, but left alone in a lake or large pond, where they are dropped perhaps by a well-meaning child, they can grow to 20 pounds or more—and destroy ecosystems.2 The goldfish is a cautionary tale that has been told time and again in different forms, like Pandora's box.

On January 6, 2021, a somewhat organized group of rioters overran and briefly took control of the U.S. Capitol.3 Social media clearly played a role in those riots.4 The riots were deeply troubling for all who love America and the freedoms for which it stands.5 But the reactions by corporations to cancel social media accounts and even entire social media platforms are troubling, too.6 We must now face the reality that we have entrusted some of the most fundamental civil liberties to corporations that have obligations only to their shareholders, not to democracy.7 We the people are guaranteed freedom of speech in the public square.8 But we do not enjoy those same freedoms on the private social media networks that have replaced the town hall.9 As more and more of our communications and daily lives happen on private property—and make no mistake that Facebook's website is its private property10—we increasingly trust corporations to protect our "inalienable" rights. It may surprise many that Twitter, Facebook, Instagram, YouTube, TikTok, Reddit, Discord, and other social media platforms are not subject to First Amendment constraints, because they are not state actors.11

These platforms do not "censor" speech in the technical sense, because only governments can censor.12 Private actors merely exercise editorial discretion, and they may do so virtually at will.13 In fact, our federal government has effectively deputized social media corporations to censor speech on their platforms—even when platforms do so for pure profit motives.


Social media platforms can exercise editorial discretion without incurring liability for third-party content (users' tweets, posts, grams, videos, hashtags, threads, etc.) thanks to so-called "Section 230 immunity," which provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."14 This means social media platforms like Twitter are not liable for defamatory or inflammatory tweets posted on their platforms.15

What, then, constrains social media platforms? Revenue and quarterly earnings reports drive corporate decision making. Platforms need to keep social media users plugged in, so users view as many advertisements as possible. Sometimes referred to simply as "eyeballs," users are targeted by armies of digital marketing teams whose only job is to keep things interesting. After the Capitol riots, some cheered when Twitter suspended Donald J. Trump, or when Amazon suspended Parler from its web services. Parler has since sued Amazon, although Parler is likely to lose due to Amazon's immunity and discretion.16

But some worry about what this means for civil rights. The American Civil Liberties Union—an organization that called for Trump's impeachment—expressed concerns that these suspensions "should concern everyone when companies like Facebook and Twitter wield the unchecked power to remove people from platforms that have become indispensable for the speech of billions."17 These actions are certainly counter to the "free and open internet" principles that Google, Amazon, Facebook, and other tech giants have espoused since their founding.18 In fact, they argued that internet service providers should "treat . . . all bits equally," giving the same bandwidth to C-SPAN (which broadcasts public hearings) and PewDiePie (a popular YouTube personality whose videos contain misogynist and racist slurs).19

Now that the tech giants won the battle (but not the war) for so-called "net neutrality," they are using their vast "editorial discretion" to decide which speech they promote, and which speech they silence.

On January 11, 2021, Adam Mosseri, Facebook's head of Instagram (yes, Facebook owns Instagram), tweeted, "We're not neutral. No platform is neutral, we all have values and those values influence the decisions we make."20 This admission raises the question: what if social media corporations value wealth and power, and those values influence their decisions as to who may speak and who may not?

And if so, how do we protect democratic freedoms in a world where speech is dominated by social media corporations? These are questions we will have to answer in the 2020s if American democracy is to survive. To answer them, we first need to understand how we arrived at a legal regime in which the world's largest social media corporations enjoy privileges and immunities exceeding those of traditional newspapers and reporters. Part I explains how the seeds of § 230 immunity were planted by the Supreme Court of the United States during the backlash against McCarthyism. Part II explains the inception and early development of § 230 itself, including its legislative intent. Part III discusses how the internet has changed radically since § 230 was promulgated in the 1990s, and why the law now distorts the market for social media and creates perverse incentives that make it less likely for these platforms to function as effective replacements for the public square. Part IV briefly concludes with a discussion of what a social media world without § 230 immunity might look like.

The Capitol Riot is America's giant goldfish moment. We have let social media grow too large by protecting the industry with § 230 immunity. We caught social media running amok in a big way in the Capitol Building. Crowd-think led people to believe they could save American democracy by trampling through its institutions. Twitter, one of the world's largest social media corporations, blamed President Donald Trump for instigating the rioters—and as a result banned the sitting President from the platform. Facebook followed suit. People called for the President of the United States to face charges for his tweets. Meanwhile, Facebook and Twitter are not liable for any harms caused by his viewpoints. In general, social media platforms are not liable for any views or obscenities expressed on their platforms, even if they are dangerous, because they are protected by § 230 immunity. This Article explores whether Facebook still merits this powerful immunity, or whether society would be better off if Facebook (now Meta) were responsible for spreading lies and hate.

Section 230 immunity began conceptually in 1959 as a protection for booksellers, who could never be expected to read all the books they sell and who thus gained immunity from obscenity code violations for any books in their stores they did not know were obscene. In the 1990s, Congress amended the Communications Act of 193421 to extend this immunity for third-party distributions of publications to internet social media platforms (Facebook, Twitter, etc.). Section 230 grants social media platforms immunity from harms caused by content posted on their sites, much as Smith v. California22 granted booksellers immunity from obscene books in their stores.23

The problem is that the logic does not fit: unlike booksellers, social media platforms can and do read all the content on their platforms, via algorithms. Moreover, social media platforms prioritize the display of this content and even remove content their human editors dislike. Even if the motive is not sinister, the system is still designed solely to maximize ad revenue by selling "eyeballs" to advertisers. Social media platforms are not designed to create a public forum for well-reasoned debate, no matter what they claim, because they all have shareholders who demand that the business meet quarterly revenue targets.

We should not rest our faith in democracy upon social media platforms. Like the goldfish in the lake, social media platforms are overgrown because we have placed them in an under-competitive sanctuary via § 230 immunity from liability. Now the social media platforms have grown too large and are crowding out other, less profitable (from the perspective of internet eyeball ad revenue) sources of news and discussion. Traditional print media sources have gone bankrupt or gone digital, and even the digital ones must literally beg users to turn off their ad blockers so their journalists can get some share of the ad revenue. Put simply, government regulation protected social media platforms (the goldfish in this story), which grew overlarge and wrecked the ecosystem, including the niche for traditional news media online.

The solution is to sharply curtail § 230 immunity for social media platforms. The law has created a set of incentives that led Facebook and Twitter to facilitate the Capitol Riot and then escape any liability entirely. With a liability regime like that, something similar is bound to happen again. And nothing like the Capitol Riot should ever happen again. This Article explores where this immunity came from, whether it is still merited, and how we might move forward in this social media era.

I. The Foundation of Immunity for Third-Party Publishers

One of the fundamental principles of American democracy is freedom of the press. Protecting freedom of speech was one of the reasons America went to war against fascist Germany. But America's celebration of the triumph of...
