TABLE OF CONTENTS

INTRODUCTION
I. MAKING THE CONNECTION: MASS KILLINGS AND THE INTERNET
   A. TALKING THE TALK: WHEN DEVIANT MINDS MEET ONLINE
      1. MALE SUPREMACY--THE "GATEWAY DRUG" TO HATE GROUPS
   B. WALKING THE WALK: WHEN ONLINE POSTS BECOME IRL TRAGEDIES
II. 18 U.S.C. § 875(c): INTERPRETING OLD LAW IN A NEW WORLD
   A. WHAT CONSTITUTES A TRUE THREAT?
   B. TRUE THREATS IN THE 20TH CENTURY
   C. TRUE THREATS IN THE INTERNET AGE
      1. VIRGINIA V. BLACK: HATE, BUT DON'T INTIMIDATE
      2. LOWER COURT CONFUSION
      3. ELONIS V. UNITED STATES: QUESTIONS HALF-ANSWERED
   D. TRUE THREATS: DO INCEL POSTS PASS THE TEST?
III. SOLUTIONS
   A. ANTI-THREAT LEGISLATION REFORM
      1. CHANGES TO STATUTES
      2. APPROPRIATE ANALYSIS OF 875(c)
   B. FBI INVOLVEMENT
CONCLUSION

INTRODUCTION
Following a mass shooting in a Tallahassee yoga studio in late 2018, authorities struggled to find evidence linking the shooter, 40-year-old Scott Paul Beierle, to that studio or to any of the victims. (1) Upon investigation of Beierle's computer and Internet usage, his motives became clear. (2) The shooter had a history of posting YouTube videos and Facebook comments in which he described himself as a "misogynist" and expressed his desire for women to be punished for rejecting him. (3) In one of these videos, he praised and likened himself to Elliot Rodger, a man who killed six students on a college campus in 2014. (4) Rodger, like Beierle, identified as part of the "incel," or "involuntary celibate," community, which has been recognized by the Southern Poverty Law Center as a "hate group" that subscribes to the "male supremacy" ideology. (5) Accordingly, both attackers regularly shared these ideologies and their desire for "retribution" publicly through online forums, where other like-minded users encouraged their violent fantasies. (6) It has become apparent that such extremist fringe groups, like the one that seemed to motivate the Beierle and Rodger attacks, have gone from concentrated, underground communities to a subject of global cognizance. (7) Given these findings, it is likely that the unchecked use of online forums to share these harmful ideologies played a significant role in inciting these attacks.

While 18 U.S.C. § 875 gives law enforcement the option to prosecute individuals for threatening others through online communication, the Supreme Court's analysis of the statute has made it nearly impossible for the prosecution to prevail in most cases. (8) It is not surprising that a law enacted in 1943, and last amended in 1994, is insufficient for addressing the advanced state of the Internet today. (9) One of the biggest difficulties law enforcement has faced in prosecuting under the federal anti-threat statute is the absence of a uniform standard of intent for deciding whether speech constitutes a "true threat." (10) As a result, many online threats are not investigated thoroughly until and unless they result in a tragedy. (11)
Part I of this three-part Comment addresses the Internet's role in several recent mass killings. (12) It describes how dangerous online communities have radicalized susceptible individuals, and when online posts within these communities cross the line from self-expression of an unpopular opinion to a "true threat" subject to criminal consequences. Part II examines the legal treatment of written and oral threats, both before the Internet and after the rise of social media. (13) It covers three prominent cases regarding "true threat" legislation: Watts v. United States, Virginia v. Black, and, most recently, Elonis v. United States. (14) Part III proposes possible ways to prevent the increasingly concerning problem of online communication leading to violence. This Comment, in agreement with several legal scholars, proposes that courts analyze threats made online under a subjective intent standard, requiring proof that the poster acted at least recklessly in making the threat. (15) It also posits that, in addition to changes in the judicial analysis of true threats, a compelling solution would be to create an FBI task force, similar to those currently used to identify child predators online, that would focus on distinguishing opinion or hyperbole from actual threats to human life made within these communities. The Comment concludes by asserting that, although online threats present law enforcement with a complicated First Amendment question, the combination of a proper analysis of existing threat statutes and the use of the government's resources to assess threats can significantly contribute to preventing future attacks.
MAKING THE CONNECTION: MASS KILLINGS AND THE INTERNET
Before discussing the problems courts face in online threat cases, it is important to recognize the connection between Internet activity and crimes of mass violence. This section will examine the creation of groups sharing inherently hateful or violent ideologies, specifically the "male supremacy" groups from which recent attacks have increasingly emerged. It will then provide examples of online posts that were actually followed by violent attacks, to illustrate the kind of online activity that should be addressed.
After a mass killing, one of the first places law enforcement investigates is the killer's personal computer and online activity. (16) In 2015, as a response to an upswing in mass killings over the preceding decade, the FBI released a booklet entitled "Making Prevention a Reality: Identifying, Assessing, and Managing the Threat of Targeted Attacks" (hereinafter "FBI Booklet"). (17) The FBI Booklet recognized the Internet's contribution to targeted attacks, (18) and even recommended the monitoring of a potential attacker's online activity in some cases. (19)
The publication of the attack prevention booklet was necessitated by the FBI's findings in its study of active shooter incidents in the United States between 2000 and 2013. (20) In June 2018, the FBI released its findings on the pre-attack behaviors of the shooters included in that study. (21) One of the pre-attack behaviors examined was "leakage," the term used for an attacker's disclosure to a third party of "feelings, fantasies, attitudes or intentions that may signal the intent to commit a violent act." (22) In the age of social media, leakage often occurs through online interactions. (23) Additionally, the study noted the frequency of active shooters leaving "legacy tokens," including "social media postings... deliberately created by the shooter and delivered or staged for discovery by others, usually near in time to the shooting." (24)
In addition to, and often in conjunction with, threats of targeted attacks, the Internet has predictably become a breeding ground for existing hate groups and has even facilitated the formation of new extremist groups over the past few decades. (25) Thankfully, these formerly little-known fringe groups no longer go unnoticed by the general public. Since the tragic Columbine High School shooting in 1999 brought the media's attention to the teenage shooters' "ominously violent" web pages, the public has become increasingly aware of individuals using the Internet to "disseminate hate and promote violence." (26) Further, it is likely that Internet communication can promote hate-inspired violence in sociopathic individuals who participate in online hate groups, as the sense of community "emboldens them to carry out acts of violence." (27)
As recent incidents of violence have brought to light the problems created by unfettered communication between members of hate groups, the United States has struggled to find a proper way to address them without infringing on an Internet user's First Amendment right to free speech. (28) As Justice Holmes once noted, the "most stringent protection of free speech would not protect a man in falsely shouting fire in a theater and causing a panic;" in other words, there is no reason for the First Amendment to protect speech that incites fear in others. (29) Online posts that pose such problems are often rooted in certain ideologies that are created and spread through online discourse, which are discussed in greater detail in the following sections. (30)
TALKING THE TALK: WHEN DEVIANT MINDS MEET ONLINE
The Internet can promote violence in a way that no other mode of communication can, which is why it is important for law enforcement to identify online threats, and for courts to properly analyze them. While online threats can occur in various contexts, they are prevalent in relatively small online communities designed to encourage violent thoughts and behavior. (31) In the aforementioned FBI Booklet, the FBI referred to the concept of "pronoid pseudo-communities" as one of the ways the Internet contributes to targeted attacks. (32) These communities can pose a threat when they bring together people who share a fascination with targeted violence. (33) The encouragement from online peers can increase a pronoid individual's sense of grandiosity, leading him to believe he is entitled to commit violence. (34)
Eventually, encouragement from an online hate group can lead to disinhibition. (35) A disinhibited individual can then easily use the Internet to research and plan violence. (36) The existence of an unmediated public forum in general increases the potential for inspiring violence, and for finding support for a violent plan of action. (37) In hate-inspired communities, interactions between members are likely to either foster group violence or normalize violence in individuals. (38)
Additionally, the Internet has made it easier to transmit threats of violence by offering perceived anonymity to a poster, which further increases disinhibition as the behavioral constraints of face-to-face conversations are no longer necessary. (39) Given these findings, it is apparent why hate groups, particularly those...