THE GOVERNMENT IS IN YOUR DIRECT MESSAGES: DOES NEW LEGISLATION ALLOW TECH COMPANIES TO SEARCH YOUR ONLINE COMMUNICATIONS?

Author: Arnold, Maura
  1. Introduction

    In 2019, technology ("tech") companies reported 45 million online photos and videos of child sexual abuse. (1) These images and videos include children as young as three or four years old being sexually abused and even tortured. (2) While child sexual abuse materials ("CSAM") have existed throughout history, their circulation has grown exponentially in the last decade. (3) In response, tech companies, law enforcement agencies, and politicians have come together to pass legislation to combat the problem. (4)

    In 2008, the Providing Resources, Officers, and Technology to Eradicate Cyber Threats to Our Children Act ("PROTECT Our Children Act") was passed to thwart CSAM on the internet, but it was largely ineffective. (5) In response, the Senate Judiciary Committee introduced the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act ("EARN IT Act") in March 2020. (6) The EARN IT Act was created to encourage and incentivize tech companies to take CSAM more seriously. (7) To that end, the EARN IT Act would establish a national commission to recommend best practices for identifying and reporting online exploitation. (8) Additionally, the legislation would allow for Congressional review of those best practices, liability safe harbors for compliant companies, and recourse for survivors. (9) The suggested "best practices," however, have left many wondering whether they allow law enforcement unfettered access to innocent civilians' private communications on social media platforms. (10)

    Child sexual exploitation is a serious problem that has been exacerbated by the Internet, which gives abusers widespread and faster access to potential victims as well as anonymous platforms. Social media gives abusers ways to connect with one another, a problem tech companies have largely ignored. In the past, the federal government has worked with tech companies and law enforcement to address the problem, but legislation thus far has been largely ineffective, given the size of the problem and the lack of follow-through and accountability on the part of government and law enforcement. The EARN IT Act attempts to hold tech companies accountable for CSAM on their sites, but its specific provisions may authorize technology companies to search citizens' private communications.

  2. History

    1. Roots of CSAM

      CSAM is not a novel issue, but the Internet gave abusers a platform to connect with others, to share materials, and to find victims. (11) While the Internet exacerbated the problem, child sexual exploitation has existed since Greek and Roman times, when young children were sexualized through art and writing. (12) The invention of the camera in 1826 gave rise to the modern concept of "child pornography." (13) The production of CSAM remained limited until the 1960s, when societal attitudes toward censorship and sexual values turned increasingly liberal. (14) In the 1970s, demand for child abuse films and magazines grew. (15) Distribution of CSAM, however, remained difficult, as law enforcement could easily apprehend offenders. (16)

      The market for CSAM exploded with the Internet, which fundamentally changed the way the market functioned. (17) Abusers created anonymous communities that provided easy and constant access to one another, as well as free material. (18) Messenger applications were an early popular platform, and distribution later shifted to peer-to-peer sharing networks such as Gnutella. (19) The problem grew throughout the 1990s and early 2000s as mobile devices, search engines, file sharing, social networking, and messaging developed. (20) With the rapid growth of technology, abusers gained easier access to victims and illegal materials. (21)

      In 2018, there were 18.4 million reports of CSAM, which included over 45 million images and videos. (22) This global problem is rooted in the United States, as tech companies headquartered primarily in Silicon Valley play an integral role in facilitating the worldwide spread of CSAM by failing to manage it. (23) Social networking sites made it possible for abusers to find victims and one another and to make and circulate new content. (24) Although much CSAM circulates on the dark web, it exists on all platforms, including mainstream sites. (25) Some child abuse groups utilize encryption and the dark web to share images of extreme abuse and to teach others how to do the same. (26) Given the explosion of CSAM on the Internet and the creative routes abusers utilize, the circulation of CSAM has become difficult for law enforcement to manage. (27)

      In 2020, the COVID-19 pandemic created a spike in CSAM. (28) The pandemic forced millions into their homes to avoid spreading the virus and left many victims vulnerable and isolated with abusers. (29) The closure of schools, restrictions on international travel, and increased time spent online created ripe circumstances for abusers. (30) In September 2020, the International Criminal Police Organization ("INTERPOL") released a report noting increases in the amount of CSAM shared on peer-to-peer networks and the dark net, as well as in self-generated materials. (31) The report cites more time spent at home, economic hardships, and opportunities for abuse as significant factors in the overall increase. (32) Officials call this recent surge "just the tip of a growing iceberg." (33)

    2. The U.S. Government & Law Enforcement Response

      In 2008, the PROTECT Our Children Act was passed. (34) The Act created a task force and expanded law enforcement's ability to investigate, apprehend, and prosecute abusers. (35) Despite the comprehensive efforts of the legislation, CSAM continued to spread. (36) An investigation by the New York Times revealed several issues, including understaffed and underfunded law enforcement agencies, neglect on the part of the Justice Department, and a failure of tech companies to report violations to law enforcement. (37) The lack of consistent communication and cooperation among law enforcement, the federal government, and tech companies has hindered efforts to investigate, catch, and prosecute abusers. (38)

      The federal government neglected many of its obligations under the PROTECT Our Children Act. (39) As of 2019, the Justice Department had published only two of the six reports required under the legislation. (40) Many felt that the task force created under the legislation was not a priority. (41) The legislation was also underfunded, making follow-through on its obligations futile. (42) Responding agencies were unable to hire enough agents and prosecutors to keep up with the number of cases. (43) Ultimately, many of the failures under the legislation came down to a lack of communication between the federal government and law enforcement. (44)

      The lack of communication extended to cooperation issues between tech companies and law enforcement. (45) Tech companies are not required to monitor their platforms for child abuse images; they are only required to report materials they discover. (46) Often, companies respond to law enforcement inquiries only to state that they no longer have any relevant records. (47) While the dark web and peer-to-peer file sharing are the preferred methods of disseminating CSAM, the second most popular mode of acquiring child abuse materials is the surface web. (48) Unlike the dark web, which is accessible only via certain browsers, the surface web hosts content available to the general public. (49)

      Encryption is a serious hindrance to law enforcement investigations of CSAM. (50) Encryption takes readable text and scrambles it so that it can be deciphered only by a person who has the decryption key. (51) Encryption addresses many serious and legitimate privacy concerns, as it can be used to protect anything from text messages to bank account information. (52) It is often used to guard information from hackers. (53) Both the sender and the recipient must use the encryption key to unlock the information, and law enforcement does not have access to that key. (54) In March 2019, Mark Zuckerberg announced Facebook's plan to encrypt its messenger service, which alone was responsible for nearly 12 million reports of CSAM. (55) Encryption has become an increasingly popular way for criminals to distribute child pornography, as the anonymity keeps them steps ahead of police. (56) The anonymity of encryption not only makes it harder for law enforcement to investigate, but also emboldens anonymous abusers to post more frightening imagery. (57) The use of end-to-end encryption by sites like Facebook could therefore have dangerous ramifications. (58)
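      To illustrate the mechanics described above, the following is a minimal sketch of shared-key encryption in Python using the third-party "cryptography" library. It is offered only as an illustration of why ciphertext is unreadable without the key; it is not the scheme any particular platform uses.

        # Minimal sketch of shared-key encryption: only holders of the key can
        # recover the plaintext; an intermediary (or law enforcement) without
        # the key sees only scrambled ciphertext.
        # Assumes the third-party "cryptography" package is installed.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()        # secret shared by sender and recipient
        sender = Fernet(key)

        ciphertext = sender.encrypt(b"a private message")  # what travels over the network
        print(ciphertext)                  # unreadable without the key

        recipient = Fernet(key)
        print(recipient.decrypt(ciphertext))  # b'a private message'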

      While tech companies have made progress in reporting child abuse material, many see these efforts as coming too late. (59) For years, companies were aware of CSAM but, for unclear reasons, chose not to address the issue. (60) Delayed reporting from companies made it difficult for law enforcement to investigate cases, as crucial evidence was no longer available. (61) One of the most problematic companies is the blogging website Tumblr. (62) Many Tumblr users reported seeing CSAM frequently with no way to report it. (63) Other problematic companies include the search engine Bing and the photo-sharing app Snapchat. (64) Snapchat is particularly problematic because communications disappear after a certain amount of time, hindering investigations. (65) As a result of poor communication from tech companies, fewer than two percent of these crimes are investigated. (66)

      In addition to causing a spike in abuse materials, the pandemic has made it more difficult for law enforcement to track abusers. (67) Reporting delays have also increased. (68) Downsizing measures and staff cutbacks have further impeded police efforts. (69) Remote work has complicated investigative processes and reduced the overall efficiency of reporting efforts, making investigations even more complex. (70) Notably, courts closed during the pandemic, delaying the processing of cases. (71)

    3. Section 230 and its Impact on Technology Companies

      Big tech companies, such as Facebook, Google, Bing, and Tumblr, are...
