Good Samaritans in cyberspace.

Author: Keith Silver
  1. INTRODUCTION

    One of the most salient and contentious issues associated with the fast-developing online industry is the liability of online service providers(1) for transmitting content created by others.(2) In an attempt to address part of the issue, Congress passed the Telecommunications Act of 1996,(3) which President Clinton signed into law on February 8, 1996. While the indecency portions of the Act have since been struck down as unconstitutional,(4) Section 230(c)(1) of the Act, known as the "Good Samaritan" Provision, remains in effect.(5) The Provision states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."(6) Thus, even if an online service provider screens some of the content on its system, thereby acting as a Good Samaritan, it cannot be subject to "publisher" defamation liability for content that it transmits but does not create, such as Internet content and subscriber-generated content.(7)

    Congress included the Good Samaritan Provision in the Telecommunications Act to overrule Stratton Oakmont, Inc. v. Prodigy Services Co.(8) as it might be applied to online service providers who block or screen "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable"(9) materials. In Stratton, an online service provider was held subject to liability for defamatory material posted by a third party on the grounds that the provider exercised content control over its bulletin boards and was therefore considered a "publisher," rather than a "distributor," of the defamatory material.(10) Congress overruled Stratton because of its tendency to induce online providers to abandon all content control, a result contrary to the Act's purpose of encouraging providers to screen and remove indecent matter from their systems.(11)

    This Article will demonstrate that, by protecting providers who restrict or screen objectionable materials from liability as publishers, the Good Samaritan Provision will not achieve its intended result. Although it foreclosed "publisher" liability for others' content, the Provision left the door open for online providers to be held liable as "distributors" if their content control efforts give them reason to know of defamatory material.(12) Because it fails to account for the special implications that the link between content control and provider liability carries in the online medium, the Good Samaritan Provision is likely to perpetuate Stratton's deleterious effect of inducing providers to relinquish control over their systems, thereby undermining the purpose of defamation law, severely burdening the online information industry, and impeding First Amendment interests in the free flow of information.

    The second part of this Article proposes a revised form of the Good Samaritan Provision which addresses the problems outlined above by severing the link between content control and online provider liability. The proposed model for online defamation liability(13) is tailored to the unique features of the medium and is compatible with the interests of both the online industry and the public.

  2. A NEW AGE OF COMMUNICATION

    Since the advent of the telegraph, courts and legislators have tended to treat each new communications technology like the existing technology that it most closely resembles.(14) In the case of online service providers, initial attempts at regulating by analogy have been especially problematic, perhaps because it is not immediately clear which communications technology is most closely analogous. Online networks represent a revolutionary synthesis of several traditional communication media.(15) In offering their own content, online providers act much like television stations, newspapers, or magazines. In offering users e-mail to transmit private messages to other users, online services function much like the postal service. And in hosting "chat" groups that allow simultaneous online discussions between two or more users, these services operate in the role of a telephone system.

    The extent to which online service providers control content varies widely.(16) Like traditional media, online providers control and edit the information they generate or contract with others to provide. Although subscriber-generated content is more difficult to control, some providers use software which automatically deletes vulgar or offensive language as it is transmitted.(17) Most of the larger commercial services also employ gatekeepers or "moderators" who: (1) review some or all incoming messages before they are posted online to determine whether they are related to the topic to which the forum is dedicated; or (2) screen out material which is profane or which otherwise does not conform to standards established by the service.(18) However, due to the exponential growth of online traffic, as well as the speed of transmission, comprehensive review of subscriber-generated content is becoming less and less feasible, even for those providers with the greatest monitoring resources.(19)
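    By way of illustration, the following is a minimal sketch, in Python, of the kind of automated word screening and topic-based moderation described above. The blocked-term list, forum keywords, and function names are hypothetical and invented for this example; they do not reflect any particular provider's actual software.

        import re

        # Hypothetical list of terms the service wishes to mask; placeholders only.
        BLOCKED_TERMS = ["vulgarword1", "vulgarword2"]

        def screen_message(text: str) -> str:
            """Mask each blocked term in a subscriber's message before posting."""
            for term in BLOCKED_TERMS:
                pattern = re.compile(re.escape(term), re.IGNORECASE)
                text = pattern.sub("*" * len(term), text)
            return text

        def on_topic(text: str, forum_keywords: list[str]) -> bool:
            """Rough relevance test of the kind a forum moderator applies:
            accept a posting only if it mentions at least one forum keyword."""
            lowered = text.lower()
            return any(keyword.lower() in lowered for keyword in forum_keywords)

        if __name__ == "__main__":
            posting = "My vulgarword1 modem drops the connection every night."
            if on_topic(posting, ["modem", "connection", "software"]):
                print(screen_message(posting))

    Even this crude sketch makes the scaling problem apparent: keyword matching and topic checks must be applied to every message in real time, and neither step can reliably identify defamatory content, which turns on meaning and falsity rather than on particular words.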

    Some aspects of online offerings defy any level of provider control. For instance, the very nature of live "chat" rooms means that providers can no more prescreen such transmissions than a telephone company can screen telephone conversations.(20) Moreover, the linking feature which allows users to roam from network to network prevents the host network from monitoring material accessed from another network.(21)

    Another type of material which is difficult for providers to control is material obtained from the Internet and transmitted through their systems. A key feature of the Internet is remote information retrieval,(22) which allows a user to search and retrieve information located on remote computers anywhere in the world. With millions of users employing remote information retrieval to roam from network to network every day, it is technically impossible for the host network to monitor or screen all the material accessed from other networks.(23) Content control over newsgroups carried on Usenet,(24) a distributed message database maintained under a set of voluntary rules for passing newsgroup postings from server to server, is also very limited. Most Usenet newsgroups are unmoderated and have no central hub from which editorial control can be exercised.(25) If a particular message on a Usenet newsgroup is defamatory, server administrators are generally limited to terminating the server's subscription to that newsgroup.(26)

    As online providers attempt to control the content transmitted by their systems, they face not only technical limitations, but legal limitations as well. For example, Chapter 119 of the Electronic Communications Privacy Act (ECPA)(27) prohibits the interception or disclosure of private electronic communications such as e-mail.(28) In addition, providers who offer third-party content as part of their service are sometimes prohibited by contract from editing or interfering with such content.(29)

    The limitations faced by online service providers trying to control content generated or accessed by subscribers point to what is perhaps the most fundamental difference between cyberspace and traditional media forms: the transformation and empowerment of the user from a passive consumer into a producer of information.(30) The relationship between producer and user online is fluid and reversible due to the interactive nature of online communication.(31) In addition, online providers offer communication forums to a virtually limitless and diverse number of information providers and consumers. This stands in contrast to the traditional electronic mass media, which must restrict the number of potential information producers due to spectrum scarcity.(32) Finally, the online relationships between information producers and users are more direct than in traditional forms of mass communication because they are largely unmediated by gatekeepers.(33)

    Thus, online service providers not only perform the tasks of many traditional communications media, such as the telephone and the post office, but also represent an entirely new medium with new legal challenges. As will be demonstrated, the provision of online services resists traditional centralized methods of legal regulation and calls for new ways of analyzing online service provider liability for transmitted content. Because this Article examines the liability issue in the context of defamation law, it is useful to begin with a brief overview of relevant defamation law principles.

  3. THE PUBLISHER/DISTRIBUTOR DISTINCTION IN DEFAMATION LAW

    The law of civil defamation serves "the public policy that individuals should be free to enjoy their reputations unimpaired by false and defamatory attacks."(34) While defamation law varies from state to state, general principles common to all jurisdictions may be distilled.(35)

    An essential element of a defamation claim is "publication."(36) Publication is generally described as the intentional or negligent communication of the allegedly defamatory statement to a third person.(37) Under this standard, "publishers" are regarded as persons or entities exercising such extensive control over the content at issue--either by creating, editing, or reviewing the content--that knowledge of the defamation can be fairly imputed as a matter of law.(38) In other words, publishers are deemed to have a "reason to know"(39) of defamatory matter by virtue of their editorial control. Entities such as newspapers and book publishers have traditionally fallen into the publisher category.

    The common law created a separate category of liability for "distributors," such as bookstores, libraries, and newsstands.(40) Unlike publishers, distributors are not presumed negligent; rather, they are subject to defamation liability only if it is proved that they "knew or had reason to know" of the defamation.(41) The notion that a distributor must...
