Section 230 of the Communications Decency Act of 1996 ("Section 230")1 is touted by its supporters as the bedrock of freedom of online expression, without which the meteoric growth of the internet would not have been possible. The statute's detractors view it as an enabler of disinformation that is undermining democracy, public health, and other aspects of society. The following provides a brief overview of Section 230; a brief comparison of the liability standards and regulatory oversight for online content, television, and print; and a brief description of the regulatory approaches to online content in the European Union ("EU") and United Kingdom ("UK"). It also describes some of the criticisms leveled at Section 230 and some of the recent reform initiatives.

BACKGROUND ON SECTION 230

What Is Section 230?

Section 230 was enacted in order to provide legal certainty in the wake of two conflicting judicial decisions.

Cubby, Inc. v. CompuServe, Inc.2 involved a defamation claim against CompuServe, a company that ran a subscription-based electronic information service, which included a journalism forum in which a third party published a daily newsletter. Plaintiffs, who ran a competing online service, brought a claim in the U.S. District Court for the Southern District of New York against CompuServe and others for alleged defamatory statements made in the newsletter. In ruling on a motion for summary judgment, the Southern District considered whether to apply: (1) the general "publisher" rule that one who repeats or republishes defamatory statements is subject to the same liability as the original publisher of the statements, or (2) the less onerous liability standard applicable to "distributors" like bookstores and libraries, which is more difficult for plaintiffs to satisfy because it requires proof that the distributor knew or had reason to know of the defamatory statements. In granting CompuServe's motion, the court viewed CompuServe's service as a type of electronic library and thus held CompuServe to the less onerous distributor standard.

Four years later, in 1995, the New York Supreme Court in Stratton Oakmont, Inc. v. Prodigy Services Co.3 came out the other way, holding that Prodigy, which hosted electronic bulletin boards, should be treated as a "publisher" for purposes of defamatory statements a third party made on one of the bulletin boards. The Stratton Oakmont court held that the differentiating factor from the Cubby case was the degree of control that Prodigy exercised over bulletin board content. The message to electronic information service providers that rely on third-party content was that engaging in content moderation is risky because it opens the door to being held liable as a publisher for defamatory statements in third-party content, with such exposure extending to all third-party content on the provider's site, not just the content the provider has moderated.

Section 230 was enacted in 1996 to address concerns that the two decisions created a legal environment that favored leaving up illegal and objectionable content over removing it, and thereby disincentivized the development and use of content-moderation technologies. Section 230 has two key provisions: Subsection (c)(1) and Subsection (c)(2).

Subsection (c)(1), which is sometimes referred to as the "publisher" safe harbor, can be thought of as providing online service providers with protection from claims for publishing content posted by third parties. It provides:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

An "interactive computer service" is "any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions." Social media companies are the quintessential interactive computer service, although the definition applies more broadly. An "inforrmation content provider" is, as the phrase suggests, "any person or entity that is responsible, in whole or in part, for the creation or development of information provided though the Internet or any other interactive computer."

Subsection (c)(2), which is sometimes referred to as a "Good Samaritan" safe harbor, can be thought of as providing online service providers with protection from liability for actions taken in good faith to block or restrict certain types of objectionable third-party content. It provides:

No provider or user of an interactive computer service shall be held liable on account of (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

Subsection (e) provides that Section 230 has no effect on certain laws, including federal criminal laws, intellectual property laws, communications privacy laws, or sex trafficking laws.


Originally published in The Computer & Internet Lawyer, Volume 39, Number 9, October 2022.

Footnotes

  1. 47 U.S.C. § 230.
  2. 776 F. Supp. 135 (S.D.N.Y. 1991).
  3. 23 Media L. Rep. (BNA) 1794, 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995) (unpublished).

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.