Google's recent removal of Alex Jones' InfoWars app from its Google Play store, because of the false and misleading information it had been transmitting about the coronavirus, isn't an aberration. That kind of oversight is encouraged by a key Internet law that is now under attack on several fronts.

That law is section 230 of the federal Communications Act, which protects various activities of Internet service providers. Opponents of the law have variously claimed that it promotes monopolies by big tech companies, over-empowers social media or politically skews information on the Internet. In fact, the statute is quite simple, and one of its two key provisions merely encourages and allows Internet services to use their own good judgment in removing false, misleading or otherwise objectionable content from their services.

As we have discussed before, section 230 was passed in 1996, in the early days of the commercial Internet, to ensure that Internet intermediaries, such as the service providers who bring signals into our homes and offices, would not be liable for the content they transmitted. It immunized them from liability for user content, just as telephone companies had been immunized in the past.

But section 230 went further, and also gave Internet intermediaries – which came to include website operators and search providers – the active go-ahead to oversee, curate and edit the content on their systems, without incurring liability themselves. Congress granted this "Good Samaritan" protection to providers to allow them to filter out objectionable materials and make their services more reliable and useful to users.

Essentially, in section 230, Congress recognized that Internet intermediaries filled a unique role. On one hand, they brought valuable content to users, so to permit and encourage that flow, Congress immunized them from the basic publication liabilities that might otherwise arise from carrying it. But unlike phone companies, many Internet intermediaries, such as website operators and content portals, could see and assess the content they were carrying, some of which might be objectionable to users and harmful to the service's reputation. So the second part of section 230, the Good Samaritan provision, allowed services acting in good faith to remove or restrict access to objectionable material.

Such removals are different from government censorship, because First Amendment rights apply only against governments, not private parties. Indeed, private publishers and broadcasters ordinarily make editorial choices about what materials they make available to their users. The Good Samaritan provision applied that traditional editorial right to Internet services.

Google's action against InfoWars involved removal of an app from the Google Play store, probably on contractual grounds. But it nonetheless illustrates the kind of oversight that Congress expected and sought to encourage with section 230's Good Samaritan provision. The InfoWars site, which promotes nutritional supplements and broadcasts shows and videos featuring its founder, had been telling users that the coronavirus wasn't a significant threat – the kind of content an Internet publisher might not want to help disseminate.

Takedowns like this may be tied to an increasing realization among leading Internet providers that the credibility of information transmitted over their networks really matters, especially on issues like public health. In a joint statement on March 16, Facebook, Google, LinkedIn, Microsoft, Reddit and Twitter vowed to jointly combat fraud and misinformation about the coronavirus.

Of course, section 230's Good Samaritan provision isn't the only basis on which intermediaries can remove content. User agreements often give the platform the right to remove a user or its content, and that is probably the basis for Google's action against InfoWars.

Whatever its basis, the action serves as a reminder that the Internet's reliability and usefulness don't come about because anyone can post anything. They depend far more on intermediaries taking care that their services carry useful, not harmful, information. And the much-criticized section 230 explicitly encourages that kind of care.
