The New York Supreme Court ultimately disagreed. Unlike CompuServe, where there were no efforts to review the content, Prodigy took some steps on behalf of its users. It had software that screened for profanity, an emergency-delete function, and language advising its bulletin-board users that it would remove “notes that harass other members or are deemed to be in bad taste or grossly repugnant to community standards, or are deemed harmful to maintaining a harmonious online community” when they were brought to its attention.
Of course, far too many posts went up for the service to review them all. But it did what it could. As if to show that no good deed goes unpunished, the court reasoned that “PRODIGY’s conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice.”
Yikes, many observers thought: this case will push everyone online to abdicate control over their platforms, for fear that any effort to remove the worst content they see, as best they can, will expose them to greater liability.
The Communications Decency Act
Around that time, Congress was trying to pass a law to regulate obscenity and indecency online. It passed the Communications Decency Act, but the law didn’t last long: its indecency provisions were struck down for violating the First Amendment. Only one amendment to the legislation survived.
Section 230 had been added by then-Representatives Ron Wyden and Chris Cox, who were alarmed by the precedent they feared the Prodigy case would set.
In their view, it was fine for a publisher who printed something unlawful, like an article with a defamatory claim, to be held liable. But what about comments left by a reader beneath a digital article? Or posts on a site like Craigslist? Had platforms been strictly liable for anything illegal a user wrote, much of today’s internet would have been impossible. Imagine if Facebook, Reddit, or Twitter were legally liable for every bit of content that their hundreds of millions of users posted.
Foreseeing that, Section 230 held that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
That brings us back to the media-men list. Some observers are confident that its creator qualifies for Section 230 immunity, and that the defamation lawsuit against her will be thrown out on those grounds.
At Techdirt, for example, Cathy Gellis writes:
In this case, the progenitor of the Google doc was an intermediary enabling other people to express themselves through the online service – in this case, the Google doc – she provided. Section 230 allows that intermediaries can come in all sorts of shapes and sizes, because its immunity is provided broadly, to any provider of an “interactive computer service,” which is “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” That’s what Donegan did with her Google doc: provide access to software to multiple users.
If anything is wrong with the content they contributed through this service, then they can be held responsible for it. But per Section 230, not Donegan.
Analysis of that sort may well spare Donegan from liability, assuming that she didn’t solicit, write, or substantively edit any defamatory accusations. But it’s premature to conclude that she’ll qualify for immunity absent more detailed information about exactly what role, if any, she played beyond creating the document, or about whether her case will be affected by any of the ways that courts have narrowed Section 230.