Section 230: A tool for social media companies facing accusations of silencing speech
2021 PRINDBRF 0453
By Anda Tatoiu, Esq., Mark Silverman, Esq., and Benjamin Chertok, Esq., Dykema Gossett PLLC
Practitioner Insights Commentaries
October 22, 2021
(October 22, 2021) - Anda Tatoiu, Mark Silverman and Benjamin Chertok of Dykema Gossett PLLC discuss the protections that online platforms have under Section 230 of the Communications Decency Act and how they differ from those granted to publishers.
The past couple of years have generated an influx of lawsuits and arbitrations involving social media companies and alleged silencing of free speech. As such accusations increase, it has become fundamentally important to understand the legal tools that these companies rely on for monitoring their platforms.
Recently in Florida, a federal judge blocked a state law meant to authorize the state to penalize social media companies when they ban political candidates. Florida's enactment of the law followed former President Donald Trump being blocked on Twitter, Facebook, and YouTube.
Two tech trade groups filed a lawsuit against Florida over the new law, arguing that the bill signed by Governor Ron DeSantis was unconstitutional. Had the state prevailed, Florida would have been the first state in the nation to regulate how social media companies moderate online speech.
One reason for this decision, and an incredibly powerful tool for social media companies to rely on when faced with allegations of illegally silencing speech, is Section 230 of the Communications Decency Act. Section 230 provides immunity to social media companies for claims such as violations of constitutional rights, reputational injuries, and others.1
More specifically, this statute affords protection to social media companies as it relates to content generated by other users of the platform and with respect to any decisions the social media company makes regarding the maintenance of accounts.
To be sure, the explicit text of Section 230 itself demands this protection. Section 230(c)(2)(A) provides that
No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected[.] (emphasis added).
This plain language could not be clearer. Section 230 protects social media companies in taking any action whenever the provider (the social media company) considers the user's material to be objectionable in any way, irrespective of whether that material is constitutionally protected.
Courts across the country have interpreted "any action" rather broadly. Indeed, this includes a social media company's right to remove posts, and even to ban accounts. These types of actions are considered publisher conduct.
Often referred to as "the 26 words that created the internet," subsection (c)(1) of Section 230 reads that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." What does it mean, then, to be "treated as a publisher"? Essentially, a publisher is one who exercises control in broadcasting or repeating statements.
In the context of social media, the makers of those statements are the users and posters of content. The users make these statements on the social media company's "platform," which is simply the digital interface that these users actually interact with when using social media — it is what users see when they log into facebook.com or open their Instagram app.
As an analogy, think of books. The author of the book is the user posting content; the physical book itself is the platform. When thinking about Section 230, the question becomes: Are social media companies more like the publication company that exerts editorial control, or are social media companies more like the bookstore that makes the author/user's books available to whoever may want to read them?
These types of questions lead some legal scholars to distinguish publishers from distributors who merely disseminate statements with far less control over the content. In this distinction, the publisher is like a newspaper, which exercises full editorial control over its publications. Examples of distributors, on the other hand, may be bookstores and libraries.
Distributors generally enjoy greater legal protection against suits over offensive content than publishers, who are more likely to be held liable for it. Before Section 230(c)(1), courts were split on whether internet companies were more like the newspaper or the library.2 In the year following Stratton Oakmont, and as a direct means of overturning that decision, Congress enacted Section 230, declaring once and for all that social media companies are not to be treated as publishers.
So, because Section 230 protects social media companies from publisher liability over third-party content, and because banning the posts or accounts of third parties is considered publisher conduct, Section 230 permits social media companies to take these actions.
One California court recently made this clear in Murphy v. Twitter, Inc.3 There, the California Court of Appeal ruled in favor of a platform provider, Twitter, after it had banned the plaintiff for repeatedly posting "tweets" that Twitter deemed hateful.
Murphy had made critical comments against the transgender community and had brought suit on her own behalf as well as on the behalf of other former Twitter users who had been similarly suspended. In the Court's words:
Murphy's claims all seek to hold Twitter liable for requiring her to remove tweets and suspending her Twitter account and those of other users. Twitter's refusal to allow certain content on its platform, however, is typical publisher conduct protected by section 230.4
On the federal level, the Ninth Circuit has put this even more succinctly. Simply put, Section 230 shields platform providers from liability over the decision to allow, remove, or edit posts generated by third parties.5
You may not be surprised to see such a case come out of California and the Ninth Circuit – but how about Florida? The Eleventh Circuit concurs.6
Moreover, Section 230 even explicitly preempts any state law action. Subsection (e)(3) of the statute itself reads "No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section." Indeed, many courts, including the Florida Supreme Court, have explicitly found that Section 230 preempts any actions brought under state law.7
In addition to Section 230, companies' member agreements can also play a pivotal role in determining the types of content users can post on a platform. If you find yourself dealing with these types of issues, contact counsel immediately to discuss what options might be available to you.
Notes
2 Compare Stratton Oakmont, Inc. v. Prodigy Services Co. (N.Y. Sup. Ct. 1995) (imposing publisher liability) with Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135 (S.D.N.Y. 1991) ("CompuServe has no more editorial control over such a publication than does a public library . . . and it would be no more feasible for CompuServe to examine every publication it carries for potentially defamatory statements than it would be for any other distributor to do so.") (emphasis added).
5 See Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1105 (9th Cir. 2009) ("Subsection (c)(1), by itself, shields from liability all publication decisions, whether to edit, to remove, or to post, with respect to content generated entirely by third parties.") (emphasis added).
6 See e.g., Illoominate Media, Inc. v. Cair Fla., Inc., 841 Fed.Appx. 132, 134 (11th Cir. 2020) (affirming the district court in dismissing plaintiff's complaint reasoning that "Twitter's decision to ban [the plaintiff] was protected under Section 230 of the Communications Decency Act.").
7 Doe v. Am. Online, Inc., 783 So. 2d 1010, 1012, 1018 (Fla. 2001) ("We specifically concur that Section 230 expressly bars 'any actions' and we are compelled to give the language of this preemptive law its plain meaning.").
Anda Tatoiu is a senior attorney in Dykema Gossett PLLC's business litigation group. Her diverse practice involves insurance defense, business torts, creditors' rights, and residential and commercial mortgage foreclosure. She can be reached at [email protected]. Mark Silverman is co-leader of the firm's commercial mortgage-backed securities special servicer group, chair of its nationwide technology advisory committee, and a member of its financial industry group. He can be reached at [email protected]. Benjamin Chertok is an associate litigator. He assists clients with insurance defense, business torts, defending clients against abuse and misconduct allegations, real estate disputes, and contract litigation. He can be reached at [email protected]. All three authors are based in the firm's Chicago office.
End of Document
© 2024 Thomson Reuters. No claim to original U.S. Government Works.