Enigmatic Immunity for Internet Service Providers

With a change of administration potentially on the horizon, President Trump’s attempts to revoke the protections that section 230 of the Communications Decency Act grants to Internet service providers look increasingly uncertain.1 A recent Supreme Court decision addressing section 230, Malwarebytes, Inc. v. Enigma Software Group USA, LLC, suggests that changes to section 230 may be coming even if no legislative or executive action is taken. On October 13, 2020, the Supreme Court denied Malwarebytes’s petition for a writ of certiorari.2 Malwarebytes had sought review of the Ninth Circuit’s conclusion that section 230 of the Communications Decency Act does not immunize Malwarebytes from Enigma’s anticompetitive conduct claims.3 In a statement respecting the denial of certiorari, Justice Thomas only briefly addressed the decision not to revisit the Ninth Circuit’s ruling and directed the remainder of his comments to the “questionable precedent” created by the lower courts’ emphasis on “nontextual arguments when interpreting § 230.”4 Further, he suggested that, in an appropriate case, the Court should “consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms,”5 a review that could affect both plaintiffs and defendants in litigation involving Internet services.

The Communications Decency Act and Stratton Oakmont

The passage of section 230 of the Communications Decency Act in 1996 responded to a decision issued one year earlier, Stratton Oakmont, Inc. v. Prodigy Services Co., that blurred the lines between liability for distributors and liability for publishers or speakers.6 While distributors were typically “liable for defamatory statements of others only if they knew or had reason to know of the defamatory statement at issue,” a publisher could be liable “as if he had originally published it.”7 The plaintiff in Stratton Oakmont argued, and the court agreed, that the defendant should be held liable as a publisher based on the defendant’s policies and statements that it “exercised editorial control over the content of messages posted on its computer bulletin boards.”8

Section 230 of the Communications Decency Act overturned Stratton Oakmont.9 It includes two subsections, 47 U.S.C. §§ 230(c)(1) and (2), that grant immunity from certain claims, thereby limiting the potential liability of Internet service providers,10 and that were intended to “promote self-regulation of Internet service providers.”11 The first subsection states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”12 As noted above, the law had traditionally distinguished between “publishers or speakers (like newspapers),” which typically have more control over content and face greater potential liability.13 In contrast, “distributors (like newsstands and libraries)” that do not exercise “editorial control” were liable only “when they knew (or constructively knew) that content was illegal.”14 Accordingly, “[t]his provision ensures that a company (like an e-mail provider) can host and transmit third-party content without subjecting itself to the liability that sometimes attaches to the publisher or speaker of unlawful content.”15 The second subsection grants immunity to any computer service provider for “(A) good-faith acts to restrict access to, or remove, certain types of objectionable content; or (B) giving consumers tools to filter the same types of content.”16

In combination, these subsections prevent an Internet service provider from being held liable for hosting or distributing third-party content or for engaging in good-faith efforts to “take down or restrict access to objectionable content.”17 However, Justice Thomas notes that the courts “have discarded the longstanding distinction between ‘publisher’ liability and ‘distributor’ liability” and have read “extra immunity into statutes where it does not belong.”18 In fact, multiple courts have concluded that “§ 230 confers immunity even when a company distributes content that it knows is illegal.”19

The Lower Courts’ Questionable Precedent

Justice Thomas identifies three ways in which he believes the lower courts have misconstrued section 230. First, he notes that courts have granted “Internet companies immunity for their own content,” even though the statute grants immunity only for content “provided by another information content provider.”20 He ties this to one of the earliest decisions interpreting section 230, which concluded that the statute barred liability for Internet service providers exercising “traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.”21 That reading, however, appears to conflict with another subsection, which specifies that a party may be an “information content provider” through even partial creation or development of content.22 To reconcile the two, courts have concluded that “minor alterations” to another’s content, and the “choice to publish” that content, will not expose an Internet service provider to liability.23 For example, Justice Thomas cites one case24 in which the defendant solicited defamatory submissions and then selected and edited certain submissions for publication, yet avoided liability because, per the court’s interpretation, the defendant did not “materially contribute[] to the defamatory content” and therefore could not be treated as a publisher for liability purposes.25

Second, Justice Thomas points out that “courts have curtailed the limits Congress placed on decisions to remove content.”26 Congress limited the immunity provisions to companies that “unknowingly decline to exercise editorial functions to edit or remove third-party content” and to those that “decide to exercise those editorial functions in good faith,” per §§ 230(c)(1) and (2)(A), respectively.27 In contrast, Justice Thomas asserts, courts have construed section 230 “to protect any decision to edit and remove content.”28

Third, Justice Thomas cites a number of cases in which courts granted immunity to defendants by extending section 230 on the ground that the statute “should be construed broadly.”29 He notes that in each of these cases the plaintiffs’ claims identified the “defendant’s own misconduct,” in contrast to cases in which immunity rested on allegations that the defendant was a publisher or speaker of third-party content or had removed content in good faith.30 For example, one appellate court concluded that a defendant that had allegedly designed its website “to make sex trafficking easier” was free from liability because “Congress . . . chose to grant broad protections to internet publishers” with the Communications Decency Act.31 Another appellate court stated that “interactive computer services” that promote certain user-generated content “should not incur liability as developers or creators of third-party content” because otherwise “a defendant interactive computer service would be ineligible for Section 230(c)(1) immunity by virtue of simply organizing and displaying content exclusively provided by third parties.”32 In a third decision, the defendant was granted immunity for its dating application, which “allegedly lacked basic safety features to prevent harassment and impersonation,”33 because the court reasoned that a claim arising from the defendant’s “‘refusal to remove’ offensive content authored by another [is] barred by § 230.”34 And in yet another decision, the court granted the defendant’s motion to dismiss because the Communications Decency Act immunized the defendant from liability for the allegedly defective design of its app, which included a Speed Filter feature that allegedly encouraged reckless driving.35

The Future for Section 230

In conclusion, Justice Thomas acknowledges that any new interpretation of section 230 could expose defendants to additional liability. He reasons, however, that “[p]aring back the sweeping immunity courts have read into § 230 would not necessarily render defendants liable for online misconduct” but “would give plaintiffs a chance to raise their claims in the first place.”36 Plaintiffs would still have to “prove the merits of their cases, and some claims will undoubtedly fail.”37

Given Justice Thomas’s suggestion that the Supreme Court should someday address section 230,38 it would be prudent for both plaintiffs and defendants to consider the cases and outcomes he discussed alongside the actual text of section 230. For example, plaintiffs might encourage closer adherence to the text of section 230 to narrow the scope of defendants’ immunity, emphasizing the “questionable precedent” created by the lower courts; some judges may wish to avoid unfavorable appellate review by a justice who is inclined to revisit the interpretation of section 230. Additionally, potential defendants might assess their content-moderation efforts and policies to ensure they adhere closely to one of the two functions Justice Thomas identified as protected: declining “to exercise editorial functions to edit or remove third-party content” or exercising those “editorial functions” “in good faith.”39 In the meantime, both plaintiffs and defendants will be left to wrestle with the enigma of “the correct interpretation of § 230.”40

1 See, e.g., Isobel Asher Hamilton, “The Georgia runoffs could decide the fate of Section 230 – along with the future of Big Tech,” Business Insider, Nov. 15, 2020; Adam Smith, “What Is Section 230 And Why Does Trump Want It Revoked?,” The Independent, Nov. 6, 2020; Adam Jacobson, “Will Section 230 Changes Come At The FCC? Not Likely,” Radio & Television Business Report, Nov. 13, 2020; Liana Sowa, “Proposals to Curtail Section 230 Gather Steam Among Journalists and Thinkers at Reboot Event,” Broadband Breakfast, Nov. 11, 2020.

2 Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 2020 WL 6037214, at *1 (U.S. Oct. 13, 2020).

3 Malwarebytes, 2020 WL 6037214, at *1 (addressing 47 U.S.C. § 230).

4 Malwarebytes, 2020 WL 6037214, at *1.

5 Malwarebytes, 2020 WL 6037214, at *1, 5.

6 See, e.g., F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1195 (10th Cir. 2009) (“Congress enacted the [Communications Decency Act (“CDA”)] in response to a state-court decision, Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710, *5 (N.Y. Sup. Ct. May 24, 1995), which held that the provider of an online messaging board could be liable for defamatory statements posted by third-party users of the board.”).

7 Stratton Oakmont, Inc. v. Prodigy Services Co., 1995 WL 323710, at *3 (N.Y. Sup. Ct. May 24, 1995).

8 Stratton Oakmont, 1995 WL 323710, at *2, 5.
9 See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997) (“Congress enacted § 230 to remove the disincentives to selfregulation [sic] created by the Stratton Oakmont decision”).

10 Malwarebytes, 2020 WL 6037214, at *1.

11 800-JR Cigar, Inc. v., Inc., 437 F. Supp. 2d 273, 295 (D.N.J. 2006).

12 47 U.S.C. § 230(c)(1).

13 Malwarebytes, 2020 WL 6037214, at *1.

14 Malwarebytes, 2020 WL 6037214, at *1.

15 Malwarebytes, 2020 WL 6037214, at *1.

16 Malwarebytes, 2020 WL 6037214, at *1.

17 Malwarebytes, 2020 WL 6037214, at *2.

18 Malwarebytes, 2020 WL 6037214, at *2.

19 Malwarebytes, 2020 WL 6037214, at *2 (emphasis in original) (citing Zeran v. Am. Online, Inc., 129 F.3d 327, 331-334 (4th Cir. 1997); Universal Commc'n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 420-21 (1st Cir. 2007); Shiamili v. Real Estate Group of New York, Inc., 17 N.Y.3d 281, 288-89, 952 N.E.2d 1011, 1017 (2011); Doe v. Bates, 2006 WL 3813758, at *18 (E.D. Tex. Dec. 27, 2006)). Justice Thomas also offers multiple “reasons to question this interpretation.” Malwarebytes, 2020 WL 6037214, at *3. He reasons that Congress explicitly included distributor liability in 47 U.S.C. § 223(d), which weighs against the courts’ reasoning “that Congress implicitly eliminated distributor liability” with section 230. Id. Further, Justice Thomas notes that section 230 was enacted in response to the Stratton Oakmont decision, which distinguished between the terms “distributors” and “publishers,” even if it blurred the liabilities imposed on these types of entities. Id. (discussing Stratton Oakmont, 1995 WL 323710, and citing Accusearch Inc., 570 F.3d at 1195). Finally, he reasons that section 230 includes two subsections distinguishing between liability for distributors and publishers, when Congress “could have simply created a categorical immunity in § 230(c)(1): No provider ‘shall be held liable’ for information provided by a third party.” Malwarebytes, 2020 WL 6037214, at *3.

20 Malwarebytes, 2020 WL 6037214, at *3 (quoting 47 U.S.C. § 230(c)(1)) (emphasis in original).

21 Malwarebytes, 2020 WL 6037214, at *3 (quoting Zeran, 129 F.3d at 330) (emphasis in original).
22 47 U.S.C. § 230(f)(3) (“The term ‘information content provider’ means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.”).

23 See, e.g., Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003) (“The ‘development of information’ therefore means something more substantial than merely editing portions of an e-mail and selecting material for publication.”).

24 Malwarebytes, 2020 WL 6037214, at *4 (citing Jones v. Dirty World Entm't Recordings LLC, 755 F.3d 398, 403, 416 (6th Cir. 2014)).

25 Dirty World, 755 F.3d at 415-417 (emphasis added); see also Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1167–68 (9th Cir. 2008) (“we interpret the term ‘development’ as referring not merely to augmenting the content generally, but to materially contributing to its alleged unlawfulness. In other words, a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.”).

26 Malwarebytes, 2020 WL 6037214, at *4.

27 Malwarebytes, 2020 WL 6037214, at *4 (emphasis in original).

28 Malwarebytes, 2020 WL 6037214, at *4 (discussing Sikhs for Justice, Inc. v. Facebook, Inc., 697 Fed. Appx. 526 (9th Cir. 2017), aff’g 144 F. Supp. 3d 1088, 1094 (N.D. Cal. 2015), in which the court construed section 230 to bar liability for a defendant that allegedly removed content for racially discriminatory reasons).

29 Malwarebytes, 2020 WL 6037214, at *4-5.

30 Malwarebytes, 2020 WL 6037214, at *5.

31 Malwarebytes, 2020 WL 6037214, at *4 (discussing Jane Doe No. 1 v., LLC, 817 F.3d 12, 29 (1st Cir. 2016), and citing M.A. ex rel. P.K. v. Vill. Voice Media Holdings, LLC, 809 F. Supp. 2d 1041, 1058 (E.D. Mo. 2011) (“Congress has declared such websites to be immune from suits arising from such injuries.”)).
32 Force v. Facebook, Inc., 934 F.3d 53, 66 (2d Cir. 2019), cert. denied, 140 S. Ct. 2761, 206 L. Ed. 2d 936 (2020).

33 Malwarebytes, 2020 WL 6037214, at *5.

34 Herrick v. Grindr LLC, 765 Fed. Appx. 586, 590 (2d Cir. 2019), cert. denied, 140 S. Ct. 221 (2019).

35 Lemmon v. Snap, Inc., 440 F. Supp. 3d 1103, 1113 (C.D. Cal. 2020); but see Maynard v. Snapchat, Inc., 346 Ga. App. 131, 136, 816 S.E.2d 77, 81 (2018) (holding “that CDA immunity does not apply because there was no third-party user content published” in association with the Speed Filter).

36 Malwarebytes, 2020 WL 6037214, at *5.

37 Malwarebytes, 2020 WL 6037214, at *5.

38 Malwarebytes, 2020 WL 6037214, at *1, 5.

39 Malwarebytes, 2020 WL 6037214, at *5.

40 Malwarebytes, 2020 WL 6037214, at *5.

Baker Botts is an international law firm whose lawyers practice throughout a network of offices around the globe. Based on our experience and knowledge of our clients' industries, we are recognized as a leading firm in the energy, technology and life sciences sectors. Since 1840, we have provided creative and effective legal solutions for our clients while demonstrating an unrelenting commitment to excellence.