What is Section 230 of the Communications Decency Act?


The Communications Decency Act (“CDA”) was enacted as part of Title V of the Telecommunications Act of 1996 to regulate the internet and online communications. The primary purpose of the CDA was to protect children from viewing “obscene or indecent” material published on the internet. The CDA criminalized the knowing transmission of obscene messages to minors and the transmission of material that depicted or described “sexual or excretory activities or organs.”

In 1997, these anti-indecency provisions of the CDA were held unconstitutional by the U.S. Supreme Court in Reno v. American Civil Liberties Union. The Court found that the provisions infringed upon First Amendment free speech protections and ruled that, although the government has an interest in protecting children from harmful materials, the law was an “unnecessarily broad suppression of speech addressed to adults.”
After the Supreme Court’s ruling in Reno, the CDA’s most important feature became Section 230, whose core provision is often called the “twenty-six words that created the internet.”


Section 230 makes interactive computer service providers, like Google and Facebook, immune from lawsuits based on claims related to content published by third parties using their services. For example, if someone posts a fake, defamatory Google review about your business, you generally cannot sue Google for defamation, because Google is immune under Section 230 of the CDA.


Section 230 Overview

Section 230 of the Communications Decency Act is among the most important pieces of internet legislation in the United States. The goal of Section 230 is to enable internet and tech companies to develop without fear of legal liability for content published to their websites by internet users.

If interactive computer services were held liable for content published by their users, it could either:

  • Drive companies into bankruptcy from constant legal battles, or
  • Incentivize companies to heavily regulate and censor content published by users.

Both consequences would discourage free online speech and innovation on the internet. Section 230 is often credited for enabling social media companies, search engines, and cloud storage systems to be as successful and integrated into society as they are today.

Section 230 of the Communications Decency Act prevents courts from viewing interactive computer service providers as “publishers” in cases involving content published by third-party information content providers (“ICP”s). The text of Section 230(c)(1) states that:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230(c)(1) classifies interactive computer service providers and users (“ISPs”) as intermediaries and grants them broad immunity from liability for content published on their services by third parties. Moreover, Section 230 preserves an ISP’s power to oversee content published on its platform without treating it as the publisher of the objectionable content and making it liable for a third party’s unlawful statements.

Without the broad immunity provided by Section 230, ISPs, including Google and social media sites such as Facebook, Twitter, or YouTube, could be liable for every message or post made using their services. This would make it very costly for these websites to allow users to publish content on their services. It would also cause many of these services to shut down or lead to most websites heavily restricting user internet speech.

Courts use a three-prong test to determine if the CDA provides an ISP defendant immunity.

  • First, the defendant must be a provider or user of an interactive computer service.
  • Second, the plaintiff’s cause of action must view the defendant as the “publisher” or “speaker” of a harmful statement.
  • Third, the harmful information must have been provided by an information content provider other than the defendant.

If all three prongs are met, then the defendant is protected by the CDA and immune from legal liability.


Difference Between Interactive Computer Service Provider and an Information Content Provider

Under the CDA, an interactive computer service is any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server. Websites are the most common example of an interactive computer service provider (commonly “internet service provider” or “ISP”). Both individuals and corporations can qualify under this definition as well.

An information content provider (“ICP”) is an entity or individual responsible, in whole or part, for the creation or development of content. Anyone who creates content on the internet can be an ICP. For instance, when someone makes a post using their Facebook account, the person making the post is an ICP. Facebook is the ISP.

ISPs can also be an ICP if they create or develop their own content, or exercise editorial control over the content.

One of the main questions driving whether an ISP can claim legal protection under Section 230 is whether the ISP acted in the capacity of an ICP in creating the content in question, rather than a passive distributor of content.

 

Why Does It Matter If an ISP Is Considered a “Publisher”?

Before the CDA was enacted, ISPs could be held liable for defamatory material published on their services if they were considered a “publisher” of content rather than merely a “distributor.”

Publishers of content are generally liable for defamatory republications made by a third party, but distributors are only liable if they know or have reason to know that a statement is defamatory.

Pre-Section 230 Case Examples

For example, in Cubby, Inc. v. CompuServe Inc., 776 F. Supp. 135 (S.D.N.Y. 1991), a federal court held that an ISP was not liable for defamation because it was merely a distributor of content rather than a publisher. A publisher has editorial control over content and reason to know of unlawful statements, whereas a distributor would not necessarily have reason to know of the content or its nature.

Likewise, in Smith v. California, 361 U.S. 147, 152-153 (1959), the Supreme Court considered a bookseller a distributor because a bookseller may not be aware of the contents of every book in his shop. To impose liability on every bookseller as though they published the content would lead sellers to strictly inspect all materials sold, which is impractical, would restrict the public’s access to materials, and would have a chilling effect on speech. A publisher plays an active role in the publication, whereas a distributor plays a more passive role.

The problem the internet posed was drawing the line between an ISP publishing its own material and merely distributing material published by third parties.

Publishing activity generally goes beyond merely allowing a post to be viewable on a website, and requires that an ISP affirmatively acted or otherwise exercised editorial control over the content published on a website. The more editorial control an ISP exercises over its website, the more it will open itself to liability for what was published on their services. See Stratton Oakmont, Inc. v. Prodigy Services Co., 1995 N.Y. Misc. LEXIS 712.

Stratton Oakmont, Inc. v. Prodigy Services Co.

In Stratton Oakmont, an anonymous user posted to Prodigy Services’ Money Talk bulletin board alleging that Stratton Oakmont, an “over the counter” securities brokerage founded by the infamous ‘Wolf of Wall Street’ Jordan Belfort, committed fraudulent and criminal acts in association with an initial public offering (“IPO”). Stratton Oakmont then sued Prodigy Services and the anonymous poster for defamation.

The plaintiffs argued that Prodigy Services was a publisher of the defamatory content and should be held liable for defamation. Prodigy argued that the case should be dismissed on the grounds that they could not be held legally liable for content published by third-parties. The Court ultimately found that Prodigy was in fact a publisher because they exercised editorial control over the messages in their bulletin board by:

  • Posting content guidelines for users;
  • Enforcing content guidelines with ‘Board Leaders’; and
  • Removing objectionable and offensive language via screening software.

The Court reasoned, “PRODIGY’s conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice.”

The decision in Stratton Oakmont encouraged ISPs to stop controlling or reviewing what content was published on their services because it opened them up to greater liability. Section 230 of the CDA was specifically enacted to overturn the court’s decision in Stratton Oakmont and to encourage ISPs to exercise oversight in-house, rather than subjecting them to liability for every statement published by a third-party using their services.

Legal Defenses Available to ISPs Under Section 230

ISPs and tech companies can claim two types of immunity under Section 230 as defenses in civil actions. First, defendants can argue they qualify for immunity against claims treating an ISP as a publisher of information provided by third-parties. Second, in claims regarding a defendant’s decision to remove or filter content, the defendant can argue they made the content decision voluntarily and in good faith.

Section 230(c)(1) allows an ISP to claim immunity as a defense to a claim if:

  • the defendant is a provider or user of an interactive computer service;
  • the claim relates to information provided by a third-party information content provider; and
  • the claim treats the defendant as the publisher or speaker of the information.

Claims that treat a defendant as a publisher or speaker of a statement include defamation, false advertising, unfair competition, intentional infliction of emotional distress, and invasion of privacy, among others. Because this immunity applies to so many civil claims, it is often the most effective defense for an ISP defendant.

Section 230(c)(2) protects defendants from liability for making voluntary decisions in good faith to restrict access to or availability of material that the ISP considers obscene, unlawful, or otherwise objectionable. Simply put, Section 230(c)(2) provides immunity for ISPs to make decisions about removing or filtering content on their services.

To defend against a claim based on a content removal decision, an ISP needs to show they made the content-related decision in good faith. Good faith means that the ISP did not make the decision with the intent to defraud or otherwise facilitate illegal activity. Defendants can use this defense even if they do not qualify for immunity under Section 230(c)(1) because they developed the content at issue. Defendants may rely on this defense so long as it was a decision to restrict access to content because the ISP considered it obscene or otherwise objectionable.

Who Is Liable for Content If Not the ISP?

While Section 230 protects an ISP from claims treating them as publishers of content, it does not bar claims against the original author of the content. If a defamatory statement was posted on Facebook, for example, the Facebook user who posted it is liable and can be sued. But bringing a claim against the original author of unlawful content can be difficult, particularly if the author posted content anonymously.

If a statement was posted anonymously to a platform, plaintiffs can file a John Doe lawsuit. John Doe lawsuits enable plaintiffs to conduct discovery to attempt to identify anonymous posters. The plaintiff can ask a court for permission to subpoena the ISP that hosts the unlawful content to obtain the author’s email, IP addresses, or other identifying information.


Because the right to speak anonymously has been traditionally protected under the First Amendment, courts are still struggling to develop tests and procedures that adequately protect:

  • An individual’s right to publish anonymous opinions, and
  • A plaintiff’s right to face their accusers and present a full case against them for unlawful conduct.

There is no single (or even majority) test used by courts to evaluate these claims, which can make bringing a John Doe lawsuit extremely difficult.

Arguments For and Against Section 230 Immunity

Section 230 is considered controversial because of the freedom it allows ISPs at the expense of plaintiffs who genuinely suffer harm as a result of ISP content decisions or their lack of oversight of posts by third-parties.

Supporters of Section 230 argue that it protects free speech rights on the internet and minimizes the cost and burden of having to engage in lengthy litigation for millions of unmeritorious claims made against ISPs. They also argue that Section 230 encourages ISPs to self-regulate content without being overly pressured by the constant threat of lawsuits based on any potential post made using their services.

Section 230 immunity provides an all-encompassing defense to a myriad of potential civil claims and, thus, only requires the ISP to prove a single defense rather than having to combat each claim individually. This limits costs and allows courts to efficiently dispose of or sustain those claims. Without Section 230 protection, supporters contend that ISPs would be encouraged to censor user speech out of fear that they could be opening themselves up to lawsuits. Censorship of user speech could ultimately offend the core principles of the First Amendment.

Opponents of Section 230 argue that it puts plaintiffs with meritorious claims at a disadvantage and with limited or no other means of recovering for injuries suffered as a result of online content. They argue that courts have interpreted CDA immunity too broadly in protecting ISPs, especially when in traditional circumstances, outside of cyberspace, they would be liable for facilitating illegal activity.

For example, a physical magazine or newspaper that published nude photos or clearly defamatory gossip submitted by readers could be liable for that content, but under Section 230, websites that knowingly host the same content cannot be found liable. The original purpose of the CDA and Section 230 was to immunize ISPs who wanted to restrict access to objectionable and offensive material, not to provide a safe harbor for ISPs to host unlawful content published by third parties.

Opponents also argue that Section 230 immunity does not necessarily protect free speech rights but rather facilitates harassment and other unlawful forms of speech that would not otherwise be protected.

Hate speech, for instance, has proliferated on the internet and has been linked to mass shootings committed in the U.S. in recent years. But because of broad First Amendment protections for political opinions, hate speech is not generally illegal unless it directly incites violence. Tech companies and ISPs like 8chan are protected under Section 230 from liability for hosting hate speech. So long as the content is not illegal, they have little incentive to regulate hate speech published on their websites, even speech that could lead to mass shootings or other hate crimes.

Section 230 immunity is particularly controversial with regard to sex trafficking cases and other online sexual victimization cases. The internet quickly became an easy way to commit, facilitate, and promote sex trafficking on a national and international scale, sometimes through the use of mainstream websites such as Facebook and Craigslist.

While many mainstream websites did work to block content related to sexual victimization after the CDA was enacted, websites like Backpage.com could not be held liable for enabling sex trafficking and victimization on their websites because of the broad scope of Section 230 immunity. A case against Facebook in the Texas state court system is currently being litigated to determine the scope of Section 230 immunity after the FOSTA exception was enacted in 2018.

For further information about Online Extortion and Sextortion, see How to Deal With Sextortion on the Internet.

Social Media and Section 230

Under Section 230, social media and tech platforms like Facebook and Twitter qualify as ISPs and, thus, are protected from being sued for what users post on their networks. Since the 2016 U.S. presidential election, social media platforms have come under scrutiny for failing to adequately regulate polarizing political propaganda and “fake news” published on their websites by Russian agents. After the election, many politicians came to believe that social media companies were regulating and censoring conservative political opinions on their websites rather than protecting the free speech rights of users.

On May 28, 2020, President Donald Trump issued an executive order after Twitter marked some of his tweets about mail-in voter fraud as potentially misleading. The “Executive Order on Preventing Online Censorship” was aimed at Section 230 immunity being applied to social media websites. Trump’s executive order stated that social media sites and companies “censor opinions with which they disagree” and that, by doing so, they “cease functioning as passive bulletin boards, and ought to be viewed and treated as content creators.”

Additionally, the order accused internet platforms of “invoking inconsistent, irrational, and groundless justifications to censor or otherwise restrict Americans’ speech” and “profiting from and promoting the aggression and disinformation spread by foreign governments like China.”

This Trump Administration order sought to provide policy recommendations for the enforcement of CDA protections by instructing executive agencies to consider restricting and clarifying Section 230 immunity to “determine the circumstances under which a provider of an interactive computer service that restricts access to content in a manner not specifically protected by [Section 230(c)(2)] may also not be able to claim liability protection under [Section 230(c)(1)]” for actions taken based on its own editorial decisions.

The Trump order also suggested that the “good faith” requirement of Section 230(c)(2)(A) be defined to exclude certain ISP conduct. For example, the order proposed that editorial decisions to remove or restrict access to content should not be “deceptive, pretextual, or inconsistent with a provider’s terms of service…taken after failing to provide adequate notice, reasoned explanation, or a meaningful opportunity to be heard,” and it invited “any other proposed regulations that the [National Telecommunications and Information Administration] concludes may be appropriate to advance the policy” previously described in the order.

The remainder of the order instructed executive agencies to draft proposed legislation and review and enforce state statutes to prohibit ISPs from engaging in unfair, deceptive, or discriminatory practices.

Finally, the order sought to revoke almost all CDA protections given to social media platforms if they were determined to make “biased” editorial decisions about their content.

Critics of the order argued that it lacked the authority to accomplish its objectives. While the President does have authority to instruct the executive agencies that enforce federal statutes, he cannot, via executive order, negate the text and legislative intent behind the CDA. Nor can he negate the judicial precedent that interprets and applies Section 230 immunity. The order did not result in a congressional inquiry into Section 230 protections that could have led to reconsideration or amendment of the statute.

This executive order was ultimately revoked by President Joe Biden in 2021.

Exceptions to Section 230 Immunity

The CDA does not provide ISPs immunity in all circumstances. Courts have found exceptions to Section 230 immunity where an ISP is directly involved in unlawful activity. If an ISP edits a published statement and materially alters its meaning, and the altered statement is defamatory, then the ISP is not immune from a claim based on that statement. If the ISP creates or develops illegal content, it cannot claim CDA immunity under Section 230. In certain circumstances, an ISP’s promise to remove content and subsequent failure to do so may waive its immunity under Section 230(c)(2).

As a direct response to cases upholding Section 230 immunity for websites like Backpage.com that aided and abetted online sexual victimization, Congress passed the “Allow States and Victims to Fight Online Sex Trafficking Act” (FOSTA), which was signed into law in 2018.

 

Fight Online Sex Trafficking Act

FOSTA makes it illegal for websites to knowingly assist, facilitate, or support sex trafficking and allows sex trafficking victims to bring civil claims against them, irrespective of Section 230 immunity. Some argue that FOSTA goes too far in removing Section 230 immunity because it does not distinguish between consensual sex work conducted online and nonconsensual sex trafficking.

Critics also argue that it places a large burden on internet companies to regulate user content to prevent any suspected sex trafficking activity. FOSTA was a monumental blow to the immunity bestowed on ISPs under the CDA for content posted to their platforms. As of June 2020, it had not yet been widely tested in the courts.

Criminal & Intellectual Property Infringement Claims

Additionally, Section 230 immunity only applies to cases involving civil liability. It does not protect ISPs against federal criminal prosecutions, and it does not limit laws pertaining to intellectual property, such as copyright infringement claims. ISPs can, therefore, be prosecuted for criminal forms of online harassment.

But Section 230 does generally bar civil claims for relief based on those same actions. So while a prosecutor can criminally indict and convict an ISP for illegal behavior, the victims of harassment may not have a civil avenue to recover damages resulting from the ISP’s criminal conduct.

Product Liability

CDA protections in product liability cases may also be limited. In 2019, the Third Circuit held in Oberdorf v. Amazon.com that negligence and strict liability claims based on defective products sold on Amazon.com did not seek to treat Amazon.com as a publisher or speaker of information and, therefore, the CDA did not bar the plaintiff’s claims.

The plaintiff had bought a defective dog collar through Amazon produced by a third-party vendor and sought to hold Amazon liable for their role as a “seller” or actor in the sales process. Because Amazon plays a large role in sales transactions, such as by selling or distributing products, beyond merely advertising products on its website or other editorial functions, the plaintiff could pursue a claim of product liability. But Amazon was immunized by the CDA to the extent that the claims alleged Amazon failed to provide or edit warnings about the use of products sold on its site.

 

Possible Damages

If an ISP falls into one of the exceptions listed above, or if the ICP of unlawful content is identifiable, then a plaintiff may recover damages from a civil claim against them. The type and amount of damages vary depending on the type of claim and factual circumstances of the case. Common types of collectible damages include:

  • Special damages;
  • General damages;
  • Punitive damages.

Special Damages

Special damages are quantifiable monetary losses suffered from the date of the defendant’s unlawful act up to a certain time, typically determined at trial.

Special damages can include:

  • Lost wages,
  • Medical bills,
  • Damage to property, or
  • Other quantifiable losses suffered by the plaintiff.

General Damages

A form of compensatory damages, general damages compensate for harms that cannot be quantified in monetary terms, such as pain and suffering resulting from the defendant’s conduct.

For example, if a plaintiff suffers emotional or mental anguish as a result of failure to remove a post, not quantifiable by other means, they may be entitled to recover general damages.

Punitive Damages

Punitive damages are awarded specifically to punish a defendant for unlawful conduct. In most jurisdictions, punitive damages may be recovered in negligence actions, but generally only if the plaintiff shows that the defendant acted recklessly rather than with ordinary negligence.

In defamation suits, for example, recklessness means that the defendant published the statement knowing it was false or with reckless disregard for whether it was false.

Significant Court Cases Ruling on Section 230

Zeran v. America Online, Inc.

Federal Circuit Courts initially interpreted CDA immunity broadly, allowing many different types of entities to qualify as an interactive computer service provider or user. Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997), was one of the first cases to interpret Section 230 immunity.

In Zeran, an anonymous AOL user posted false advertisements on an AOL message board. The advertisements included the sale of t-shirts with insensitive language about the Oklahoma City bombing and prompted people to call the plaintiff at his home phone number to buy the shirts. As a result, the plaintiff received thousands of calls, including death threats. While AOL removed the original posting, they did not post a retraction telling users the advertisement was fake. AOL argued they were immune from the plaintiff’s defamation claims under Section 230.

 

The 4th Circuit agreed with tech company AOL and held that Congress intended the immunity under Section 230 to be applied broadly to ISPs to protect free speech interests on the internet. They reasoned that when faced with potential liability for each message posted on their services, interactive computer service providers might choose to severely restrict the number and type of messages posted.

In more recent legal cases, some courts have scaled back on applying CDA immunity in certain circumstances.

Fair Housing Councils v. Roommates.com

In Fair Housing Councils v. Roommates.com, 521 F.3d 1157 (9th Cir. 2008), the 9th Circuit considered whether Roommates.com could be liable for violating the Fair Housing Act by developing questionnaires designed to elicit discriminatory information from users. For example, Roommates.com questionnaires required users looking for a roommate to indicate whether they wanted to exclude users with children, or users of certain sexual orientations or races, from their search results.

The court held that, while Roommates.com was immune regarding open-ended questions on its questionnaire, it was not immune regarding the discriminatory questions on the questionnaire. The site categorized, channeled, and limited the distribution of user profiles. This provided an additional layer of information and made the site an ICP with respect to those questions. The decision in Roommates was considered to be a huge shift in court treatment of ISPs and led to many cases attempting to restrict Section 230 immunity.

 

Nemet Chevrolet, Ltd. v. ConsumerAffairs.com

The 4th Circuit in Nemet Chevrolet, Ltd. v. Consumeraffairs.com, 591 F.3d 250 (4th Cir. 2009), distinguished between when a website acts as an ICP and when it does not. Plaintiffs sued Consumeraffairs.com for allowing users to post defamatory comments about their dealership online.

The court found that, unlike in Roommates.com, ConsumerAffairs.com was not sufficiently contributing to the unlawful content to be considered an information content provider. The test seems to be how involved the ISP is in creating the unlawful content. The more passive an ISP is in filtering or developing content, the more likely they will qualify for Section 230 immunity. The more active and involved a provider is in its service, the more likely they will be considered an information content provider and not qualify for CDA immunity.

 

Jones v. Dirty World Entertainment Recordings LLC

Similarly, in Jones v. Dirty World Entertainment Recordings LLC, 755 F.3d 398 (6th Cir. 2014), the Sixth Circuit adopted the “material contribution” test, under which an ISP is responsible for the development of third-party content under Section 230(f)(3) if it contributes materially to the alleged illegality of the conduct, such that it is responsible for what makes the displayed content unlawful.

According to the Sixth Circuit’s interpretation, the ISP cannot have materially contributed to defamatory content simply by selecting posts for publication or refusing to remove posts because that falls within its traditional editorial functions under Section 230(c)(1).

In Jones, an anonymous poster uploaded defamatory posts about the plaintiff to TheDirty.com, whose web host selects posts to publish and adds his own editorial comments to them. The Sixth Circuit found that TheDirty.com was not a creator or developer of the defamatory posts under the material contribution test. It reasoned that, even though the web host selected the posts about the plaintiff for publication, the upload process did not specifically require or encourage users to submit defamatory, unlawful, or harmful content. The court also emphasized that the host’s personal comment on the posts was not itself defamatory.

Barnes v. Yahoo!

Section 230 immunity does not bar all potential claims a plaintiff may have, and may not apply to claims arising from contractual obligations. In Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009), a woman sued Yahoo!, Inc. after her ex-boyfriend posted nude photos of her and other personal information on a site operated by Yahoo!, Inc. She told Yahoo! about the post and Yahoo! promised to remove the post. Yahoo! did not remove the post until she brought negligent undertaking and promissory estoppel claims against them.

The court held that Yahoo! was immune from civil liability for the negligent undertaking claim because that claim treated Yahoo! as a publisher. But Yahoo! could not claim immunity for the promissory estoppel claim.

Promissory estoppel is a cause of action based on a breach of a promise that forms an implied contract, even if there wasn’t a written agreement. Because the promissory estoppel claim treated Yahoo! as a party to a contract rather than as a publisher, Yahoo! was not immune from the claim under Section 230.

More Recent Section 230 Cases

The law surrounding Section 230 is always evolving and the subject of much litigation. There are several notable cases currently being litigated that have the potential to change how courts interpret Section 230.

 

In re Facebook, et al.

One such case, In re Facebook, et al., was filed in Texas state court and is based on allegations that Facebook provided a platform for sexual predators to exploit and extort children into sex trafficking. The plaintiffs were all minors, one as young as 13, who became connected with sexual predators on Facebook and Instagram.

Social media company and tech giant Facebook moved to dismiss the claims under Section 230 immunity for state civil claims, but the plaintiffs argue that the new exception provided by FOSTA allows their civil claims.

 

On April 28, 2020, the Fourteenth District Court of Appeals denied Facebook’s request for relief and dismissal of the claims. In 2021, Facebook appealed the ruling to the Texas Supreme Court. That court held that the Communications Decency Act did bar the victims’ claims of negligence, gross negligence, negligent undertaking, and products liability under state law, but did not bar the victims’ claims under the Texas human trafficking statute. In re Facebook, Inc., 625 S.W.3d 80, 101 (Tex. 2021).

Malwarebytes, Inc. v. Enigma Software Group USA, LLC

Malwarebytes, Inc. v. Enigma Software Group USA, LLC, a Ninth Circuit case, addressed the breadth of Section 230(c)(2) immunity. The question was whether Section 230 immunizes an ISP's decision to filter or block content when that decision is driven by anti-competitive motives, and thus whether a software or internet company can claim CDA immunity in a false advertising suit. Enigma alleged that Malwarebytes configured its software to block users from accessing Enigma's security software in order to divert Enigma's potential customers.


The Ninth Circuit held that a provider cannot claim Section 230(c)(2) immunity when it restricts content because the content is the product of a business competitor, rather than because the provider genuinely finds it objectionable. This decision places a limit on Section 230 immunity and allows ISPs to face liability on unfair competition claims.

Malwarebytes petitioned the U.S. Supreme Court for a writ of certiorari on May 11, 2020; the petition was subsequently denied.

Hassell v. Bird

Section 230 immunizes ISPs not only from lawsuits naming them as defendants, but also from court orders. Hassell v. Bird, 5 Cal. 5th 522 (2018), involved a defamatory review published on Yelp.com by an ICP. Unlike in traditional cases involving Section 230 immunity, however, Yelp was not named as a defendant or alleged to have acted as the publisher of the defamatory review.

Rather, the plaintiff sought and received a court order compelling Yelp to remove certain consumer reviews published on its website. Yelp challenged the judgment, arguing that it was immunized from complying under Section 230.


The Supreme Court of California agreed with Yelp and reversed the judgment. It held that Section 230 protected Yelp from having to remove the reviews because the order treated Yelp as the publisher of information under Section 230(c)(1). Even though Yelp was not named as a defendant, the removal order sought to override Yelp's decision to publish the reviews. And because an ISP's relevant conduct in a defamation case goes "no further than the mere act of publication," Section 230 prohibited the court order compelling Yelp to make an editorial decision.

Herrick v. Grindr

An ISP may also be immune under the CDA even if it provides app features that can be used to engage in unlawful behavior. Herrick v. Grindr, 765 F. App'x 586 (2d Cir. 2019), involved a plaintiff whose ex-boyfriend created Grindr profiles to impersonate and harass him. The plaintiff argued that Grindr was negligent, failed to warn users, and engaged in deceptive business practices because it did not build safety features into its app to prevent that behavior.

The Second Circuit held that the plaintiff's claims were barred under Section 230 because they sought to treat Grindr as a publisher and were based on its editorial decisions. To the extent the claims were based on app features that were used to harass the plaintiff, they were barred as well.

Under Section 230, an ISP that has not "assisted in the development of what made the content unlawful" cannot be held responsible for providing the means for harassment to occur, and "cannot be held liable for providing neutral assistance in the form of tools and functionality available equally to bad actors and the app's intended users."

Work With Experienced Internet Attorneys Today

The law surrounding the CDA is complex. If you believe you have been harmed by defamatory or other unlawful online content, contact the experienced internet defamation attorneys at Minc Law to evaluate your case. Call (216) 373-7706 today to learn more.


★★★★★

“I was so impressed with the service Minc Law provided to me in my matter. Darcy Buxton and Andrew Stebbins were responsive, knowledgeable, reassuring, and ready to tackle the issue I was facing. They answered all my questions quickly and thoroughly, and helped me to trust that my matter was in the right hands. I would absolutely recommend Minc Law to anyone facing internet defamation or content takedown challenges in the future–they’re the real deal.”

H.W., June 3, 2021
