Online Platforms Sidestep Claims Over User Content Decisions and Social App Functions

Despite continued scrutiny over the legal immunity online providers enjoy under Section 230 of the Communications Decency Act (CDA), online platforms continue to successfully invoke its protections. This is illustrated by three recent decisions in which courts dismissed claims seeking to impose liability on providers for hosting or restricting access to user content and for providing a much-discussed social media app filter.

In one case, a California district court dismissed a negligence claim against online real estate database Zillow over a fraudulent posting, holding that any allegation of a duty to monitor new users and prevent false listing information inherently derives from Zillow's status as a publisher and is therefore barred by the CDA. (924 Bel Air Road LLC v. Zillow Group Inc., No. 19-01368 (C.D. Cal. Feb. 18, 2020)). In the second, the Ninth Circuit, in an important ruling, affirmed the dismissal of claims against YouTube for violations of the First Amendment and the Lanham Act over its decision to restrict access to the plaintiff's uploaded videos. The Ninth Circuit found that, despite YouTube's ubiquity and its role as a public-facing platform, it is a private forum not subject to judicial scrutiny under the First Amendment, and that its statements concerning its content moderation policies could not form a basis for false advertising liability. (Prager Univ. v. Google LLC, No. 18-15712 (9th Cir. Feb. 26, 2020)). And in a third case, the operator of the messaging app Snapchat was granted CDA immunity in a wrongful death suit brought by the survivors of individuals killed in a high-speed automobile crash; one of the boys in the car had sent a snap using the app's Speed Filter, which captured the speed of the car at 123 MPH, minutes before the fatal accident. (Lemmon v. Snap, Inc., No. 19-4504 (C.D. Cal. Feb. 25, 2020)).

The Trio of CDA Decisions

In the Zillow case, a property owner, 924 Bel Air Road, LLC ("Bel Air"), took issue with false listing and sales data on a Residence Page about one of its luxury properties and claimed that the posting harmed the property's market value. Apparently, an unknown user had posted information about the Bel Air property, and following an investigation, Zillow eventually took down the false listing, blocked the user and shared the user's IP address with Bel Air. Through this communication with Zillow, the plaintiff learned that Zillow's internal monitoring system purportedly does not involve manually verifying the identity of each user who claims a Residence Page. Bel Air brought a negligence claim, alleging that Zillow's "inadequate monitoring system" allowed the unknown user to publish false information that harmed the "elite status" and potential value of the luxury property. In dismissing the claims on CDA grounds, the court found that Zillow's activities in reviewing user content were essentially decisions about whether to include or exclude third-party material, or "publisher" duties, that fell under Section 230:

"Ultimately, Bel Air's allegations boil down to a charge that Zillow must prevent users from falsely claiming a Residence Page or posting false content. Yet, reviewing each user's activity and postings to ensure their accuracy is precisely the kind of activity for which Congress intended section 230 to provide immunity."

In the Prager decision, Prager University ("Prager"), a conservative organization, objected when YouTube restricted access to a number of its uploaded videos (i.e., placed certain videos it deemed age-inappropriate in Restricted Mode) and demonetized others (i.e., prevented third parties from serving ads alongside those videos). Google countered that, under YouTube's terms, it reserves the right to remove or restrict content that violates the terms and to take down objectionable videos, such as those it deems age-inappropriate. In bringing claims seemingly crafted to bypass the CDA, Prager alleged, among other things, that Google censored speech in violation of the First Amendment. In affirming dismissal of the claims, the appeals court rejected Prager's argument and held that even though YouTube may be "a paradigmatic public square on the Internet," it is "not transformed" into a state actor subject to First Amendment constraints solely by providing a forum for speech. While prior courts have uniformly ruled that online platforms hosting user-generated content are not state actors or public forums, this decision from the Ninth Circuit is important in that it forecloses a CDA workaround that sought to portray YouTube's publishing decisions as unlawful censorship:

"Because the state action doctrine precludes constitutional scrutiny of YouTube's content moderation pursuant to its Terms of Service and Community Guidelines, we affirm the district court's dismissal of PragerU's First Amendment claim."

The Snap case involved a tragic car crash that killed three boys, one of whom had posted a snap using the app's Speed Filter to document the car's actual speed of 123 MPH at the moment the photo was taken, minutes before the accident. Snapchat is a mobile app that allows users to take ephemeral photos and videos, known as "snaps," and share them with friends. A Snapchat filter is essentially an overlay that can be superimposed on a photo or video taken in the app, and might include a geotag, the time, something fanciful (e.g., wild sunglasses), or, in this instance, the real-life speed of the user. The plaintiffs, in bringing a wrongful death suit, alleged that Snapchat encouraged dangerous speeding because it knew or should have known that many teenage and young adult users of the app wanted to use Snapchat to capture a mobile photo or video showing them hitting over 100 MPH and then share that snap with their friends. Seeking to avoid CDA immunity, the plaintiffs argued that they were not seeking to hold Snapchat liable for third-party content, but rather for its own content (i.e., the Speed Filter). Snapchat countered that the Speed Filter was a "content-neutral tool" that facilitates communication between users.

In dismissing the claims, the court concluded that the Speed Filter is a neutral tool and that CDA immunity applies where the online provider merely provides a framework that could be utilized for proper or improper purposes by the user. The court noted that Snapchat does not require users to send a snap at a high speed and that the app displays warnings and otherwise discourages users from capturing snaps while driving at high speeds. In rejecting the plaintiffs' argument that their claims were not based on user content, the court stated that "the content itself, the 100-MPH-Snap (or other high-speed Snaps) is at the crux of Plaintiffs' claims" and that the plaintiffs were seeking to hold Snapchat responsible for failing to regulate what users post through the Speed Filter, conduct that is protected by CDA immunity.

"Plaintiffs do not allege that Defendant asked its users to capture a Snap with a high speed or even suggested that they should do so. Instead, Plaintiffs' allegations appear to amount to 'enhancement by implication or development by inference' – that the Speed Filter impliedly suggested to users that they should Snap a 100-MPH-Snap. However, in such a close scenario, the Ninth Circuit has held that section 230 immunity applies."

It should be noted that the instant decision diverges from the outcome of a similar case involving another serious high-speed auto accident allegedly prompted by use of the Speed Filter (though in that case no snaps were posted using the Speed Filter by the driver prior to the accident). In that decision, a Georgia appellate court ruled that the CDA does not shield Snapchat from the injury claims because the plaintiffs were seeking to hold Snapchat liable for its own conduct, principally the creation of the Speed Filter and its failure to warn users that the filter could encourage unsafe driving, and because no third-party user content was published at the time of the accident. (Maynard v. Snapchat, Inc., 346 Ga. App. 131, 816 S.E.2d 77 (2018)).

Final Considerations

These recent decisions are victories for online providers that must routinely navigate the legal issues associated with both fraudulent and truthful user content. Such disputes often involve unfortunate and painful facts and circumstances (e.g., Dyroff, where a social networking site received CDA immunity for messaging functions that connected a user with a drug dealer, culminating in a fatal overdose, and Seaver, where a court found that the CDA shielded the organization responsible for maintaining the Tor Browser from various claims stemming from an incident in which a minor died after taking illegal narcotics purchased from a site on the "dark web"). These cases evoke the difficult questions that arise under the broad scope of the CDA, whose principal goal is "to promote the continued development of the internet." Indeed, as the Ninth Circuit previously stated in its Roommates.com decision: "Websites are complicated enterprises, and there will always be close cases where a clever lawyer could argue that something the website operator did encouraged the illegality." This push and pull of legal and ethical considerations remains an ongoing debate, particularly as members of Congress and others point to judicial outcomes like these as reasons why the CDA should be scaled back. We will be closely watching this debate, as it appears to be far from over.
