11 January 2024 | 11 minute read

Navigating the digital dilemma: Court addresses social media liability in adolescent addiction litigation

In November 2023, Judge Yvonne Gonzalez Rogers of the US District Court for the Northern District of California issued an order (Order) addressing several of the arguments raised in the motions to dismiss[1] filed in the ongoing multidistrict litigation In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (MDL).[2]

The MDL encompasses hundreds of cases filed on behalf of children, adolescents, school districts, and state attorneys general across the US against technology giants that operate some of the most popular social media platforms in the world, such as Meta, Google, ByteDance, and Snap. Among other matters, the litigation raises several questions regarding the liability of social media companies for alleged harms caused by adolescent social media addiction.

In the Order, the court dismissed several of the plaintiffs’ claims based on Section 230 of the Communications Decency Act (CDA) and the First Amendment’s speech protections, but allowed other claims related to the defendants’ platforms to proceed based on the plaintiffs’ allegations of defective design and failure to warn.[3] As to the latter, the court also rejected in part the defendants’ “all or nothing” arguments that their platforms are not “products” subject to product liability law and that they owe no duty to the plaintiffs.

Background

Plaintiffs’ claims

The Order addressed the defendants’ motions to dismiss the plaintiffs’ five priority claims, brought under the laws of various states: (1) strict liability – design defect (New York), (2) strict liability – failure to warn (New York), (3) negligence – design defect (Georgia), (4) negligence – failure to warn (Georgia), and (5) negligence per se (Oregon).

Many plaintiffs are adolescents (or their parents) who allege that they suffered physical, mental, and emotional injuries as a result of using the defendants’ platforms, which they claim are defectively designed and have caused harm. Further, these plaintiffs allege that the platforms failed to warn users of the risks associated with their use.

The plaintiffs identified several features of the defendants' platforms that they allege are defective, including but not limited to endless-content feeds, intermittent variable rewards, ephemeral (or disappearing) content, deployment of notifications to attract and re-attract users, algorithm-based prioritization of content, features that allow connection between child and adult users, and lack of age verification and/or parental controls. These features, alongside a failure to adequately warn users of the dangers they pose, form the basis of the plaintiffs’ product liability claims sounding in strict liability and ordinary negligence based upon alleged design defects and failure to warn.

In turn, the plaintiffs’ negligence per se cause of action is based on the defendants’ alleged violation of two federal statutes: the Children’s Online Privacy Protection Act (COPPA)[4] and the Protect Our Children Act (Protect Act).[5] The plaintiffs claimed that the defendants violated COPPA by failing to provide parents with adequate notice of their collection and use of children’s personal information and by failing to obtain parental consent before collecting such information. The plaintiffs also asserted that the defendants violated the Protect Act by failing to minimize the number of employees with access to visual depictions of the plaintiffs and by failing to report suspected violations of child pornography laws on their platforms.

Defendants’ motion to dismiss

The defendants moved to dismiss these claims on several grounds, including immunity under Section 230 of the CDA[6] and the First Amendment, the absence of any “products” to which product liability claims could attach, and lack of duty and causation. Ultimately, the court adopted a fine-grained approach to its analysis, rejecting both parties’ “all or nothing” approaches and evaluating each claim and defense separately against each platform feature at issue to determine whether it was subject to dismissal.

Summary of the Order

The court's Order addressed each of the defendants' arguments in turn and reached the following conclusions:

  • Section 230 immunity:

Section 230 of the CDA generally affords providers of interactive computer services immunity from liability under state or local law as publishers of third-party content generated by their users. The court applied the tests set out in Barnes v. Yahoo!, Inc.[7] and Fair Housing Council of San Fernando Valley v. Roommates.Com, LLC[8] to determine whether the defendants were immune from liability under Section 230(c)(1) of the CDA.

The court found that several of the plaintiffs’ design defect claims were barred by Section 230. The court reasoned that allegations targeting the defendants’ role as publishers of third-party content fell within Section 230’s immunity provisions. These included claims based on features such as endless content feeds, the distribution of ephemeral content, and the timing and clustering of third-party content.[9]

In contrast, the court found that certain claims involving the defendants’ own conduct, or their own creation or development of content, including the plaintiffs’ statutory negligence per se claims, were not protected by Section 230 where the features at issue did not involve a process related to “publishing.” Claims the court found were not barred by Section 230 immunity included those based on the defendants’ failure to offer robust parental controls and on the timing and clustering of notifications of the defendants’ own content, among others.[10]

  • First Amendment:

The defendants also argued that the First Amendment precluded liability on several grounds, including protections for speech they publish as well as their choices in how to disseminate that speech.

As with the Section 230 analysis, the court evaluated each claim in turn, finding that the First Amendment barred some, but not all, of the plaintiffs’ design defect claims. Among others, claims based on the defendants’ failure to provide effective parental controls and notifications, failure to provide screen time limitation options, and use of barriers to account deletion were not barred by the First Amendment. The court held that these features do not constitute speech or expression, or publication of the same.[11]

The court found that other claims were barred by the First Amendment, such as the plaintiffs’ design defect claims based on the defendants’ timing and clustering of notifications of the defendants’ own content. The court held that there was no dispute that the content of such notifications is speech protected by the First Amendment and that it could not conceive of any way of interpreting the plaintiffs’ defect claim that would not require the defendants to change how, when, and how much they publish that speech.[12]

  • Platforms, not “products”:

A central aspect of the defendants’ argument in their motion to dismiss was that their platforms are not “products” subject to product liability law. The court again took a nuanced approach, evaluating whether each functionality of the defendants’ platforms qualified as a “product.” In its analysis, the court drew upon various considerations, such as whether the functionality was analogous to personal property or was more akin to ideas, content, and free expression. The court found that the plaintiffs had adequately pled the existence of product components for many of their design defect claims across the various social media platforms, such as those related to the defendants’ failure to implement robust age verification processes, failure to implement effective parental controls, and creation of barriers to account deletion.[13]

  • No duty owed:

The court also addressed whether the plaintiffs adequately pled the duty element of their product-based negligence claims. First, the court determined that, because the plaintiffs adequately pled the existence of “products” in connection with their alleged design defect claims, the defendants owe users a duty to design those products in a reasonably safe manner and to warn of any risks they pose.[14] Next, the court determined that the defendants did not owe the plaintiffs a duty to protect them from third-party users of the defendants’ platforms, such as those who publish content on them.[15] Thus, the court allowed the plaintiffs’ negligence claims to proceed to the extent they are based on product defects and arise out of the defendants’ duty to design reasonably safe products and to warn users of known defects.[16]

  • Causation:

Finally, the court briefly examined and ultimately rejected the defendants’ argument that the plaintiffs failed to allege causation. First, the court found that the plaintiffs adequately alleged causation between the design features and the alleged harm, as the plaintiffs were able to show, through a description of the inner workings of the defendants’ platforms, that the defendants made design choices which caused the plaintiffs’ injuries.[17] Further, the court found that the plaintiffs adequately alleged harm caused by the defendants’ design and operation of their platforms, as the plaintiffs submitted academic studies empirically demonstrating causal connections.[18]

Key takeaways

Legal landscape shift

  • Redefining digital product liability: Traditional principles of product liability are being applied to digital platforms in novel ways. The court’s willingness to treat features of social media platforms as “products” with potential design defects continues to extend the realm of product liability into the digital space. This could set precedent for how digital services are legally categorized and regulated, increasing companies’ exposure to liability.
  • Impact on technology companies: Technology companies, especially those offering services with significant user engagement like social media, may need to reevaluate their legal strategies and product designs to mitigate risks associated with claims of harmful effects, particularly on vulnerable populations like children.

Design and ethical considerations

  • User-centric design ethics: The case highlights the need for a more ethical approach to user interface and experience design, focusing on the wellbeing of users, especially minors. Companies should consider the psychological impacts of their design choices, such as addictive features or manipulative content algorithms.
  • Corporate responsibility: There is an emerging narrative that technology companies should be more accountable for the societal impact of their products. This includes being responsible not just for the content on their platforms, but also for how their design choices influence user behavior and mental health.

Anticipated regulatory changes

  • Potential legislative actions: The outcomes of this litigation could spur legislative actions aimed at more stringent regulation of social media platforms, particularly regarding their duty to protect younger users from harm.
  • Revisiting legal protections: The arguments around Section 230 and First Amendment rights in this case indicate that these legal shields for social media companies might be revisited. Future rulings may limit these protections, especially in cases where platforms' design elements are implicated in harmful effects on users.
  • International influence: The US legal trends could influence global regulatory approaches toward digital platforms, potentially leading to more harmonized international standards regarding the operation and governance of social media.

DLA Piper is here to help

For further information or if you have any questions, please contact any of the authors.

[1] The defendants filed two separate consolidated dismissal motions in response to the plaintiffs’ Master Amended Complaint. The first Motion to Dismiss addressed whether the plaintiffs legally stated each of their priority claims. The second Motion to Dismiss addressed immunity and protections under Section 230 of the Communications Decency Act and the First Amendment. The Order addresses arguments from both Motions to Dismiss.
[2] In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 3047, Case No. 4:22-md-03047-YGR (N.D. Cal. Oct. 6, 2022).
[3] Order Granting in Part and Denying in Part Defendants’ Motions to Dismiss, ECF No. 430.
[4] 15 U.S.C. §§ 6501-6506.
[5] 18 U.S.C. §§ 2258A, 2258B.
[6] 47 U.S.C. § 230.
[7] 570 F.3d 1096 (9th Cir. 2009).
[8] 521 F.3d 1157 (9th Cir. 2008).
[9] Order 16, ECF No. 430.
[10] Id. at 14-16, 20.
[11] Id. at 21-22.
[12] Id. at 22.
[13] Id. at 36.
[14] Id. at 42.
[15] Id. at 47.
[16] Id. at 48.
[17] Id. at 49.
[18] Id.