Social Media Companies Face Landmark Child Safety Litigation, Drawing Historical Comparisons to Big Tobacco

12:27 PM, Feb. 14, 2026

The digital age has collided head-on with an old legal question: when does a product become so harmful or addictive that its makers must be held liable?

Jury selection began on January 27, 2026, in California Superior Court, Los Angeles County, in a case that could redefine the liability landscape for social media companies. The lawsuit was filed in 2023 by a now-20-year-old California woman, identified in court filings by the initials K.G.M., who alleges that as a minor she became addicted to social media platforms and, as a result, suffered anxiety, depression, and body-image issues. (The lawsuit originally named Meta, ByteDance, Snap, and Google as defendants.) The case is one of several scheduled bellwether trials. What makes it unique is not simply the plaintiff’s individual story, but the legal theory behind it: that social media platforms are themselves inherently defective products, designed in ways that foreseeably harm children.

Testimony from teenagers and young adults is expected, which may carry significant weight with jurors. The trial is also expected to feature testimony from CEOs Mark Zuckerberg (Meta) and Neal Mohan (YouTube, a subsidiary of Google), both of whom are expected to be confronted with internal company communications acknowledging risks to young users’ mental health. Those revelations echo earlier congressional hearings, including one in which Mr. Zuckerberg publicly apologized to parents who believe social media contributed to their children’s deaths. That political and moral pressure is now on trial.

At the heart of the plaintiffs’ case are allegations concerning product features that have become ubiquitous across platforms: infinite scroll, video autoplay, and algorithmic recommendation systems designed solely to maximize engagement. Plaintiffs’ attorneys argue that these are not neutral design choices. Rather, they function as behavioral hooks that keep young users online for hours, fueling anxiety, depression, eating disorders, and, in some cases, self-harm. Beauty filters on image-based platforms, they claim, intensify toxic body comparisons and contribute directly to body dysmorphia.

Plaintiffs plan to draw a comparison to the Big Tobacco litigation of the 1990s, involving companies such as Philip Morris and R.J. Reynolds, arguing that social media giants recognized the long-term monetary value of hooking users early. Just as tobacco companies were alleged to have concealed the addictive nature of cigarettes, the plaintiffs contend that social media giants knew their products could harm children.

The defense, however, is unlikely to concede much ground, if any. Unsurprisingly, social media companies are expected to lean on Section 230 of the Communications Decency Act, which generally shields online platforms from liability for content posted by users. The defendants will argue that holding them liable would undermine the current legal framework of the modern internet. Social media companies are also expected to dispute the science, claiming that there is no proven causal link between platform use and addiction or mental health disorders.

This is a landmark case of first impression because the plaintiffs are not merely arguing that platforms failed to remove harmful content. Rather, they are asserting that the platforms’ core designs render them unreasonably dangerous, thereby triggering personal injury liability. If successful, the theory could open the door to a surge in product liability claims against the same companies whose business models depend on user engagement. 

The stakes extend far beyond individual plaintiffs. States and school districts are pursuing their own litigation, alleging that they have incurred enormous costs providing mental health counseling, support services, and additional educational resources to address the fallout from social media use. Coming full circle, these claims mirror the public-sector lawsuits against tobacco manufacturers, in which governments sought reimbursement for societal harm.

Some companies have already chosen to settle. Snap and TikTok (a subsidiary of ByteDance) have settled in the K.G.M. case, though both remain defendants in more than a dozen other lawsuits expected to move forward this year. Approximately nine cases are scheduled to be heard in California state court, with a second wave in federal court, specifically the Northern District of California.

Whether Big Tech is destined to face the same reckoning as Big Tobacco remains uncertain. Section 230 is still a powerful shield, but the mere fact that juries are now being asked to determine whether social media platforms are inherently harmful marks a meaningful shift. For an industry that once prided itself on disruption without consequence, a new message is becoming increasingly clear: innovation does not exist in a vacuum.

Spencer S. Vora

Spencer received his undergraduate degree from Michigan State University and is a J.D. candidate at the University of North Carolina School of Law. His Note, “Piggy Banks to Paychecks: Ensuring Child Content Creators’ Protection Against Financial Exploitation by Parents and Guardians,” was published in Volume 27, Issue 2 of the North Carolina Journal of Law & Technology.