Meta and YouTube social media addiction trial courtroom in Los Angeles
Photo: Getty/CBS News

Meta & YouTube Found Liable: $6M Verdict Over Addictive Design Harming Minors

For the first time, a U.S. jury ruled that Instagram and YouTube's addictive design directly caused mental health harm to a child. This verdict could reshape the relationship between technology and child safety.

Published: April 4, 2026

Key Takeaways

  • Jury found Meta and YouTube negligent and acting with "malice, oppression or fraud."
  • Plaintiff KGM (Kaley), now 20, began using Instagram and YouTube at age 6, was addicted by age 9, and went on to suffer depression, body dysmorphia, and suicidal thoughts.
  • Total damages: $6M ($3M compensatory + $3M punitive).
  • Meta held 70% responsible, YouTube 30%.
  • Over 2,000 similar lawsuits pending nationwide.
  • Verdict challenges Section 230 by targeting product design, not user content.

The Verdict Breakdown

After a 3-week trial at the Los Angeles Superior Court, the jury returned its verdict on March 25, 2026, finding Meta (parent company of Instagram) and YouTube (owned by Google) liable for the mental health harms suffered by plaintiff KGM (Kaley). This marks the first time in U.S. legal history that a jury concluded social media platform design directly harmed a minor.

Damages awarded:

  • $3M compensatory (mental health damages)
  • $3M punitive ("malice, oppression or fraud")
  • $6M total (landmark verdict)

Liability split: Meta (Instagram) 70%, YouTube (Google) 30%

Source: Jury verdict, Los Angeles Superior Court, March 25, 2026

-> The $3M punitive award signals the jury believed these companies knowingly continued harmful design practices: not mere negligence, but intent.

The Plaintiff's Story

KGM, known as Kaley, began using Instagram and YouTube at age 6. By age 9, she had become addicted to both platforms. Kaley's attorneys presented evidence that the content recommendation algorithms of both platforms continuously served her harmful content about body image, extreme dieting, and self-harm.

  • Severe depression: clinically diagnosed from age 10
  • Body dysmorphia: constant comparison with social media images
  • Suicidal thoughts: multiple hospitalizations
  • Compulsive use: unable to stop despite knowing the harm

Notably, the jury found both companies acted with "malice, oppression or fraud" — the finding California law requires before punitive damages can be awarded. This means the jury believed Meta and YouTube knew their designs were harmful to children but deliberately continued them.

-> With over 4.8 billion social media users globally, if just 1% are children similarly affected, that is 48 million children — more than the population of many countries.

Plaintiff attorneys and parties during the social media addiction trial
Photo: Getty/CBS News

The Section 230 Challenge: Design, Not Content

Section 230 of the U.S. Communications Decency Act has long shielded tech platforms from liability for user-generated content. However, this lawsuit employed a different legal strategy: it targeted not user content but the platform's product design itself.

Critical Legal Strategy

The plaintiff's attorneys argued that recommendation algorithms, infinite scroll, push notifications, and the "like" system are deliberate product design choices — not user content. Therefore, Section 230 does not apply. The jury agreed.

According to multiple legal experts, if this verdict is upheld on appeal, it will set a precedent allowing future lawsuits to hold tech companies accountable for product design, not just the content on their platforms.

-> For tech startups: this verdict signals that "move fast and break things" is no longer an acceptable motto when your product affects children.

Case Timeline

2022
Lawsuit Filed

The family of KGM (Kaley) filed suit against Meta and YouTube in Los Angeles Superior Court, alleging both platforms intentionally designed addictive features targeting minors.

-> At the time of filing, over 100 similar lawsuits were already pending — showing this is a systemic issue, not an isolated case.

March 2026
Trial Begins

The 3-week trial featured testimony from psychology experts, former Meta and YouTube employees, and internal evidence about content recommendation algorithms. Experts testified that the platforms used "variable-ratio reinforcement" techniques — similar to slot machines — to keep users returning.

-> "Variable-ratio reinforcement" is why you find yourself opening Instagram 50 times a day without knowing why — this mechanism is engineered to create compulsive habits.

March 25, 2026
Jury Returns Verdict

The jury found Meta and YouTube negligent and acting with "malice, oppression or fraud." Total damages: $6 million. Meta was assigned 70% liability, YouTube 30%.

-> $6 million is tiny compared to Meta's roughly $135 billion in annual revenue. But the punitive precedent matters for the 2,000+ pending cases: potential total damages could reach billions.

Jury and courtroom during the Meta YouTube addiction trial verdict
Photo: Getty Images

Broader Implications

This verdict comes amid a global wave of tightening regulations on social media and online child protection. In the U.S., the KOSA (Kids Online Safety Act) is being considered in Congress, while the European Union has already implemented the DSA (Digital Services Act) with strict provisions for protecting minors.

2,000+ lawsuits pending across the U.S., all watching this verdict

For the tech industry, this verdict sends a clear message: product design must be responsible. Features like recommendation algorithms, infinite scroll, push notifications, and variable reward systems — all can be legally scrutinized if they harm young users.

Both Meta and YouTube dispute the verdict and have announced they will appeal. Meta stated that "our platforms are designed to create positive experiences," and YouTube said it has "invested significantly in protecting minors." The jury, however, was not persuaded.

-> If you are a parent with children using social media: this verdict confirms that your concerns about addictive design have legal backing, not just intuition.

Tech Accountability in the AI Era

This verdict raises important questions for the rapidly growing AI industry. As companies like OpenAI deploy increasingly consumer-facing AI products, responsible design will become critical — not just for ethics but for legal risk. The lesson from the Meta/YouTube case applies directly to AI design: algorithms must be assessed for impact on vulnerable users before deployment.

-> AI developers take note: if your algorithm recommends addictive content to children, you may face similar liability — even if your jurisdiction hasn't set precedent yet.

ZestLab Analysis

The $6 million amount sounds small for companies worth hundreds of billions. But the true value lies in the legal precedent: the jury confirmed that addictive design is a defective product, not user-generated content. This opens an entirely new legal approach, bypassing the Section 230 shield that tech companies have relied on for decades.

With 2,000+ pending lawsuits, total expected damages could reach billions. Even if the appeal succeeds, the legal costs and reputational risk alone are enough to force companies to change their approach to designing products for young users.

References

  1. CBS News — Meta, YouTube social media addiction lawsuit verdict (March 2026)
  2. NPR — Meta, YouTube social media trial verdict (March 25, 2026)
  3. ABC7 — Los Angeles social media addiction trial: jury finds Instagram, YouTube liable (March 2026)

This article is compiled from primary news sources (CBS News, NPR, ABC7). Content is informational, not legal advice. Analysis represents ZestLab's perspective.

By Minh Le · Senior Technology Correspondent
