Key Takeaways
- Jury found Meta and YouTube negligent and found they acted with "malice, oppression or fraud".
- Plaintiff KGM (Kaley), now 20, began using Instagram and YouTube at age 6, was addicted by age 9, and suffered depression, body dysmorphia, and suicidal thoughts.
- Total damages: $6M ($3M compensatory + $3M punitive).
- Meta held 70% responsible, YouTube 30%.
- Over 2,000 similar lawsuits pending nationwide.
- Verdict challenges Section 230 by targeting product design, not user content.
The Verdict Breakdown
After a 3-week trial in Los Angeles Superior Court, the jury returned its verdict on March 25, 2026, finding Meta (parent company of Instagram) and YouTube (owned by Google) liable for the mental health harms suffered by plaintiff KGM (Kaley). This marks the first time in U.S. legal history that a jury has concluded that social media platform design directly harmed a minor.
-> The $3M punitive award signals the jury believed these companies knowingly continued harmful design practices. The jury saw not just negligence, but intent.
The Plaintiff's Story
KGM, known as Kaley, began using Instagram and YouTube at age 6. By age 9, she had become addicted to both platforms. Kaley's attorneys presented evidence that both platforms' content recommendation algorithms repeatedly served her harmful content about body image, extreme dieting, and self-harm.
Notably, the jury found both companies acted with "malice, oppression or fraud," the showing California law requires before punitive damages can be awarded. In other words, the jury believed Meta and YouTube knew their designs were harmful to children but continued them anyway.
-> With over 4.8 billion social media users globally, if just 1% are children similarly affected, that is 48 million children — more than the population of many countries.

The Section 230 Challenge: Design, Not Content
Section 230 of the U.S. Communications Decency Act has long shielded tech platforms from liability for user-generated content. This lawsuit, however, employed a different legal strategy: it targeted not user content but the platforms' own product design.
The plaintiff's attorneys argued that recommendation algorithms, infinite scroll, push notifications, and the "like" system are deliberate product design choices — not user content. Therefore, Section 230 does not apply. The jury agreed.
According to multiple legal experts, if this verdict is upheld on appeal, it will set a precedent allowing future plaintiffs to hold tech companies accountable for product design choices, an area Section 230 does not shield.
-> For tech startups: this verdict signals that "move fast and break things" is no longer an acceptable motto when your product affects children.
Case Timeline
Kaley's family filed the lawsuit against Meta and YouTube in Los Angeles Superior Court, alleging that both platforms intentionally designed addictive features targeting minors.
-> At the time of filing, over 100 similar lawsuits were already pending — showing this is a systemic issue, not an isolated case.
The 3-week trial featured testimony from psychology experts and former Meta and YouTube employees, along with internal company evidence about the content recommendation algorithms. Experts testified that the platforms used "variable-ratio reinforcement" techniques, similar to slot machines, to keep users returning.
-> "Variable-ratio reinforcement" is why you find yourself opening Instagram 50 times a day without knowing why — this mechanism is engineered to create compulsive habits.
The jury found Meta and YouTube negligent and acting with "malice, oppression or fraud." Total damages: $6 million. Meta was assigned 70% liability, YouTube 30%.
-> $6 million is tiny compared to Meta's $135 billion in 2025 revenue. But the punitive precedent matters for the 2,000+ pending cases: if awards track this verdict, 2,000 cases at $6 million each would total roughly $12 billion.

Broader Implications
This verdict comes amid a global wave of tightening regulations on social media and online child protection. In the U.S., the Kids Online Safety Act (KOSA) is being considered in Congress, while the European Union has already implemented the Digital Services Act (DSA) with strict provisions for protecting minors.
For the tech industry, this verdict sends a clear message: product design carries legal responsibility. Recommendation algorithms, infinite scroll, push notifications, and variable reward systems can all face legal scrutiny if they harm young users.
Both Meta and YouTube disputed the verdict and announced they will appeal. Meta stated that "our platforms are designed to create positive experiences," and YouTube said it has "invested significantly in protecting minors." The jury, however, was not persuaded by these arguments at trial.
-> If you are a parent with children using social media: this verdict confirms that your concerns about addictive design have legal backing, not just intuition.
Tech Accountability in the AI Era
This verdict raises important questions for the rapidly growing AI industry. As companies like OpenAI deploy more consumer-facing AI products, responsible design will become critical, not just for ethics but for legal risk. The lesson from the Meta/YouTube case applies directly to AI design: algorithms must be assessed for their impact on vulnerable users before deployment.
-> AI developers take note: if your algorithm recommends addictive content to children, you may face similar liability — even if your jurisdiction hasn't set precedent yet.
The $6 million award sounds small for companies worth hundreds of billions. But the true value lies in the legal precedent: the jury treated addictive design as a defective product, not as user-generated content. That opens an entirely new legal approach, one that bypasses the Section 230 shield tech companies have relied on for decades.
With 2,000+ lawsuits pending, total expected damages could reach billions. Even if the appeal succeeds, the legal costs and reputational risk alone may be enough to force companies to change how they design products for young users.