
Meta & YouTube Found Liable: Social Media Addiction Verdict

A California jury found Meta 70% liable and YouTube 30% liable for addicting a young woman to social media, awarding $6 million in damages in the first-ever trial of its kind.

At a Glance

  • Meta liability share: 70%
  • YouTube liability share: 30%
  • Total damages awarded: $6M
  • Pending similar lawsuits: 2,000+

Key Takeaways

  • This is the first social media addiction case to reach a jury verdict, setting a legal precedent for over 2,000 pending lawsuits against tech companies.
  • The jury rejected both companies' Section 230 defense, ruling that algorithmic recommendation systems are not protected speech but rather a product design choice.
  • Meta was hit with $2.1 million in punitive damages, an amount matching its share of the compensatory award and signaling that the jury believed the company acted with conscious disregard for user safety.
  • In a separate case, a New Mexico court ordered Meta to pay $375 million for failing to protect children, compounding Meta's legal exposure on platform safety.

The Verdict: March 25, 2026

After weeks of testimony in a Los Angeles Superior Court, a California jury delivered a historic verdict on March 25, 2026: Meta and YouTube are liable for harming the mental health of a young woman identified only as KGM through their addictive platform designs. The jury assigned 70% of the liability to Meta, whose Instagram platform was at the center of the case, and 30% to YouTube. The split reflected the jury's assessment that Instagram's algorithmic recommendation system played a more direct role in driving compulsive usage. Compensatory damages totaled $3 million. But the punitive damages told the real story: $2.1 million from Meta and $900,000 from YouTube — amounts meant not to compensate the plaintiff, but to punish the companies for conduct the jury deemed reckless.
If you use Instagram or YouTube daily, this verdict says the algorithms keeping you scrolling were designed with knowledge they could cause psychological harm.
Mark Zuckerberg leaving the courthouse after the social media addiction trial verdict
Photo: NBC News

KGM's Story: Addiction, Depression, and Suicidal Thoughts

The plaintiff, now 20 years old and identified in court documents only as KGM, began using Instagram and YouTube heavily as a teenager. Her attorneys presented evidence that she became psychologically dependent on the platforms, spending hours each day consuming algorithmically curated content that reinforced negative body image, social comparison, and anxiety. KGM testified that her social media use escalated from casual browsing to compulsive behavior she could not control. She described experiencing severe depression and suicidal ideation that her medical experts linked directly to her platform usage patterns. Her therapists documented a deterioration in mental health that tracked closely with increases in screen time on both platforms. The defense argued that KGM had pre-existing mental health conditions and that social media was merely one factor among many. The jury was not persuaded.
For parents of teenagers spending 3+ hours daily on social media: this case established that platforms bear legal responsibility when their algorithms drive compulsive use.
Plaintiff arriving at the Los Angeles Superior Court for the social media addiction trial
Photo: NBC News

How the Algorithm Was Put on Trial

The centerpiece of the plaintiff's case was not the content itself, but the recommendation engine that served it. Attorneys argued that Instagram's algorithm was specifically designed to maximize engagement by exploiting psychological vulnerabilities — particularly in young users. Internal Meta documents introduced as evidence showed that the company's own researchers had flagged concerns about Instagram's impact on teen mental health as early as 2019. The now-infamous internal study that found "32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse" was presented to the jury. The plaintiff's expert witnesses demonstrated how the recommendation system creates feedback loops: a user who lingers on certain content receives more of it, creating an escalating cycle of engagement that the platform profits from but the user cannot easily escape. YouTube's autoplay and recommendation sidebar faced similar scrutiny, with experts testifying that the platform's design prioritizes watch time over user wellbeing.
The next time you notice you have been scrolling for an hour without intending to, remember: a jury ruled that behavior was engineered, not accidental.
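The feedback loop the expert witnesses described can be illustrated with a toy model. The following Python sketch is purely hypothetical: the topics, weights, and update rule are invented for illustration and do not reflect any platform's actual code. It shows only how a recommender that boosts whatever a user lingers on will drift the entire feed toward that content.

```python
# Toy model of the engagement feedback loop described at trial: content a
# user lingers on is weighted more heavily, so the feed drifts toward
# whatever holds attention. All names and numbers here are hypothetical.

def update_weights(weights, topic, dwell_seconds, rate=0.1):
    """Boost a topic's recommendation weight in proportion to dwell time."""
    boosted = dict(weights)
    boosted[topic] = boosted.get(topic, 0.0) + rate * dwell_seconds
    total = sum(boosted.values())  # renormalize so weights stay a distribution
    return {t: w / total for t, w in boosted.items()}

# Start with an evenly balanced feed across four invented topics.
weights = {"fitness": 0.25, "body_image": 0.25, "music": 0.25, "news": 0.25}

# Simulate five sessions in which the user lingers on a single topic.
for _ in range(5):
    weights = update_weights(weights, "body_image", dwell_seconds=30)

print(round(weights["body_image"], 4))  # 0.9993: one topic now dominates
```

After just five simulated sessions, the lingered-on topic claims over 99% of the feed. That runaway concentration, rather than any single piece of content, is the escalating cycle the plaintiff's experts put before the jury.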

Liability Breakdown: Meta vs YouTube

                         Meta (Instagram)              YouTube (Google)
Liability share          70%                           30%
Compensatory damages     $2.1M                         $900K
Punitive damages         $2.1M                         $900K
Key mechanism            Algorithmic feed + Explore    Autoplay + Recommendations
Internal research cited  2019 teen health study        Watch time optimization docs
Section 230 defense      Rejected by jury              Rejected by jury

Timeline: Filing to Verdict

Oct 2023

Lawsuits consolidated in California

Over 1,000 social media addiction lawsuits from across the country are consolidated into coordinated proceedings in California. KGM's case is selected as the bellwether — the first to go to trial.

The bellwether selection meant this single verdict would shape settlement negotiations for thousands of families.
Feb 2026

Trial begins in Los Angeles

Opening arguments begin. Plaintiff's team presents internal Meta research documents and expert testimony on algorithmic addiction mechanisms.

Courtroom testimony put Silicon Valley's internal debates about user safety into the public record for the first time.
Mar 25, 2026

Jury finds both companies liable

The jury finds Meta 70% liable and YouTube 30% liable. Compensatory damages: $3 million. Punitive damages: $2.1 million from Meta, $900,000 from YouTube.

Meta's stock dipped on the news — not because of the $6M, but because investors priced in the risk from 2,000+ similar pending cases.
Mar 26, 2026

New Mexico orders Meta to pay $375M

In a separate case, a New Mexico court orders Meta to pay $375 million for failing to protect children on its platforms. The timing amplifies pressure on the company.

Combined legal exposure from both cases signals that courts are no longer treating platform safety as a voluntary corporate responsibility.

Section 230 Defense Rejected — What It Means

Both Meta and YouTube argued that Section 230 of the Communications Decency Act should shield them from liability. The law, enacted in 1996, protects internet platforms from being treated as publishers of user-generated content. The jury's rejection of this defense is significant. The plaintiff's team successfully argued that the case was not about the content users posted, but about the design decisions the companies made: how the algorithm selects, ranks, and serves content to maximize engagement. The jury agreed that algorithmic recommendation is a product design choice — not a form of speech protected by Section 230. This distinction could reshape how courts treat platform liability nationwide. If recommendation algorithms are products rather than editorial decisions, they can be subject to product liability law — the same framework used for defective cars, dangerous drugs, and unsafe consumer goods.
For the tech industry: if this legal reasoning holds on appeal, every recommendation algorithm in every social media app becomes a potential liability.
NBC Nightly News segment covering the Meta YouTube social media addiction trial verdict
Photo: NBC News

The 2,000 Lawsuits Waiting in the Wings
This bellwether verdict does not automatically decide the other 2,000+ pending cases, but it dramatically shifts the negotiating landscape. Plaintiffs' attorneys now have a jury-validated theory of liability: that algorithmic recommendation systems cause foreseeable harm, and that Section 230 does not protect design choices. Meta and YouTube face a strategic decision — settle the remaining cases at potentially billions of dollars, or risk more jury trials with precedent now stacked against them. Legal analysts estimate Meta's total exposure across all pending social media addiction cases could exceed $10 billion if the company loses even a fraction of them at trial.

What This Means for Social Media Users

The verdict does not ban social media or force immediate changes to how Instagram or YouTube operate. Both companies will appeal, and the legal process could take years to reach a final resolution. However, the signal is clear: courts are beginning to treat addictive design features the same way they treat other dangerous products. This could accelerate regulatory efforts already underway in Congress and state legislatures to restrict algorithmic recommendation for minors, require age verification, and mandate parental controls. For everyday users, the practical takeaway is that the "I can't stop scrolling" experience is not a personal failing — a jury of twelve people just ruled it was engineered. Whether that leads to meaningful product changes depends on whether the tech companies decide the legal risk outweighs the engagement revenue these algorithms generate.
Check your screen time right now. If Instagram or YouTube accounts for 2+ hours daily, you are experiencing the kind of compulsive engagement pattern this trial put at the center of its case.


Cover image: NBC News. Published March 27, 2026. All figures sourced from court documents and news reports as of publication date.
By Hoa Dinh · Founder & Senior Tech Editor
Published: March 27, 2026
Tags: technology · meta youtube trial verdict · social media addiction lawsuit · instagram addiction children · youtube harmful algorithm

