A major court case over alleged social media addiction opens Tuesday in California. Leading technology executives are expected to testify. The outcome could reshape legal responsibility for digital platforms.
The plaintiff is a 19-year-old woman identified only as KGM. She claims platform algorithms caused her addiction and harmed her mental health, and that the designs encouraged excessive use during her adolescence.
The defendants include Meta, which owns Instagram and Facebook, TikTok owner ByteDance, and YouTube parent Google. Snapchat settled with the plaintiff last week. The remaining firms now face trial.
The case will proceed at Los Angeles Superior Court. Legal experts view it as the first of many similar lawsuits. These cases could weaken a long-standing legal shield for technology companies.
Focus shifts to platform design and behaviour
The companies argue the evidence fails to prove they are responsible for her depression or eating disorders. They deny a direct link between their products and the alleged harm.
The move to trial reflects a broader legal shift. Courts increasingly examine claims that digital products promote addictive behaviour. Scrutiny of technology firms continues to intensify.
For years, companies relied on Section 230 of the Communications Decency Act. Congress passed the law in 1996 to protect platforms from liability over user content.
This lawsuit targets a different issue. It examines algorithms, notifications, and engagement tools. These features shape how users interact with social media apps.
KGM’s lawyer, Matthew Bergman, described the trial as unprecedented, saying a jury will finally assess the conduct of social media companies.
He said many young people worldwide face similar harm. He accused companies of placing profits above children’s wellbeing.
Legal pressure mounts on tech firms
Eric Goldman, a law professor at Santa Clara University, said the stakes are extremely high. He warned that losses could threaten the companies’ futures.
He also noted the difficulty for plaintiffs. Courts rarely attribute psychological harm directly to content publishers.
Still, he said these lawsuits open new legal territory, because existing laws never anticipated claims focused on product design.
Documents and executives under oath
Jurors will review extensive testimony and evidence. This material includes internal company documents.
Mary Graw Leary, a law professor at Catholic University of America, expects major disclosures. She said companies may reveal information long hidden from public view.
Meta previously said it introduced dozens of safety tools for teenagers. Some researchers dispute the effectiveness of those measures.
The companies plan to argue third-party users caused any alleged harm. They deny their designs directly injured young people.
Meta chief executive Mark Zuckerberg is scheduled to testify early in the trial. His appearance stands among the most anticipated moments.
In 2024, Zuckerberg told US senators that scientific research showed no proven causal link. He said studies failed to connect social media use to worse youth mental health.
During that hearing, he apologised to victims and their families. Lawmakers questioned him during emotional exchanges.
Global attention on social media harms grows
Mary Anne Franks, a law professor at George Washington University, questioned executive testimony strategies. She said technology leaders often struggle under pressure.
She added that companies strongly hope to avoid placing top executives on the stand, since public testimony carries significant reputational risk.
The trial comes as global scrutiny increases. Families, school districts, and prosecutors increasingly challenge social media practices.
Last year, dozens of US states sued Meta. They accused the company of misleading the public about platform risks.
Australia has banned social media use for children under 16. The UK signalled in January it may follow.
Franks said society has reached a tipping point. She argued governments no longer treat the technology industry with automatic deference.
