A landmark trial over alleged social media addiction starts Tuesday in California. Top technology executives are expected to testify in court.
The plaintiff is a 19-year-old woman identified by the initials KGM. She claims platform algorithms caused addiction and harmed her mental health.
The defendants include Meta, owner of Instagram and Facebook, TikTok owner ByteDance, and YouTube parent Google. Snapchat reached a settlement with the plaintiff last week.
The closely watched case unfolds at Los Angeles Superior Court. It is the first of many similar lawsuits nationwide to reach trial. These cases could challenge a long-standing legal shield protecting technology companies in the United States.
Dangerous and addictive algorithms under scrutiny
The social media companies argue the evidence does not prove responsibility for depression or eating disorders. They deny their platforms caused the alleged harms.
That the case is going to trial signals a shift in how courts treat major technology firms. Lawmakers and families increasingly accuse these products of fostering addictive behaviour.
For years, companies relied on Section 230 of the Communications Decency Act. Congress passed the law in 1996 to protect platforms from liability over user posts.
This case focuses instead on product design choices: the algorithms, notifications, and engagement features that shape how people use these apps.
KGM’s lawyer, Matthew Bergman, said the case marks a historic moment. For the first time, a jury will directly judge a social media company’s conduct.
He said many children worldwide suffer similar harm from addictive platform designs. He accused companies of prioritising profits over young lives.
Legal risks for the platforms grow
Eric Goldman, a law professor at Santa Clara University, warned that the stakes are enormous. He said losing such cases could threaten the companies' existence.
He also noted the challenge for plaintiffs. Courts rarely link physical or psychological harm directly to content publishers.
Still, he said these lawsuits open new legal territory. Existing laws never anticipated such claims, he explained.
Internal documents and executive testimony
Jurors will review extensive evidence during the trial. This material includes internal company documents and communications.
Mary Graw Leary, a law professor at Catholic University of America, expects major disclosures. She said companies may reveal information they long kept private.
Meta previously said it introduced dozens of safety tools for teenagers. Some researchers dispute the effectiveness of those measures.
The companies plan to argue third-party users caused any alleged harm. They deny their designs directly injured young users.
One key witness will be Meta chief executive Mark Zuckerberg. He is scheduled to testify early in the proceedings.
In 2024, Zuckerberg told US senators that scientific research had not shown a proven causal link between social media use and worse youth mental health.
During that hearing, he apologised to victims and their families. Lawmakers pressed him during emotional testimony.
Growing global pressure on the tech industry
Mary Anne Franks, a law professor at George Washington University, questioned how executives will fare on the stand. She said technology leaders often struggle under intense questioning.
She added companies strongly hoped to avoid putting top executives on the stand. Public testimony carries significant reputational risk.
The trial arrives amid rising global scrutiny. Families, school districts, and prosecutors increasingly challenge social media practices.
Last year, dozens of US states sued Meta. They accused the company of misleading the public about platform risks.
Australia has already banned social media use for children under 16. The UK signalled in January it may adopt similar restrictions.
Franks said society approaches a turning point. She argued governments no longer treat the technology industry with automatic deference.
