LA Court Hears High-Profile Case
Meta and Google appeared in Los Angeles Superior Court on Tuesday for the second day of a major trial. Plaintiffs claim the companies intentionally designed their platforms to addict young users. Experts warn the verdict could influence the more than 1,600 similar lawsuits pending across the U.S., potentially exposing the tech giants to major damages.
Conflicting Opening Statements
Plaintiff attorney Mark Lanier told jurors that the companies deliberately targeted children using techniques borrowed from casinos and tobacco companies. He described the case as “as easy as ABC,” standing for “addicting the brains of children.”
During his two-hour presentation, Lanier used props such as a toy Ferrari, a bicycle hand brake, and eggs to illustrate how social media exploits teenagers’ craving for validation. He emphasized that young users struggle to disengage from platforms designed to manipulate their attention.
By contrast, Meta attorney Paul Schmidt argued that the struggles of the plaintiff, referred to as K.G.M., stemmed primarily from family problems, bullying, and body-image concerns rather than from social media itself. He noted that K.G.M. continues to use Instagram, YouTube, and TikTok, which he said undermines the claim of substantial harm. Where Lanier relied on props, Schmidt gave a more formal presentation, using a PowerPoint to walk jurors through his argument step by step.
Potential Nationwide Impact
The trial is expected to last six to eight weeks. A verdict for the plaintiff could influence more than 1,600 related cases, in which plaintiffs nationwide allege that social media use contributed to addiction, depression, anxiety, and self-harm.
In addition, legal analysts note that the case could reshape platform design, safety regulations, and civil liability standards. Even if the jury awards limited financial damages, the symbolic and regulatory implications could last for years.
Growing Concerns Over Young Users
Parents, schools, and regulators have raised alarms about excessive social media use among children. The trial sharpens the focus on tech companies' responsibility to protect vulnerable users and on the need for clearer safety measures and legal accountability.
