The 19-year-old plaintiff claims the platforms fueled her depression and suicidal thoughts
Global social media giants Meta, TikTok and YouTube face their first product liability trial starting Tuesday in Los Angeles over claims they knowingly designed their platforms to be addictive and harmful to children, according to court filings.
The plaintiff, a 19-year-old California woman identified as KGM, says she became addicted to the companies’ platforms at a young age because of their attention-grabbing design. She claims the apps contributed to her depression and suicidal thoughts, and is seeking to hold the companies accountable. Jury selection begins Tuesday.
Her lawsuit is the first of several expected to go to trial this year focusing on what plaintiffs describe as “social media addiction” among children. It’s the first time technology companies will have to defend themselves in court over alleged harm caused by their products, said the plaintiffs’ attorney, Matthew Bergman.
Meta CEO Mark Zuckerberg is expected to take the witness stand. Meta plans to argue that its products did not cause KGM’s mental health problems, the company’s lawyers told Reuters.
A key issue is Section 230, the federal law that largely shields platforms like Instagram and TikTok from liability for content posted by users, which the companies say applies in KGM’s case. A verdict against them could weaken that longstanding defense, signal that juries may find the platforms themselves liable, and potentially trigger a Supreme Court review, Bergman said.
Snap CEO Evan Spiegel had been expected to testify after Snap, Snapchat’s parent company, was named as a defendant, but the company agreed to settle KGM’s lawsuit last week. YouTube will argue that its platform is fundamentally different from Instagram and TikTok and should not be treated the same in court, a YouTube executive said.
Concerns about children’s safety on the internet have intensified legal pressure. In the US, Meta faces lawsuits alleging it failed to remove illegal content involving minors, including contact with adult strangers and material related to suicide, eating disorders and child sexual abuse. Globally, the company faces mounting regulatory challenges: it was designated an “extremist organization” in Russia in 2022, and in the EU it faces a €797m antitrust fine along with separate copyright, data protection and advertising cases across Europe.