Are social media apps addictive like cigarettes? Are these sites defective products?
Those are the claims that Meta, Snap, TikTok and YouTube will face this year in a series of landmark trials. Teenagers, school districts and states have filed thousands of lawsuits accusing the social media titans of designing platforms that encouraged excessive use by millions of young Americans, leading to personal injury and other harms.
On Tuesday, the first of these bellwether cases is scheduled to start with jury selection in Los Angeles County Superior Court. A Californian, now 20 and identified by the initials K.G.M., filed the lawsuit in 2023, claiming she became addicted to the social media sites as a child and experienced anxiety, depression and body-image issues as a result.
The cases pose one of the most significant legal threats to Meta, Snap, TikTok and YouTube, potentially opening them up to new liabilities for users’ well-being. Drawing inspiration from a legal playbook used against Big Tobacco last century, lawyers plan to use the argument that the companies created addictive products.
A win could open the door to more lawsuits from millions of social media users. It could also lead to huge monetary damages and changes to social media sites’ designs.
The companies, which have largely dodged previous legal threats by citing a federal shield that protects them from liability for what their users post, have been scrambling to defend themselves. They have pre-emptively argued that the courts should drop the cases and hired armies of litigators from top law firms. Last week, Snap settled with K.G.M. for an undisclosed amount.
“This is a cutting-edge case of gargantuan and very powerful companies that have so far managed to avoid liability far better than many other industries,” said Benjamin Zipursky, a professor at Fordham Law School who is an expert in tort law. He added, “They may face some accountability here.”
At the first trial, Mark Zuckerberg, the chief executive of Meta, and Neal Mohan, who runs YouTube, are among the executives expected to face questions stemming from company documents that warned their products could lead to harm.
A jury trial adds to the challenges for the companies, legal experts said, since jurors may be more easily swayed as teenagers take the stand to claim they were harmed.
The cases have drawn comparisons to those against Big Tobacco in the 1990s, when companies like Philip Morris and R.J. Reynolds were accused of hiding information about the harms of cigarettes. In 1998, the companies reached a $206 billion master settlement with more than 40 states that led to an agreement to stop underage marketing. Strict tobacco regulations and a decline in smoking followed.
“This is ground zero for our fight against social media, where society will set new expectations and standards for how social media companies can treat our children,” said Joseph VanZandt, one of the lead lawyers in the Los Angeles trial.
The tech giants plan to argue that there is no scientific proof that social media causes addiction and that the lawsuits violate online speech protections, according to media briefings.
Snap, which owns Snapchat, said in a statement that it settled its part of the K.G.M. suit last week. It remains a defendant in other social media addiction cases, but did not respond to additional requests for comment.
TikTok declined to comment. YouTube, which is owned by Google, is not a social media platform and has for years offered products like YouTube Kids with extra safety measures, company officials told reporters on a call last week.
“Providing young people with a safer, healthier experience has always been core to our work,” said José Castañeda, a spokesman for YouTube.
Meta, which owns Instagram and Facebook, said in a blog post last week that the cases cherry-picked statements from executives in internal documents and that the first trial “oversimplifies” the problem.
“Clinicians and researchers find that mental health is a deeply complex and multifaceted issue, and trends regarding teens’ well-being aren’t clear-cut or universal,” the company added.
Concern about social media’s effects on children has mounted globally. The European Union, Britain and other nations have passed laws limiting certain features of the platforms for children. Last month, Australia barred children under the age of 16 from using social media.
In the United States, Congress has threatened action against social media companies for years, but most efforts have fizzled. Lawmakers have held hearings on the issue, including one in which Mr. Zuckerberg was forced to stand and apologize to parents who claim social media contributed to their children’s deaths.
And while states like California, Texas and Ohio have enacted laws aimed at protecting children online, the tech companies have successfully sued to block many of the laws on free-speech grounds.
In the personal injury cases that are now set to go to trial, plaintiffs’ lawyers said, they planned to argue that features like infinite scroll, auto video play and algorithmic recommendations lead to compulsive social media use and cause mental health issues leading to anxiety, depression, eating disorders and self-harm.
Nine cases in total are likely to go to trial in state court in Los Angeles. A separate set of federal cases brought by school districts and several attorneys general will be heard this summer in U.S. District Court in Oakland, Calif., where plaintiffs plan to argue that social media is a public nuisance and that they have had to shoulder the costs of treating a generation of youths suffering from addictive social media use.
The Los Angeles cases will be overseen by Judge Carolyn Kuhl, who worked in the Reagan administration and was appointed to the state court in 1995 by Gov. Pete Wilson, a Republican.
K.G.M. created a YouTube account at age 8, then joined Instagram at 9; Musical.ly, now TikTok, at 10; and Snapchat at 11. Her lawyers said they would argue that the social platforms created addictive products comparable to cigarettes, and that beauty filters on Instagram and Snapchat fostered toxic comparisons that caused body dysmorphia.
At the trials, plaintiffs’ lawyers also plan to lean on a trove of internal documents from the past decade that show tech executives knew and discussed the negative effects of their products on children. They plan to argue that the companies put profits ahead of the well-being of users, even as employees pleaded with executives to shut down certain tools.
In 2019, Meta removed some Instagram beauty filters that made users look as if they had undergone plastic surgery. Internal documents showed that in 2019 and 2020, Meta executives emailed Mr. Zuckerberg, asking him to reconsider a plan to restore those beauty filters, which were known internally to lead young users, particularly girls, to body-image issues. One executive said her own daughter had suffered from body dysmorphia. The filters were restored anyway.
Other internal documents from YouTube in recent years showed that executives had discussed how to make the app more “addictive” to “compel users to come back more and more often.”
The social media companies plan to cite Section 230 of the Communications Decency Act of 1996, which shields websites from liability for content created by their users.
The civil litigation is expected to drag out for years. Dozens of parents who have pushed for regulation of social media companies plan to attend parts of the trials.
Julianna Arnold of Los Angeles, who started a nonprofit, Parents Rise, to lobby for social media regulation, said her daughter, Coco, was found dead at age 17 in New York in 2022 after obtaining a drug from a dealer on Instagram. In 2024, Ms. Arnold sat behind Mr. Zuckerberg in Congress when he apologized to parents who had lost their children.
Ms. Arnold, who is not a plaintiff in the personal injury cases, said that in the absence of regulation, she hoped the trials would force changes at the companies.
“These trials are now our hope for the world to see how dangerous these social media platforms are,” she said.
Cecilia Kang reports on technology and regulatory policy for The Times from Washington. She has written about technology for over two decades.