In a crowded Los Angeles courtroom on Monday, Mark Lanier pulled three wooden children’s blocks from his bag and stacked them on top of each other.
“This case is as easy as ABC,” said Mr. Lanier, a lawyer. “Addicting, brains, children.”
His demonstration kicked off opening statements in a bellwether tech addiction trial in which a 20-year-old California woman has accused Meta, which owns Instagram, and YouTube of building their platforms to be addictive, leading to personal injury and other harm.
The plaintiff, identified as K.G.M., became hooked on YouTube and Instagram as a child because the apps are like “digital casinos,” with features such as endless swiping that are comparable to the handle of a slot machine, Mr. Lanier said. K.G.M. represents a generation of young people who became addicted to social media, even as executives knew of the technology’s risks, he said.
“They didn’t just build apps, they built traps,” Mr. Lanier said. “They didn’t want users, they wanted addicts.”
The trial in the California Superior Court of Los Angeles is the first in a series of landmark cases against Meta, Snap, TikTok and YouTube that test a novel legal theory arguing that tech can be as harmful as casinos and cigarettes.
Teenagers, school districts and states have filed thousands of lawsuits accusing the social media titans of designing platforms that encourage excessive use. Drawing inspiration from a legal playbook used against Big Tobacco last century, lawyers argue that features like infinite scroll, auto video play and algorithmic recommendations have led to compulsive social media use.
The cases pose some of the most significant legal threats to Meta, Snap, TikTok and YouTube, potentially opening them up to new liabilities for users’ well-being. A win for the plaintiffs could prompt more lawsuits and lead to monetary damages, as well as change how social media is designed.
The social media companies have denied the accusations, arguing there is no scientific evidence proving that their platforms cause addiction. They have also pointed to a speech shield law that protects them from liability for what their users post online.
Concern about social media’s effects on children has mounted globally. In December, Australia barred children under 16 from using social media, and other nations including Malaysia, Spain and Denmark are considering similar rules. The European Union, Britain and other nations have passed laws limiting certain features of the platforms for children.
In the United States, dozens of state attorneys general have sued social media companies over allegations of child harm. On Monday, opening statements began in a separate trial against Meta on social media addiction and child sexual exploitation in New Mexico. The state’s attorney general, Raúl Torrez, sued Meta in 2023, accusing the company of allowing predators to reach children and having chatbots that harmed young people.
Executives prioritized profits over safety, Don Migliori, the litigator for New Mexico, argued on Monday. “Internally, Meta clearly knew that youth safety is not its corporate priority, that safety measures were under-resourced, ineffective and deprioritized, that youth safety was less important than growth and engagement,” he said.
This summer, another set of federal cases will go to trial in Oakland, Calif., at the U.S. District Court of Northern California. In those cases, school districts and states plan to argue that social media is a public nuisance and that they have had to shoulder the costs of treating a generation of youths suffering from addictive social media use.
K.G.M.’s case, which is being presided over by Judge Carolyn B. Kuhl and will be decided by a jury, is one of nine cases bundled together in state court in Los Angeles that represent some of the strongest personal injury claims among the thousands of suits that were filed.
Snap and TikTok earlier settled with K.G.M. on undisclosed terms. The companies remain defendants in the other suits.
The K.G.M. trial against Meta and YouTube, which is owned by Google, is expected to last six to eight weeks. Mark Zuckerberg, the chief executive of Meta, and Neal Mohan, who runs YouTube, are among the executives expected to testify.
K.G.M., who is from Chico, Calif., and filed suit in 2023, created a YouTube account at age 8, then joined Instagram at 9, her lawyers said. K.G.M. and her mother weren’t aware of the risks of the platforms and harmful features that led to anxiety and depression, the lawyers said. Beauty filters on Instagram led to her body dysmorphia, they claimed.
During his nearly two-hour opening, Mr. Lanier presented the jury with a trove of internal Meta and YouTube documents dating to 2011 that showed tech executives knew of and discussed the negative effects of their products on children.
One YouTube presentation showed how the company courted children under the age of 4, comparing itself to a babysitter. In two instances, Meta employees said the company’s tactics reminded them of tobacco companies.
“If we want to win big with teens, we must bring them in as tweens,” said one internal Meta document from 2018. People who joined Facebook at 11 had four times the long-term retention as people who joined at 20, according to the document.
Lawyers for Meta and YouTube are expected to deliver their opening statements later Monday.
Representatives from tech policy and child safety groups and a handful of parents suing Meta and YouTube sat on crowded benches in the back of the courtroom.
During the lunch break, the parents filed out and embraced, some crying. Among them was Lori Schott, who said her daughter Annalee killed herself in 2020 at the age of 18 after viewing videos that glorified self-harm.
“The emotions are all over the board for us,” Ms. Schott said. “We are here to learn the truth.”
Eli Tan covers the technology industry for The Times from San Francisco.