For over a decade, U.S. lawmakers have promised guardrails for children on social media. They’ve grilled the chiefs of Meta, Snap, YouTube and TikTok about the dangers of their sites. They’ve introduced dozens of child safety bills.
Little has come from the noise.
But this week, two juries held social media companies accountable for harming young users.
In Los Angeles on Wednesday, a jury decided in favor of a plaintiff who had claimed that Meta and YouTube hooked her with addictive features — a verdict validating a novel legal strategy holding the companies accountable for personal injury. And a day earlier in New Mexico, a jury found Meta liable for violating state law by failing to safeguard users of its apps from child predators.
The landmark decisions highlight a growing backlash against social media and its effects on young people, including criticism from parents and policymakers around the globe that it is contributing to a youth mental health crisis. And they show that the push for change may finally be gaining steam.
U.S. lawmakers said on Wednesday that the verdicts underscored the need for child safety legislation. Senators Marsha Blackburn, Republican of Tennessee, and Richard Blumenthal, Democrat of Connecticut, called for legislators to pass their bill, the Kids Online Safety Act.
Federal momentum would build on laws in more than 30 states banning phones in schools. Globally, Australia in December banned social media for those under 16. Spain, Denmark, France, Malaysia and Indonesia are considering similar restrictions.
The implications of the court decisions this week are “very, very big,” said Catherine M. Sharkey, a professor of law at New York University and an expert on tort and product liability. “We’re in a new era, a digital era, where we have to rethink definitions for products based on which entities might have superior information to prevent these injuries and accidents.”
The verdicts stem from a flood of thousands of lawsuits filed against social media companies by teenagers, parents, school districts and state attorneys general. More than a dozen are scheduled for trial this year, exposing a legal vulnerability for Silicon Valley giants that could ripple across the once impervious tech industry.
Social media companies have fiercely defended themselves, saying it is not possible to link social media use to addiction and mental health harms. They have also introduced safety features for young users.
Legal experts caution that it is too early to determine whether this week’s legal decisions will deal a lasting blow to the internet giants. Meta and YouTube, which deny any wrongdoing, have said they will appeal.
In the past, social media companies won legal victories by leaning on Section 230 of the federal Communications Decency Act of 1996, which protects them from liability for what their users post.
And the Supreme Court has indicated openness to similar arguments. On Wednesday, the justices unanimously said a major internet provider could not be held liable for the piracy of thousands of songs online. Free speech advocates had warned the court of a chilling effect on free expression, particularly on social media platforms, if internet companies could face hefty penalties for the actions of their users.
But the landmark decision in Los Angeles held Meta and YouTube accountable for personal injury, not user content. The jury found that addictive features like infinite scroll and video autoplay made the apps dangerous.
That decision aligns the defendants more closely with makers of tobacco, opioid and consumer products that have been held liable for harms to their users, legal experts said. Lawsuits against those industries have led to huge financial damages and forced changes to the marketing and design of products, including warning labels.
“We can’t unsee what’s been taken out of the box,” said Sarah Gardner, director of a child safety group, Heat Initiative. “There is now too much evidence that the companies and their executives knew about the harms of their products to children, and yet they chose to ignore the warnings for profits.”
The Los Angeles case was considered a bellwether in a series of lawsuits, slated to go to trial this year, that test the legal argument that social media can cause personal injury. They include eight others brought by individual plaintiffs in Superior Court of Los Angeles County.
A set of federal cases brought by states and school districts in U.S. District Court in Oakland, Calif., are scheduled for jury trials beginning this summer.
Wednesday’s winning plaintiff, K.G.M., whose first name is Kaley, sued Meta, YouTube, Snap and TikTok in 2023. Kaley, who lives in Chico, Calif., said she had begun using social media at age 6 and claimed the sites had caused personal injury, including body dysmorphia and thoughts of self-harm.
She settled with Snap and TikTok for undisclosed sums before the five-week trial. A jury deliberated more than a week on whether Meta, which owns Instagram, and YouTube, which is owned by Google, were liable.
In the end, all but two of the jurors determined that Meta and YouTube were negligent in designing their platforms, and that their products harmed K.G.M. They awarded her $6 million in damages.
“Mental health is profoundly complex and cannot be linked to a single app,” Meta said in a statement.
José Castañeda, a spokesman for Google, said the case “misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”
In New Mexico, the jury ordered Meta to pay $375 million for violating consumer protection laws by failing to keep its platforms safe and by allowing child exploitation on Facebook, Instagram and WhatsApp.
Child safety groups and parents who have crusaded for better protections, including a failed effort in late 2024 to pass a previous version of the Kids Online Safety Act, hailed the decisions.
Alvaro Bedoya, a former Democratic commissioner at the Federal Trade Commission, said company lobbying had kept Congress from doing more to protect young users.
“Why this matters is these decisions are from regular people,” he said. “You can’t lobby a mom or a dad or a schoolteacher who are responsible for the well-being of a child.”
The trial in Los Angeles was attended by members of tech policy and child safety groups, as well as dozens of parents who said social media had harmed their children.
Julianna Arnold, who said she had lost her 17-year-old daughter, Coco, to social media harms, wept in the courtroom on Wednesday as the verdict was read aloud. The leader of Parents Rise, a child safety advocacy group, she had attended nearly every day of the trial and has lobbied for passage of the Kids Online Safety Act and other legislation.
“It was so validating, so thrilling,” Ms. Arnold said. “We are re-energized and are going to keep fighting.”
Cecilia Kang reports on technology and regulatory policy for The Times from Washington. She has written about technology for over two decades.
The post Juries Take the Lead in the Push for Child Online Safety appeared first on New York Times.