Three leading social media companies have agreed to undergo independent assessments of how effectively they protect the mental health of teenage users, submitting to a battery of tests announced Tuesday by a coalition of advocacy organizations.
The platforms will be graded on whether they mandate breaks and provide options to turn off endless scrolling, among a host of other measures of their safety policies and transparency commitments. Companies that reviewers rate highly will receive a blue shield badge, while those that fare poorly will be branded as unable to block harmful content. Meta, which operates Facebook and Instagram; TikTok; and Snap are the first three companies to sign up for the process.
“I hope that by having this new set of standards and ratings it does improve teens’ mental health,” said Dan Reidenberg, managing director of the National Council for Suicide Prevention, who oversaw the development of the standards. “At the same time, I also really hope that it changes the technology companies: that it really helps shape how they design and they build and they implement their tools.”
Teenagers represent a coveted demographic for social media sites, and the new standards come as the tech industry faces increasing pressure to better protect young users.
A wave of lawsuits alleges that leading firms have engineered their platforms to be addictive. Congress is weighing a suite of bills designed to protect children’s safety online. And state lawmakers have sought to impose age limits on social apps.
But those efforts have borne little fruit. Some legal experts argue that teens and their families may face difficulty in court proving the connection between social media use and their struggles. Officials in Washington, meanwhile, have been unable to agree on how to regulate the industry, and laws passed by the states have run into First Amendment challenges.
The voluntary standards represent an alternative approach. Reidenberg said in an interview that the ratings are not a substitute for legislation but will be a helpful way for teenagers and parents to decide how to engage with particular apps. The project is backed by the Mental Health Coalition, an advocacy group founded by fashion designer Kenneth Cole.
Cole said in a statement that the standards “recognize that technology and social media now play a central role in mental health — especially for young people — and they offer a clear path toward digital spaces that better support well-being.”
There is still no scientific consensus on whether social media is, on the whole, harmful for children and teenagers. While some research has found that the heaviest users have worse mental health, studies have found that young people who are not online can struggle, too. But teenagers themselves have reported becoming more uneasy about the time they spend online, with girls in particular telling pollsters at the Pew Research Center in 2024 that apps were affecting their self-confidence, sleep patterns and overall mental health.
Reidenberg said it’s clear that in some cases young people’s time online becomes problematic. He said the system was developed without funding from the tech industry, but companies will have to volunteer to participate.
Antigone Davis, Meta’s global head of safety, said the standards will “provide the public with a meaningful way to evaluate platform protections and hold companies accountable.” TikTok’s American arm said it looked forward to the ratings process. Snap called the Mental Health Coalition’s work “truly impactful.”
Organizers compared the process to how Hollywood assigns age ratings to movies or the government assesses the safety of new cars. Companies will submit internal policies and designs for review by outside experts, who will develop their ratings. In all, the companies’ performance will be measured in about two dozen areas covering their policies, app design, internal oversight, user education and content.
Many of the standards specifically target users’ exposure to content about suicide and self-harm. But one standard also targets the sheer length of time that some people spend scrolling, crediting platforms for offering either voluntary or mandatory “take-a-break” features.
The standards are being launched at an event in Washington on Tuesday. Sen. Mark R. Warner (D-Va.) said in a statement that he welcomed the standards but that they were not a substitute for regulatory action.
“Congress has a responsibility to put lasting, enforceable guardrails in place so that every platform is held accountable to the young people and families who use them,” he added.