Marie Le Tiec was 15 when she ended her life in 2021. Her mother, Stéphanie Mistre, went through her phone a month later, and said she was shocked to discover the kind of content on the girl’s TikTok feed.
There were songs glamorizing death. There were people encouraging each other to take their own lives. There were detailed instructions on how to do so, matching what Marie had done.
“They are making money on the mental health of our children,” said Ms. Mistre, who has joined other French parents in suing the social media platform, and has become an activist for children’s online safety.
Last month, a California jury found Meta and YouTube had harmed the mental health of a young user with addictive design features — a landmark case that could lead to further lawsuits in the United States. Efforts to protect children from dangers online are also well underway in Europe, where the European Union and national authorities in France and elsewhere are pushing for new measures.
The goal, they say, is an internet where children are not exposed to sexual or violent content before they are old enough to process it, and where algorithms are not designed to be addictive: a gentler version of the web.
“We are setting a clear limit that you can’t do business by harming people’s mental health,” Henna Virkkunen, the commissioner at the European Union focused on technology issues, said during a recent interview.
TikTok declined to comment on the pending case involving Ms. Le Tiec, but it has pushed back on accusations that its design harms children, as have other social media companies. And some academics warn that choking off access to these online services could leave children less digitally savvy.
Still, the wave of efforts across Europe continues, and may be the most comprehensive attempt yet to limit what children can access on apps and the internet.
It builds on long-running efforts by the E.U. to better police the internet, including landmark legislation, the Digital Services Act, which prods large online platforms to set standards and monitor for harmful content on their own sites.
Among the new initiatives are these:

- Investigations into social media companies, including one opened last week by European Union regulators into child protection safeguards at Snapchat. Regulators say the social media service has an ineffective age-verification system for children. Snap has said its services had “security and privacy built in” and that it does not allow users under 13.

- Another E.U. announcement last week finding that Pornhub, Stripchat and other porn platforms “did not diligently identify and assess” the risks their platforms posed to children, or do enough to keep minors off their websites, based on a preliminary investigation. If those results are confirmed, the platforms could face hefty fines. A spokesman for Stripchat said the company disagreed with aspects of the finding, adding that discussions were ongoing. A Pornhub spokesman noted that many age verification efforts do not work and drive people toward less regulated sites, and said the company was working with the commission to protect minors.

- An announcement in February by the European Union that TikTok’s infinite scroll, auto-play features and recommendation algorithm may amount to an “addictive design” that violates E.U. online safety laws. TikTok has said it planned to challenge the findings.

- A bloc-wide approach adopted by the European Union to counter cyberbullying.

- Efforts in France, Greece, Denmark and Spain to explore minimum ages for social media, following in the footsteps of Australia, where a law barring under-16s from social media took hold in late 2025.
“There is no reason our children should be exposed online to what is legally forbidden in the real world,” Emmanuel Macron, the president of France, said in a speech this year.
Ursula von der Leyen, the president of the European Commission, the E.U. executive arm, has recently convened a panel of experts to advise on whether Europe-wide social media age restrictions for children might make sense.
Like that conversation, many of Europe’s social media restrictions are still in development. If age restrictions do take hold, they will likely be backed by European Union technology. The bloc is working on a digital identity wallet that would allow platforms and other websites to easily check how old users are before allowing them in.
The flurry of activity comes as officials become more worried about the role social media is playing in the lives of children, an issue that is politically salient in Europe when little else achieves wide consensus. Research and advocacy papers in France have raised concerns about algorithmic rabbit holes that can exacerbate depressive thinking. About one in six young adolescents experiences cyberbullying, according to World Health Organization Europe data.
Many of the European Union’s efforts aim at non-American firms. TikTok’s owner is Chinese, and Pornhub is Canadian.
There’s a chance, though, that Europe’s push to protect minors could eventually extend to American companies like Meta, putting it on a potential collision course with the United States.
The Trump administration often criticizes Europe’s broader approach to digital regulation, especially its efforts to curb misinformation during elections and its push to make X’s recommender system more transparent.
The administration argues that such efforts could curb free speech. Trump officials have also blasted Europe and Britain for their crackdown on X’s Grok artificial intelligence, which was generating explicit images of real people, though X has said it has sought to restrict the generation of such images.
But when it comes to child safety online, the United States shows some signs of moving in the same direction as Europe.
Many changes are happening through the courts. In addition to the California case against Meta and YouTube, a New Mexico jury recently found Meta liable for violating state laws in ways that had enabled the sexual exploitation of young users.
Also in the United States, some states have enacted rules to keep young people away from algorithm-driven social media.
In a recent New York Post interview, Lara Trump, the president’s daughter-in-law, suggested that President Trump was “very interested” in social media bans for young people, though she declined to say definitively whether he would support such a move. In the movie “Melania,” the president’s wife is seen discussing how to protect children against harassment online with Brigitte Macron, the French president’s wife.
Some experts say overly broad efforts to keep children away from harmful content on social media could limit their access to information, leaving them less digitally literate. They also warn that digital identification information could be hacked.
What’s needed, said Joan Barata, a member of the Center for Law, Democracy and Society at Queen Mary University of London, is “a more granular approach.”
For Ms. Mistre, who lost her daughter, curbing the risks of social media is a major priority, requiring rapid action. She has been active in the European debate, regularly appearing in local media to push for more expansive regulation.
“It’s horrible, the way they use our children,” she said of social media sites. “We have to protect them.”
Adam Satariano contributed reporting.
Jeanna Smialek is the Brussels bureau chief for The Times.
The post Europe Pushes for a Gentler Internet for Children appeared first on New York Times.




