DNYUZ
Instagram to Alert Parents to Teens’ Self-Harm Searches

February 28, 2026
Instagram will begin notifying parents when their teenage children use the platform to repeatedly search for terms related to suicide or self-harm.

The alerts “are designed to give parents the information they need to support their teen and come with expert resources to help parents approach these sensitive conversations,” Meta, which owns Instagram, said in a news release on Thursday.

Meta is promoting the new alerts as it is on trial in two states over claims that its platforms are addictive and have harmed young users.

Here’s what to know:

Users must opt in to get the alerts.

To receive the alerts, which will be sent by email, text or WhatsApp, as well as through an in-app notification, teenagers and parents must be enrolled in Instagram’s parental supervision tools. In other words, users need to opt in to get the messages.

Meta stressed in its news release that a “vast majority of teens do not try to search for suicide and self-harm content on Instagram,” and highlighted an existing policy of blocking searches for such content.

Parents who tap on a notification will see a “full-screen message” explaining that their teen has repeatedly searched for harmful content within a short period, the company said. Parents will then “have the option to view expert resources designed to help them approach potentially sensitive conversations with their teen.”

The new alerts will be available in the United States, Britain, Australia and Canada starting next week, the company said, while other regions of the world will have access to the alerts later this year.

Meta is on trial in two U.S. states.

The new feature is being introduced as Meta is defending itself in landmark trials against claims that its platforms are addictive and have harmed young users.

In a series of trials, the plaintiffs’ lawyers are testing a novel legal theory claiming that Meta and other social media companies have caused personal injury through defective products.

Instagram has denied that its platform is “clinically addictive.”

Adam Mosseri, the platform’s chief executive, testified in Los Angeles County Superior Court earlier this month that social media could cause some harm, but that the company was careful to test features used by young people before releasing them.

He added that people could be addicted to social media in the same way they could be addicted to a good television show, but that did not mean they were “clinically addicted.”

Meta is a defendant in a separate trial in New Mexico state court over charges brought by the state attorney general, citing allegations that the company’s technology is addictive and facilitates child exploitation.

What are the experts saying?

The new alerts may be a “step in the right direction,” but they are not “a replacement for improving their platform design and content management policies, which should be protecting children from harmful content online already,” said Kris Perry, the executive director of Children and Screens: Institute of Digital Media and Child Development, a nonprofit that researches the impact of digital media on children.

Because users must enroll in Instagram’s supervision tools to get the alerts, Ms. Perry said, there are built-in limits to how effective they can be. Some parents don’t opt in to parental supervision, she said, and many children have figured out how to work around those guardrails anyway, for instance by using secret accounts or coded language that wouldn’t trigger an alert.

The new feature, she said, continues Instagram’s longstanding pattern of placing the onus for child supervision on the parent rather than the platform itself.

Ms. Perry said Meta should let the public know as soon as possible if the alerts are having a measurable effect on the amount of harmful content that teenage users are seeing, “so we could see if this idea is effective or if it wasn’t effective” and “so that we could ask for additional steps on their part.”

“We won’t know that until it’s implemented and we see the results — in other words, reduced content being shared with kids, perhaps lower suicide rates, greater satisfaction on the part of parents,” she said, adding that Meta has rolled out safety features in the past that didn’t achieve their intended goals.

“We have to be cautious,” she said, “and request transparency here so that we can all see whether these ideas actually work.”

Aimee Ortiz covers breaking news and other topics.

The post Instagram to Alert Parents to Teens’ Self-Harm Searches appeared first on New York Times.

DNYUZ © 2026