DNYUZ

Instagram to Alert Parents to Teens’ Self-Harm Searches

February 28, 2026

Instagram will begin notifying parents when their teenage children use the platform to repeatedly search for terms related to suicide or self-harm.

The alerts “are designed to give parents the information they need to support their teen and come with expert resources to help parents approach these sensitive conversations,” Meta, which owns Instagram, said in a news release on Thursday.

Meta is promoting the new alerts as it stands trial in two states over claims that its platforms are addictive and have harmed young users.

Here’s what to know:

Users must opt in to get the alerts.

To receive the alerts, which will be sent by email, text or WhatsApp, as well as through an in-app notification, teenagers and parents must be enrolled in Instagram’s parental supervision tools. In other words, users need to opt in to get the messages.

Meta stressed in its news release that a “vast majority of teens do not try to search for suicide and self-harm content on Instagram,” and highlighted an existing policy of blocking searches for such content.

Parents who tap on a notification will see a “full-screen message” explaining that their teen has repeatedly searched for harmful content within a short period, the company said. Parents will then “have the option to view expert resources designed to help them approach potentially sensitive conversations with their teen.”

The new alerts will be available in the United States, Britain, Australia and Canada starting next week, the company said, while other regions of the world will have access to the alerts later this year.

Meta is on trial in two U.S. states.

The new feature is being introduced as Meta is defending itself in landmark trials against claims that its platforms are addictive and have harmed young users.

In a series of trials, the plaintiffs’ lawyers are testing a novel legal theory claiming that Meta and other social media companies have caused personal injury through defective products.

Instagram has denied that its platform is “clinically addictive.”

Adam Mosseri, the platform’s chief executive, testified in Los Angeles County Superior Court earlier this month that social media could cause some harm, but that the company was careful to test features used by young people before releasing them.

He added that people could be addicted to social media in the same way they could be addicted to a good television show, but that did not mean they were “clinically addicted.”

Meta is also a defendant in a separate trial in New Mexico state court, where the state attorney general alleges that the company’s technology is addictive and facilitates child exploitation.

What are the experts saying?

The new alerts may be a “step in the right direction,” but they are not “a replacement for improving their platform design and content management policies, which should be protecting children from harmful content online already,” said Kris Perry, the executive director of Children and Screens: Institute of Digital Media and Child Development, a nonprofit that researches the impact of digital media on children.

Because users must enroll in Instagram’s supervision tools to receive the alerts, Ms. Perry said, there are already limits to how effective they can be. Some parents don’t opt in to parental supervision, she said, and many children have figured out how to work around those guardrails anyway, for instance by using secret accounts or coded language that wouldn’t trigger the alert.

The new feature, she said, continues Instagram’s longstanding pattern of placing the onus for child supervision on the parent rather than the platform itself.

Ms. Perry said Meta should let the public know as soon as possible if the alerts are having a measurable effect on the amount of harmful content that teenage users are seeing, “so we could see if this idea is effective or if it wasn’t effective” and “so that we could ask for additional steps on their part.”

“We won’t know that until it’s implemented and we see the results — in other words, reduced content being shared with kids, perhaps lower suicide rates, greater satisfaction on the part of parents,” she said, adding that Meta has rolled out safety features in the past that didn’t achieve their intended goals.

“We have to be cautious,” she said, “and request transparency here so that we can all see whether these ideas actually work.”

Aimee Ortiz covers breaking news and other topics.

The post Instagram to Alert Parents to Teens’ Self-Harm Searches appeared first on New York Times.
