Character.AI to Bar Children Under 18 From Using Its Chatbots

October 29, 2025

Character.AI said on Wednesday that it would bar people under 18 from using its chatbots starting late next month, in a sweeping move to address concerns over child safety.

The rule will take effect Nov. 25, the company said. To enforce it, Character.AI will spend the next month identifying which users are minors and placing time limits on their use of the app. Once the measure begins, those users will not be able to converse with the company’s chatbots.

“We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them,” Karandeep Anand, Character.AI’s chief executive, said in an interview. He said the company also planned to establish an A.I. safety lab.

The moves follow mounting scrutiny over how chatbots, sometimes called A.I. companions, can affect users’ mental health. Last year, Character.AI was sued by the family of Sewell Setzer III, a 14-year-old in Florida who killed himself after constantly texting and conversing with one of Character.AI’s chatbots. His family accused the company of being responsible for his death.

The case became a lightning rod for how people can develop emotional attachments to chatbots, with potentially dangerous results. Character.AI has since faced other lawsuits over child safety. A.I. companies including the ChatGPT maker OpenAI have also come under scrutiny for their chatbots’ effects on people — especially youths — if they have sexually explicit or toxic conversations.

In September, OpenAI said it planned to introduce features intended to make its chatbot safer, including parental controls. This month, Sam Altman, OpenAI’s chief executive, posted on social media that the company had “been able to mitigate the serious mental health issues” and would relax some of its safety measures.

(The New York Times has sued OpenAI and Microsoft, claiming copyright infringement of news content related to A.I. systems. The two companies have denied the suit’s claims.)

In the wake of these cases, lawmakers and other officials have begun investigations and proposed or passed legislation aimed at protecting children from A.I. chatbots. On Tuesday, Senators Josh Hawley, Republican of Missouri, and Richard Blumenthal, Democrat of Connecticut, introduced a bill to bar A.I. companions for minors, among other safety measures.

Gov. Gavin Newsom this month signed a California law that requires A.I. companies to have safety guardrails on chatbots. The law takes effect Jan. 1.

“The stories are mounting of what can go wrong,” said Steve Padilla, a Democrat in California’s State Senate, who had introduced the safety bill. “It’s important to put reasonable guardrails in place so that we protect people who are most vulnerable.”

Mr. Anand of Character.AI did not address the lawsuits his company faces. He said the start-up wanted to set an example on safety for the industry “to do far more than what the regulation might require.”

Character.AI was founded in 2021 by Noam Shazeer and Daniel De Freitas, two former Google engineers, and raised nearly $200 million from investors. Last year, Google agreed to pay about $3 billion to license Character.AI’s technology, and Mr. Shazeer and Mr. De Freitas returned to Google.

Character.AI allows people to create and share their own A.I. characters, such as custom anime avatars, and it markets the app as A.I. entertainment. Some personas can be designed to simulate girlfriends, boyfriends or other intimate relationships. Users pay a monthly subscription fee, starting at about $8, to chat with the companions. Until its recent turn toward restricting underage users, Character.AI did not verify ages when people signed up.

Last year, researchers at the University of Illinois Urbana-Champaign analyzed thousands of posts and comments that young people had left in Reddit communities dedicated to A.I. chatbots, and interviewed teenagers who used Character.AI, as well as their parents. The researchers concluded that the A.I. platforms did not have sufficient child safety protections, and that parents did not fully understand the technology or its risks.

“We should pay as much attention as we would if they were chatting with strangers,” said Yang Wang, one of the university’s information science professors. “We shouldn’t discount the risks just because these are nonhuman bots.”

Character.AI has about 20 million monthly users, with less than 10 percent of them self-reporting as being under the age of 18, Mr. Anand said.

Under Character.AI’s new policies, the company will immediately place a two-hour daily limit on users under the age of 18. Starting Nov. 25, those users cannot create or talk to chatbots, but can still read previous conversations. They can also generate A.I. videos and images through a structured menu of prompts, within certain safety limits, Mr. Anand said.

He said the company had enacted other safety measures in the past year, such as parental controls.

Going forward, it will use technology to detect underage users based on conversations and interactions on the platform, as well as information from any connected social media accounts, he said. If Character.AI thinks a user is under 18, the person will be notified to verify his or her age.

Dr. Nina Vasan, a psychiatrist and director of a mental health innovation lab at Stanford University that has done research on A.I. safety and children, said it was “huge” that a chatbot maker would bar minors from using its app. But she said the company should work with child psychologists and psychiatrists to understand how suddenly losing access to A.I. companions would affect young users.

“What I worry about is kids who have been using this for years and have become emotionally dependent on it,” she said. “Losing your friend on Thanksgiving Day is not good.”

Natallie Rocha is a San Francisco-based technology reporter and a member of the 2025-26 Times Fellowship class, a program for early-career journalists.

Kashmir Hill writes about technology and how it is changing people’s everyday lives with a particular focus on privacy. She has been covering technology for more than a decade.

The post Character.AI to Bar Children Under 18 From Using Its Chatbots appeared first on New York Times.


Copyright © 2025.
