DNYUZ
Meta AI chatbots can have sexually explicit conversations with underage users

April 28, 2025

If you needed another reason to avoid Meta AI like the plague, beyond Meta forcing its AI chatbot into all of its social apps and seeking to turn your public data into training material for the AI, an investigation has found an incredibly disturbing one: Meta AI chatbots, including some voiced by celebrities, can engage in sexually explicit chats with underage users.

This isn’t the first time we’ve heard about AI chatbots used for companionship, including sexual fantasies. But The Wall Street Journal says that Meta might not change anything. The directive is apparently coming from Mark Zuckerberg, who thinks AI chatbots will be the next big thing on social media, and he doesn’t want to miss out on it like Meta did with the Snapchat and TikTok trends.

Meta makes its AI chatbots available in its social apps, and the company licensed the voices of well-known stars like Kristen Bell, John Cena, and Judi Dench to voice some of them. It also licensed characters from Disney for some of these chatbots.

Meta AI users can create their own chatbots, giving them specific personalities or using existing ones.

The Journal’s tests found that Meta AI chatbots would routinely steer the conversation towards sex, even when these AI models knew they were talking to underage users who shouldn’t have access to such content.

Meta called The Journal’s tests manipulative and unrepresentative of how most people engage with AI companions. Still, the company made changes to its Meta AI products in response to the paper’s findings.

Accounts registered to minors can no longer access sexual role-play via Meta AI. Also, the company apparently cut down on Meta AI’s capacity to engage in sexually explicit conversations when using licensed voices and personas.

Disney wasn’t happy to hear that some of its characters might be used in such ways by Meta AI – here’s what a spokesperson told The Journal:

We did not, and would never, authorize Meta to feature our characters in inappropriate scenarios and are very disturbed that this content may have been accessible to its users—particularly minors—which is why we demanded that Meta immediately cease this harmful misuse of our intellectual property.

Meta AI chatbots once had stronger guardrails in place. The report says that a 2023 Defcon competition showed Meta AI was safer than competitors. The AI was “far less likely to veer into unscripted and naughty territory” than rivals. It was also more boring.

Mark Zuckerberg wasn’t happy with the Meta AI team playing it too safe. He wanted guardrails to be loosened, which led to Meta AI getting the ability to engage in sexually explicit chats. This feature gave adult users access to hypersexualized AI personas and underage users access to AI chatbots willing to engage in fantasy sex with children.

The report also says Zuckerberg had bigger plans for the chatbots, looking to make them more humanlike. For that, he also wanted the chatbots to mine a user’s profile for data that would be used in chats with the AI:

Zuckerberg’s concerns about overly restricting bots went beyond fantasy scenarios. Last fall, he chastised Meta’s managers for not adequately heeding his instructions to quickly build out their capacity for humanlike interaction.

At the time, Meta allowed users to build custom chatbot companions, but he wanted to know why the bots couldn’t mine a user’s profile data for conversational purposes. Why couldn’t bots proactively message their creators or hop on a video call, just like human friends? And why did Meta’s bots need such strict conversational guardrails?

The full Wall Street Journal report, complete with sexually explicit examples from chats with AI, is worth a full read. It’s available at this link.

The post Meta AI chatbots can have sexually explicit conversations with underage users appeared first on BGR.

Copyright © 2025.
