Can you trust AI to keep your secrets? Probably not, lawyers say.

August 31, 2025
Microsoft researchers found that jobs related to providing and communicating information are most likely to be affected by AI. Oscar Wong/Getty

Artificial intelligence chatbots like OpenAI’s ChatGPT are increasingly serving as confidants and stand-in therapists for many users.

But for the average user, sharing your deepest secrets with AI tools can open you up to serious risks. Those conversations are not legally protected in the same way that they would be with, say, a doctor, lawyer, therapist, or even a spouse, attorneys warned.

Two lawyers with expertise in AI-related legal issues told Business Insider that people should exercise caution when conversing with AI chatbots, be familiar with their terms of service and data retention policies, and understand that sensitive chat records, if relevant, could be subpoenaed in a lawsuit or government investigation.

“People are just pouring their hearts out in these chats, and I think they need to be cautious,” said Juan Perla, a partner at the global firm Curtis, Mallet-Prevost, Colt & Mosle LLP.

Perla, a leader in the firm’s AI practice, said, “Right now, there really isn’t anything that would protect them if a court really wanted to get to the chat for some reason related to a litigation.”

OpenAI CEO Sam Altman raised this point during a podcast that aired last month, noting that users, especially young people, are frequently turning to ChatGPT as a therapist or life coach. “People,” the billionaire said, “talk about the most personal shit in their lives to ChatGPT.”

“Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s like legal privilege for it — there’s doctor-patient confidentiality, there’s legal confidentiality,” Altman told podcaster Theo Von. “We haven’t figured that out yet for when you talk to ChatGPT.”

“So if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” Altman said.

This lack of legal confidentiality when using AI tools, Perla said, should make users think twice about how much they choose to share.

Chatbot messages tied to situations like a workplace dispute, divorce, or custody case could be subject to discovery in related litigation, he said. The same goes for messages related to potential criminal activity.

“If you’re putting something into ChatGPT because it’s something that you would normally only share with your medical doctor, with your therapist, or with a lawyer, that should already tell you, ‘I should not be putting this information in here,'” Perla said.

Even if users attempt to shield their identities or speak hypothetically to the chatbots, that won’t fully eliminate the risk.

The “wisest and safest” thing to do is not have those sensitive conversations with AI chatbots at all, Perla said.

“If you’re talking about your personal intimate affairs with a chatbot that have nothing to do with the commission of a crime, that have nothing to do with a dispute or a litigation that could emerge, then the likelihood that these chats are going to be public or be turned over to a court or another party in discovery is pretty low,” Perla said.

Knowing how AI platforms handle data

James Gatto, a partner at Sheppard Mullin who co-leads the firm’s AI industry team of attorneys, told Business Insider that it’s crucial for users to understand how different AI tools handle their data.

Some paid versions of certain AI platforms may offer more robust privacy features, such as the automatic deletion of user inputs, while the free, public versions typically do not, he said.

“If I was going to use a tool for anything sensitive, I’d want a tool that deleted the information,” Gatto said. “And I would want to make sure the terms of service explicitly calls that out.”

If users care about confidentiality and protecting themselves from any kind of future legal risk, they must do their own due diligence, he said.

“The important takeaway is you need to understand the pros and cons of using these tools, you need to understand the legal and personal risk,” Gatto said.

“There may be circumstances where you’re taking some risk, but the worst case scenario is not that bad,” Gatto said. “There are other cases where a worst-case scenario is really bad and you wouldn’t want to do it.”

Perla added that the risk factor should be weighed “any time we’re creating a record — text messages, chats, for sure.”

“The question should be,” Perla said, “am I comfortable with this information ever landing in the hands of somebody else that is not the person that I thought I was having this conversation with, or that is limited to the technology that I was engaged with?”

The post Can you trust AI to keep your secrets? Probably not, lawyers say. appeared first on Business Insider.
