DNYUZ
People are using ChatGPT as a lawyer in court. Some are winning.

October 8, 2025

Even as some litigants have found success in small-claims disputes, legal professionals who spoke to NBC News say AI-drafted court documents are often littered with inaccuracies and faulty reasoning.

Holmes said litigants “will use a case that ChatGPT gave them, and when I go to look it up, it does not exist. Most of the time, we get them dismissed for failure to state an actual claim, because a lot of times it’s just kind of, not to be rude, but nonsense.” AI models often generate information that is false or misleading but presented as fact, a phenomenon known as “hallucination.” Chatbots are trained on vast datasets to predict the most likely response to a query but sometimes encounter gaps in their knowledge. In these cases, the model may attempt to fill in the missing pieces with its best approximation, which can result in inaccurate or fabricated details.

For litigants, AI hallucinations can lead to pricey penalties. Jack Owoc, a colorful Florida-based energy drink mogul who lost a false advertising case to the tune of $311 million and is now representing himself, was recently sanctioned for filing a court motion with 11 AI-hallucinated citations referencing court cases that do not exist.

Owoc admitted he had used generative AI to draft the document due to his limited finances. He was ordered to complete 10 hours of community service and is now required to disclose whether he uses AI in all future filings in the case.

“Just like a law firm would have to check the work of a junior associate, so do you have to check the work generated by AI,” Owoc said via email.

Holmes and other legal professionals say there are common telltale signs of careless AI use, such as citations to nonexistent case law, filler language that was left in, and ChatGPT-style emoji or formatting that looks nothing like a typical legal document.

Damien Charlotin, a legal researcher and data scientist, has organized a public database tracking legal decisions in cases where litigants were caught using AI in court. He’s documented 282 such cases in the U.S. and more than 130 from other countries dating back to 2023.

“It really started to accelerate around the spring of 2025,” Charlotin said.

And the database is far from exhaustive. It only tracks cases where the established or alleged use of AI was directly addressed by the court, and Charlotin said most of its entries are referred to him by lawyers or other researchers. He noted that there are generally three types of AI hallucinations:

“You got the fabricated case law, and that’s quite easy to spot, because the case does not exist. Then you got the false quotations from existing case law. That’s also rather easy to spot because you do control-F,” Charlotin said. “And then there is misrepresented case law. That’s much harder, because you’re citing something that exists but you’re totally misrepresenting it.”

Earl Takefman has experienced AI’s hallucinatory tendencies firsthand. He is currently representing himself in several cases in Florida regarding a pickleball business deal gone awry and started using AI to help him in court last year.

“It never for a second even crossed my mind that ChatGPT would totally make up cases, and unfortunately, I found out the hard way,” Takefman told NBC News.

Takefman realized his mistake when the opposing counsel pointed out a hallucinated case in one of Takefman’s filings. “I went back to ChatGPT and told it that it really f—-d me over,” he said. “It apologized.”

A judge admonished Takefman for citing the same nonexistent case — an imaginary one from 1995 called Hernandez v. Gilbert — in two separate filings, among other missteps, according to court documents.

Embarrassed about the oversight, Takefman resolved to be more careful. “So I said, ‘OK, I know how to get around it. I’m going to ask ChatGPT to give me actual quotations from the court case I want to reference. Surely they would never make up an actual quotation.’ And it turns out they were making that up too!”

“I certainly did not intend to mislead the court,” Takefman said. “They take it very, very seriously and don’t let you off the hook because you’re a pro se litigant.”

In late August, the court ordered Takefman to show cause for why he should not be sanctioned over his mistakes. The court accepted Takefman’s apology and declined to impose sanctions.

The experience has not deterred Takefman from using AI in his court dealings.

“Now, I check between different applications, so I’ll take what Grok gives me and give it to ChatGPT and see if it agrees — that all the cases are real, that there aren’t any hallucinations, and that the cases actually mean what the AI thinks they mean,” Takefman said.

“Then, I put all of the cases into Google to do one last check to make sure that the cases are real. That way, I can actually say to a judge that I checked the case and it exists,” Takefman said.

So far, the majority of AI hallucinations in Charlotin’s database come from pro se litigants, but many have also come from lawyers themselves.

Earlier this month, a California court ordered an attorney to pay a $10,000 fine for filing a state court appeal in which 21 of the 23 quotes from cited cases were hallucinated by ChatGPT. It appears to be the largest-ever fine issued over AI fabrications, according to CalMatters.

“I can understand more easily how someone without a lawyer, and maybe who feels like they don’t have the money to access an attorney, would be tempted to rely on one of these tools,” said Robert Freund, an attorney who regularly contributes to Charlotin’s database. “What I can’t understand is an attorney betraying the most fundamental parts of our responsibilities to our clients … and making these arguments that are based on total fabrication.”

Freund, who runs a law firm in Los Angeles, said the influx of AI hallucinations wastes both the court’s and the opposing party’s time by forcing them to use up resources identifying factual inaccuracies. Even after a judge admonishes someone caught filing AI slop, sometimes the same plaintiff continues to flood the court with AI-generated filings “filled with junk.”

Matthew Garces, a registered nurse in New Mexico who’s a strong proponent of using AI to represent himself in legal matters, is currently involved in 28 federal civil suits, including 10 active appeals and several petitions to the Supreme Court. These cases cover a range of topics, including medical malpractice, housing disputes between Garces and his landlord, and alleged improper judicial conduct toward Garces.

After noting that Garces submitted documents referencing numerous nonexistent cases, a panel of judges from the 5th U.S. Circuit Court of Appeals recently criticized Garces’ prolific filing of new cases, writing that he is “WARNED FOR A SECOND TIME” to avoid any “future frivolous, repetitive, or otherwise abusive filings” or risk increasingly severe penalties.

A magistrate judge in another of Garces’ cases also recommended that he be banned from filing any lawsuits without the express authorization of a more senior judge, and that Garces be classified as “a vexatious litigant.”

Still, Garces told NBC News that “AI provides access to the courthouse doors that money often keeps closed. Managing nearly 30 federal suits on my own would be nearly impossible without AI tools to organize, research and prepare filings.”

As the use of AI in court grows, some pro bono legal clinics are now trying to teach their self-representing clients to use AI in ways that help rather than harm them — without offering direct legal advice.

“This is the most exciting time to be a lawyer,” said Zoe Dolan, a supervising attorney at Public Counsel, a nonprofit public interest law firm and legal advocacy center in Los Angeles. “The amount of impact that any one advocate can now have is only sort of limited by our imagination and organizational structures.”

Last year, Dolan helped create a class for self-represented litigants in Los Angeles County to learn how to leverage AI in their cases. The class taught participants how to use various prompts to create documents, how to fact-check the AI systems’ outputs and how to use chatbots to verify other chatbots’ work.

Several of the litigants who took the class, including White, have gone on to win their cases while using AI.

Numerous legal professionals railing against the sloppy use of AI in court also say that they’re not opposed to the use of AI among lawyers more generally. In fact, many say they feel optimistic about AI adoption by legal professionals who have the expertise to analyze and verify its outputs.

Andrew Montez, an attorney in Southern California, said that despite his firm “seeing pro se litigants constantly using AI” over the past six months, he himself has found AI tools useful as a starting point for research or brainstorming. He said he never inputs real client names or confidential information, and he checks every citation manually.

While AI cannot substitute for his own legal research and analysis, Montez said, these systems enable lawyers to write better-researched briefs more quickly.

“Going forward in the legal profession, all attorneys will have to use AI in some way or another. Otherwise they will be outgunned,” Montez said. “AI is the great equalizer. Internet research, to a certain extent, made law libraries obsolete. I think AI is really the next frontier.”

As for pro se litigants without legal expertise, Montez said he believes most cases are too complex for AI alone to grasp the full context and produce analysis strong enough to carry someone to victory in court. But he noted that he could envision a future in which more people use AI to successfully represent themselves, especially in small claims courts.

White, who avoided eviction this year with the help of ChatGPT and Perplexity.ai, said she views AI as a way to level the playing field. When asked what advice she would give to other pro se litigants, she thought it was fitting to craft a reply with ChatGPT.

The post People are using ChatGPT as a lawyer in court. Some are winning. appeared first on NBC News.


Copyright © 2025.
