Prosecutor Used Flawed A.I. to Keep a Man in Jail, His Lawyers Say

November 25, 2025

When Kyle Kjoller, a 57-year-old welder, was ordered held without bail in Nevada County, Calif., in April, he protested. The charges against him — multiple counts of illegal gun possession — were not grave enough under California law to warrant keeping him in jail for months awaiting his trial, he argued.

Prosecutors disagreed, and offered 11 pages’ worth of reasons. But the brief they filed, Mr. Kjoller’s lawyers contend, was rife with errors that bear the hallmarks of generative artificial intelligence.

The lawyers soon turned up briefs in four separate cases, including Mr. Kjoller’s, that were filled with mistakes, all of them from the office of the same prosecutor, District Attorney Jesse Wilson. The mistakes included wholesale misinterpretations of the law, as well as quotations that do not actually appear in the cited texts.

Those are the sorts of errors that generative A.I. tends to make.

Mr. Wilson has acknowledged that the briefs contain numerous errors, but he has said that A.I. was used to draft only one of them, and not the one filed in Mr. Kjoller’s case.

That answer has not satisfied Mr. Kjoller’s lawyers, who have asked the California Supreme Court to investigate whether the briefs indicate a “wider pattern” of prosecutors asking courts to rule against defendants “on the basis of nonexistent case citations and holdings.”

“Prosecutors’ reliance on inaccurate legal authority can violate ethical rules, and represents an existential threat to the due process rights of criminal defendants and the legitimacy of the courts,” they wrote.

On Friday, the lawyers were joined by a group of 22 legal and technology scholars who warned that the unchecked use of A.I. could lead to wrongful convictions. The group, which filed its own brief with the state Supreme Court, included Barry Scheck, a co-founder of the Innocence Project, which has helped to exonerate more than 250 people; Chesa Boudin, a former district attorney of San Francisco; and Katherine Judson, executive director of the Center for Integrity in Forensic Sciences, a nonprofit that seeks to improve the reliability of criminal prosecutions.

The problem of A.I.-generated errors in legal papers has burgeoned along with the popular use of tools like ChatGPT and Gemini, which can perform a wide range of tasks, including writing emails, term papers and legal briefs. Lawyers and even judges have been caught filing court papers that were rife with fake legal references and faulty arguments, leading to embarrassment and sometimes hefty fines.

The Kjoller case, though, is one of the first in which prosecutors, whose words carry great sway with judges and juries, have been accused of using A.I. without proper safeguards.

“Our office continues to learn the dynamics of AI-assisted legal work and its pitfalls,” Mr. Wilson wrote in a statement, adding that the office had instituted an A.I. policy directive and conducted new staff trainings.

Lawyers are not prohibited from using A.I., but they are required to ensure that their briefs, however they are written, are accurate and faithful to the law. Today’s artificial intelligence tools are known to sometimes “hallucinate,” or make things up, especially when asked complex legal questions.

A.I. tools that are specifically designed to aid in legal research perform somewhat better than general-purpose tools like ChatGPT, experts say. The Nevada County district attorney’s office began using a specialized tool, Westlaw Advantage, in September, the same day it filed the last of the four flawed briefs identified by Mr. Kjoller’s lawyers.

Westlaw executives said that their A.I. tool does not write legal briefs, because they believe A.I. is not yet capable of the complex reasoning needed to do so. The research reports that their tool generates contain citations, so that lawyers can click through and read each of the cited cases, they said, and the reports include a warning that they may contain errors. “No one should be putting a document forward to the court and citing a bunch of cases that they’ve never read,” said Mike Dahn, the head of product at Westlaw.

Mr. Kjoller’s lawyers say the four briefs in question attribute opinions to the wrong courts, identify them with incorrect citations, and misname them or misquote them. More significantly, the lawyers said, the briefs misconstrue the opinions and the state constitution.

In Mr. Kjoller’s case, Madison Maxwell, a deputy district attorney, filed a brief saying that the state constitution says pretrial detention may be imposed “when the facts are evident or the presumption great, and release would create a substantial likelihood of great bodily harm to others.” But according to the provision cited in the filing, that would be true only in cases where a person is charged with committing a violent felony or a felony sexual assault. Mr. Kjoller was not accused of either type of crime.

In another of the cases, a father accused of exposing his 2-year-old child to a potentially deadly dose of fentanyl asked the court to order mental health treatment instead of a prison sentence. In opposing the request, Ms. Maxwell’s brief cited an appellate court opinion, Sarmiento v. Superior Court, to argue that judges had broad discretion to deny the mental health treatment option in order to protect the public.

The Sarmiento opinion, however, actually made the opposite point, ruling that a trial court judge had gone too far in denying treatment.

Ms. Maxwell is the author of three of the four briefs in question. According to filings by Mr. Kjoller’s team, when she was asked about the filing in his case, she explained that the mistakes in the brief “were purely a result of her working on multiple matters, being constantly in court, responding to multiple briefings, and going too fast in her research and drafting.”

Through a colleague, Ms. Maxwell referred questions to her boss, Mr. Wilson.

He wrote in his statement, “Regardless of the source of any citation error — whether arising from the use of artificial intelligence or traditional human error — our office remains firmly committed to the highest standards of integrity.”

Gary Marchant, a law professor at Arizona State University who frequently gives talks to judges and lawyers on the use of artificial intelligence, said that A.I.-generated errors in court papers are usually the result of negligence rather than intentional deceit. Even so, he said, users should be aware that a common flaw of A.I. tools is sycophancy, or telling people what they want to hear.

“If you ask the A.I. to give you cases to support an argument, it will often — just like a human lawyer would — stretch to find something,” he said.

Damien Charlotin, a senior researcher at HEC Paris, maintains a database that includes more than 590 cases from around the world in which courts and tribunals have detected hallucinated content. More than half involved people who represented themselves in court. Two-thirds of the cases were in United States courts. Only one, an Israeli case, involved A.I. use by a prosecutor.

Mr. Kjoller is being represented by a public defender, Thomas Angell, who filed the A.I. complaint with the help of Civil Rights Corps, a nonprofit group that works to reduce the detention of people awaiting trial. When the lawyers discovered the errors in the prosecution’s brief in Mr. Kjoller’s case, they asked the Third District Court of Appeal to order the prosecution to show why it should not be subjected to sanctions, including having to reimburse the defense team $23,000, which they said was the cost to identify and fight the errors.

The court denied that motion without explanation, but it did order a new bail hearing for Mr. Kjoller.

Mr. Angell and the Civil Rights Corps lawyer, Carson White, declined to comment for this article.

Mr. Kjoller’s lawyers then discovered a brief filed by Mr. Wilson’s office in another case that they said contained similar errors. That one had already caught the eye of a judge, who warned that A.I.-generated errors were “something that certainly everybody should be on notice about.”

Mr. Kjoller’s lawyers renewed their request for sanctions against the prosecutor’s office. But shortly after that, Mr. Kjoller was convicted at trial, and the Court of Appeal again denied the sanction request.

Kate Chatfield, executive director of the California Public Defenders Association, said she was surprised that the appellate court had appeared to shrug off the allegations, because other California courts have recently issued several A.I.-related scoldings to lawyers, and even imposed a $10,000 fine.

When prosecutors are involved, the stakes are even higher, Ms. Chatfield said. “A judge could rely on that fake authority, and as a result, your liberty could be taken, your bail could be raised, you could be taken into custody,” she said.

At a hearing on Nov. 6 in one of the cases involving questionable filings, Ms. Maxwell offered a string of corrections and clarifications to her brief. Asked about passages that were in quotation marks but did not appear in the cited source, she said that they should have appeared in italics instead.

Mr. Angell, who represented the defendant in that case, took issue with her explanations. “It’s high school English — when you put something in quotations, you’re directly citing to language attributed to another party,” he said in the hearing.

Shaila Dewan covers criminal justice — policing, courts and prisons — across the country. She has been a journalist for 25 years.

The post Prosecutor Used Flawed A.I. to Keep a Man in Jail, His Lawyers Say appeared first on New York Times.
