A federal judge has sent another message to lawyers who may be tempted to use generative artificial intelligence: Always check your work.
In a decision issued on Monday, Judge Nina Y. Wang of the U.S. District Court for the District of Colorado imposed sanctions on two lawyers who represented Mike Lindell, the founder of MyPillow, who is known for spreading conspiracy theories about the 2020 presidential election.
In February, Judge Wang said, the lawyers filed a court brief in a defamation case brought against Mr. Lindell that contained “nearly 30 defective citations.” It misquoted court cases, misrepresented principles of law and, “most egregiously,” cited “cases that do not exist,” she wrote.
Judge Wang said the lawyers, Christopher I. Kachouroff and Jennifer T. DeMaster, had not explained how such errors could have ended up in the filing “absent the use of generative artificial intelligence or gross carelessness by counsel.”
She found that they had violated a federal rule that requires lawyers to certify that the claims they are making in court filings are grounded in the law. She fined them $3,000 each, calling it “the least severe sanction adequate to deter and punish defense counsel in this instance.”
Mr. Kachouroff and Ms. DeMaster did not immediately respond on Tuesday to messages seeking comment on the fines.
At a hearing in April, Judge Wang asked Mr. Kachouroff if the error-filled court filing had been produced using generative artificial intelligence.
“Not initially,” Mr. Kachouroff responded, according to court documents. “Initially, I did an outline for myself, and I drafted a motion, and then we ran it through A.I.”
Judge Wang also asked Mr. Kachouroff if he had double-checked the citations in the filing.
“Your Honor, I personally did not check it,” he said, according to court documents. “I am responsible for it not being checked.”
Mr. Kachouroff said later in court papers that he had been “taken by complete surprise” by the judge’s line of questioning.
“I did not understand what was going on because my co-counsel and I had not relied on A.I. legal research and had prepared a thoroughly cite-checked final document to be filed,” Mr. Kachouroff said.
But he said his co-counsel, Ms. DeMaster, “mistakenly filed” a draft version of the court filing instead of the final version that they had “carefully cite-checked and edited.” When she uploaded the document, Mr. Kachouroff said, he was on vacation in Mexico and had limited internet access.
Mr. Kachouroff added that while he often uses A.I. programs to analyze his arguments and those of his opposition, “I do not rely on A.I. to do legal research or find cases.”
“Regardless of whether I use A.I. in a particular pleading, I always conduct verification of citations before filing,” he added.
Ms. DeMaster said in her own filing in April that she had mistakenly uploaded the draft version of the court filing, thinking it was the final version.
“I sincerely apologize to the court for filing the wrong version and the inconvenience this has caused to this court and the parties,” she wrote. “It was not done intentionally, to mislead the court, in bad faith, or for any improper purpose.”
Judge Wang said in her decision on Monday that she was not persuaded that the filing “was simply an inadvertent error, given the contradictory statements and the lack of corroborating evidence.”
Mr. Lindell lost the defamation case last month and was ordered to pay $2.3 million in damages to Eric Coomer, a former employee of Dominion Voting Systems. Mr. Coomer had accused Mr. Lindell of calling him “a traitor to the United States” and of saying that he should turn himself in to the authorities, according to court filings.
The sanctions came as lawyers — along with students, teachers, journalists and others — have been wrestling with the proper way to harness the immense powers of artificial intelligence while ensuring the work they produce is original and accurate.
The American Bar Association, in its first ethical guidance for lawyers on the use of generative A.I., noted last year that many lawyers had been using A.I. tools to improve the efficiency and quality of the legal services they provide.
But the bar association warned that “uncritical reliance” on such tools “can result in inaccurate legal advice to clients or misleading representations to courts and third parties.”
It said that lawyers should exercise “an appropriate degree of independent verification or review” of A.I.-generated content to ensure they are not violating their duty to provide competent representation to their clients.
Last month, the High Court of England and Wales warned lawyers that they could face criminal prosecution for presenting false material generated by artificial intelligence after a series of cases cited made-up quotes and rulings that did not exist.
Michael Levenson covers breaking news for The Times from New York.