
Judge declines to punish Michael Cohen for relying on artificial intelligence

A Manhattan judge on Wednesday declined to impose sanctions on Michael D. Cohen, the former fixer for former President Donald J. Trump, after he accidentally gave his lawyer false legal citations concocted by Google Bard, an artificial intelligence program, for a motion the lawyer was preparing on Mr. Cohen’s behalf.

The lawyer, David M. Schwartz, cited the bogus cases in the motion, which was filed in Federal District Court.

The judge, Jesse M. Furman, said the episode was embarrassing and unfortunate, but he had accepted Mr. Cohen’s explanation that he did not understand how Google Bard worked and that he had not intended to deceive Mr. Schwartz. The judge also said he had not found that Mr. Schwartz had acted in bad faith.

“Indeed, it would have been downright irrational for him to have presented Schwartz with false cases to include in the motion, knowing them to be bogus,” Judge Furman wrote of Mr. Cohen, a former lawyer who has been disbarred, given the likelihood that Mr. Schwartz, the government or the court would discover the problem, “with potentially serious adverse consequences for Cohen himself.”

The issue was raised in a case involving tax evasion and campaign finance violations committed by Mr. Cohen on behalf of Mr. Trump. Mr. Cohen pleaded guilty in 2018 and served time in prison. He had asked for an early end to the court’s supervision of his case, after he was released from prison and fulfilled the conditions of his release.

Judge Furman had denied three previous such requests from Mr. Cohen. In his latest filing, his lawyer, Mr. Schwartz, pointed out that his client testified for two days last fall in the civil fraud trial of Mr. Trump in New York state. Mr. Cohen’s “willingness to come forward and make truthful statements,” Mr. Schwartz argued, “demonstrates an exceptional level of remorse and a commitment to upholding the law.”

But Judge Furman said that Mr. Cohen’s testimony in the state trial “in fact provides grounds for denying his request, not granting it.” The judge cited Mr. Cohen’s testimony in the civil trial in which he admitted that he lied in federal court when he pleaded guilty to tax evasion, which he now says he did not commit.

A lawyer for Mr. Cohen did not immediately respond to a request for comment on Judge Furman’s ruling.

Mr. Cohen’s credibility will be at the heart of Mr. Trump’s first criminal trial, set to begin in Manhattan in mid-April. Mr. Cohen, one of the prosecution’s star witnesses, was involved in the hush-money deal at the center of the case, brought by the Manhattan district attorney’s office. Mr. Trump’s lawyers could try to use Mr. Cohen’s inconsistent statements during the civil fraud trial, and possibly even Judge Furman’s ruling, to portray him as a liar. But the district attorney’s office will likely counter that Mr. Cohen has told many of his previous lies on Mr. Trump’s behalf, and that he has been telling a consistent story about the hush-money deal for years.

The judge overseeing the civil fraud trial, Arthur F. Engoron, had said he found Mr. Cohen’s testimony “credible” and imposed a crushing $454 million judgment on Mr. Trump.

It was in his request to end judicial supervision of his case that Mr. Cohen sought to assist his lawyer, Mr. Schwartz.

Mr. Cohen said in an affidavit in December that he had not kept pace with “emerging trends (and associated risks) in legal technology and did not realize that Google Bard was a generative text service that, like ChatGPT, could show citations and descriptions that looked real, but in reality were not.”

Mr. Cohen also said he had not realized that Mr. Schwartz “would drop the cases in their entirety without even confirming that they existed.”

Mr. Cohen asked Judge Furman to exercise “discretion and mercy.”

The case is one of several to surface in Manhattan federal court in the past year in which the use of artificial intelligence has tainted legal proceedings. Nationally, there have been at least 15 cases in which lawyers or litigants representing themselves were alleged to have used chatbots for legal research that ended up in court filings, according to Eugene Volokh, a law professor at UCLA who has written about artificial intelligence and the law.

The issue entered the public consciousness last year after Judge P. Kevin Castel, also of Manhattan federal court, fined two lawyers $5,000 after they admitted to submitting a brief filled with nonexistent decisions and legal citations generated by ChatGPT.

A series of similar cases in federal courts in Manhattan followed.

In one case, an attorney acknowledged that she had cited a “non-existent case” – Matter of Bourguignon v. Coulated Behavioral Health Services, Inc. – which she said had been “suggested by ChatGPT” after her own research failed to turn up a decision supporting an argument she was making. In January, the U.S. Court of Appeals for the Second Circuit referred her to a court panel that investigates complaints against attorneys.

And in another case, Federal District Court Judge Paul A. Engelmayer rebuked a law firm in Auburn, N.Y., that had openly admitted using ChatGPT to substantiate a request for attorney’s fees in a lawsuit against the New York City Department of Education.

Judge Engelmayer wrote that the firm’s invocation of ChatGPT as support for its aggressive fee bid was “completely and unusually unpersuasive.”

The cases highlight the challenges facing the legal profession as lawyers increasingly rely on chatbots to draft legal briefs. The artificial intelligence programs, such as ChatGPT and Bard (now known as Gemini), generate realistic responses by guessing which text fragments should follow other sequences.

Mr. Cohen wrote in his statement that he understood Bard to be “a supercharged search engine” that he had used in the past to obtain accurate information. The cases he found and passed on to Mr. Schwartz appear to have been “hallucinations” – a term used to refer to chatbot-generated inaccuracies.

The episode became public in December when Judge Furman said in an order that he could not find any of the three decisions Mr. Schwartz cited in his motion. He ordered Mr. Schwartz to provide him with copies of the decisions or “a thorough explanation of how the motion was made to cite cases that do not exist and what role Mr. Cohen played.”

Mr. Schwartz said in his own statement that he had not independently reviewed the cases Mr. Cohen had presented because Mr. Cohen indicated that another attorney had given him suggestions for the motion.

“I sincerely apologize to the court for not personally reviewing these matters before submitting them to the court,” Mr. Schwartz wrote.

Barry Kamins, a lawyer for Mr. Schwartz, said Wednesday: “We are pleased that the court has deemed this error as one not made in bad faith by Mr. Schwartz.”

Ben Protess contributed reporting.
