Adam Hyman told the three Maryland appellate judges he did not intend to take up much of their time.

In fact, Hyman said, there were moments when he considered simply relying on the filings he had submitted in the appeal, which stemmed from a divorce case in Harford County.

“Well, there’s an issue here apart from the merits,” Appellate Judge Kathryn Grill Graeff said, “with the brief.”

The filing, she noted, cited numerous cases that did not exist or stood for a different proposition.


“And that’s a serious problem,” Graeff said. “So we would like you to address that, and what actions did you take?”

The source of the problem? Artificial intelligence.

Hyman, a family law attorney in Bel Air, said he was not aware his office had used AI. But he said he had to take responsibility because he submitted the brief under his name.

The phenomenon of lawyers using generative AI platforms, only for those tools to produce citations that are incorrect or do not exist, has emerged as an issue in the legal profession.

The American Bar Association Standing Committee on Ethics and Professional Responsibility in 2024 issued a formal opinion about the use of generative AI. It reminds lawyers that they must “fully consider their applicable ethical obligations.”


This year, a federal judge publicly reprimanded two outside attorneys the state of Maryland hired to defend the conditions at the Baltimore Central Booking & Intake Center after they cited cases that ChatGPT hallucinated in an unrelated lawsuit.

U.S. District Judge Anna M. Manasco also referred the lawyers, Matthew Reeves and William Lunsford, to the Alabama State Bar and other licensing authorities for “further proceedings as appropriate.”

Large language models such as ChatGPT are trained on a huge universe of data to predict what’s likely to be a useful response to a prompt. But they do not actually know anything, said Amy Sloan, Dean Julius Isaacson professor at the University of Baltimore School of Law.

These platforms will produce a response that appears to contain real cases. But they might be “totally made up,” said Sloan, the author of “Using Generative AI for Legal Research.”

Other times, Sloan said, the case name might be real but the citation is fake. Or the citation is real but it’s paired with the wrong case name.


Some judges, she said, have rules that require attorneys to document when they’ve used AI and certify that they’ve verified the information is correct.

Legal research services including LexisNexis and Westlaw have their own AI products, but they’re trained on verifiable databases and are more reliable, Sloan said.

At the same time, Sloan said, lawyers should use AI for gaining background information at most — and maybe not at all.

So why does this keep happening?

“It’s like holding penalties in football. If you get caught, you’re going to get penalized. But a lot of it doesn’t get caught,” Sloan said. “A lot of what AI produces sounds good, and people I think get lulled into believing it must be right.”


During oral argument Oct. 3 in the Appellate Court of Maryland, Hyman said he thought AI was kind of like a “super Google.” He ended up speaking for more than 20 minutes.

Of 27 cases cited in his brief, 11 contained a “citation irregularity,” court records show.

In court documents, Hyman wrote that he has never used AI for professional reasons. He said he prepared the brief with an employee and supplemented his argument with some of her research, which he thought was “good Maryland law.”

Hyman reported that he’s taken measures including completing a continuing education course, finalizing a subscription to a legal research service and implementing a written AI policy.

Through his attorney, Ralph Sapia, Hyman declined to comment.


“I’m not going to do anything but take responsibility for it. What more can I do?” Hyman said. “For me, it’s crystal clear. I submitted something that wasn’t accurate. It’s my name. I’m the only attorney in the office. That’s my responsibility.”

“Report yourself. That’s one thing that you can do,” Appellate Judge Kevin F. Arthur said.

Hyman said he had spoken with his counsel and that they were preparing a report to the Attorney Grievance Commission of Maryland.