In a landmark decision, a Georgia appeals court has raised serious concerns about the use of generative AI in legal filings and the risks that arise when lawyers rely on it.
The judges noted that irregularities in the filings suggested they had been drafted with generative AI, and they found the bogus cases cited in the trial court's order particularly alarming.
This case, involving a divorce dispute, marks what may be the first time a Georgia appeals court has directly addressed complications stemming from the apparent use of AI-generated content by attorneys.
U.S. Supreme Court Chief Justice John Roberts, in his 2023 report on the judiciary, had previously warned about the pitfalls of commonly used AI applications, which are often prone to “hallucinations.” These inaccuracies can lead attorneys to submit briefs that reference fictitious cases, a concern echoed by the judges in this ruling.
The case originated when a husband filed for divorce in April 2022 in DeKalb County, receiving a divorce decree only three months later. However, his ex-wife later sought to reopen the case in October 2023, claiming she had not been properly served and had moved to Texas in 2021. DeKalb County Superior Court Judge Yolanda Parker-Smith rejected her request in a subsequent order issued in May 2024.
In her appeal, the ex-wife argued that the trial court's order relied on two fake cases, which rendered it invalid and undermined her ability to respond appropriately.
After examining the case record, the appellate judges concluded that Judge Parker-Smith's order appeared to have been drafted by the husband's attorney, Lynch, who cited the same fictitious cases in later filings. The judges also criticized Lynch for seeking attorney fees in connection with the ex-wife's appeal and relying on yet another made-up case to justify that request.
The judges expressed their frustration, stating, “We cannot find the cited case, Johnson v. Johnson, either by case name or citation. And, not surprisingly, we could not locate the case by its purported holding, which is a blatant misstatement of the law.”
As a result of Lynch’s actions, the judges imposed the maximum penalty for what they deemed a “frivolous” request for attorney fees and took the significant step of remanding the case to Judge Parker-Smith to reconsider the ex-wife’s motion to vacate the divorce decree.
The Georgia judges cited a study by Stanford researchers finding that generative AI models, including ChatGPT, “hallucinate” 75% of the time when answering questions about court rulings.
Moreover, the situation is not isolated. Recent filings in a consumer credit case before a federal judge in Atlanta showed similar problems: a plaintiff’s attorney was asked to explain a citation to a non-existent case. The attorney, Naja Hawk, admitted she could not verify the citation and withdrew her reliance on it, attributing the error to an oversight.
To gain a deeper understanding of the impact of artificial intelligence on the judiciary, a committee led by Georgia Supreme Court Justice Andrew Pinson is currently evaluating the situation. This committee, in collaboration with the National Center for State Courts, aims to devise recommendations that will help bolster public trust and confidence in Georgia’s judicial system as AI technology becomes increasingly prevalent.