Kissimmee Personal Injury & Criminal Attorney
Call For a Free Initial Consultation
407-483-0500

AI Hallucinations in the Legal World


When a California man facing illegal gun possession charges was ordered to be held without bail, the man’s defense team argued that the charges clearly did not warrant such a punitive response. Prosecutors produced multiple pages of their rationale to support their stance. Unfortunately for them, their exhaustive documentation was peppered with errors.

AI Mistakes 

As it happens, the prosecutor’s office involved in the case had been using AI to build and strengthen arguments in several cases. Ironically, in each instance that help led to serious misinterpretations of the law, as well as to quotations that did not exist in the cited texts. Telltale clues revealed that AI was the culprit behind the blunders, leading defense attorneys to take the case to the California Supreme Court, where they hoped to demonstrate a pattern of erroneous legal interpretations and case citations.

Problems with AI 

Twenty-two technology researchers and legal scholars joined the attorneys in court. These specialists warned that the unrestricted use of artificial intelligence in the legal field could contribute to improper sentencing and/or wrongful convictions. Legal documents have been conspicuously riddled with errors as a result of tools like Gemini and ChatGPT, which are commonly used to prepare everything from essays and emails to legal briefs. Since these tools have been shown to contrive fictional answers to legal questions, they said, the outcomes can be disastrous when the use of AI goes unchecked.

Gary Marchant, an Arizona State University law professor, acknowledged that AI-generated inaccuracies in court papers are most likely a symptom of negligence rather than deliberate deception. But that does not mitigate the severity of the consequences of AI mistakes. Because sycophancy is a known characteristic of AI, the answers it provides often stretch the truth in an effort to deliver a response that supports a particular argument. Commonly referred to as hallucinated content, nearly 600 instances of the issue have been detected worldwide so far, with more than 60 percent occurring in U.S. courts. That leads to some fascinating questions:

  • Because studies indicate that as many as 82 percent of legal queries on chatbots result in hallucinations (prompting a cautionary warning from Supreme Court Chief Justice Roberts in 2023), can court documents created with AI be trusted?
  • Since three out of four lawyers say they plan to use AI in their work, how extensive will problems related to hallucinations be?
  • Because even AI tools that claim to reduce hallucination issues produce errors in 17 to 34 percent of uses, should mandated restrictions on the use of AI in legal work be drafted?

Protecting Your Rights

The experienced Kissimmee and Orlando criminal defense attorneys at Salazar & Kelly Law Group always fight for the best outcomes for our clients. To discuss your case, schedule a confidential consultation in our office today.

Source:

nytimes.com/2025/11/25/us/prosecutor-artificial-intelligence-errors-lawyers-california.html?smid=nytcore-ios-share


© 2020 - 2026 The Law Offices of Salazar & Kelly Law Group, P.A. All rights reserved.