The Court Will Now Consider... Whatever ChatGPT Just Made Up. Again!
- Frederick L Shelton
- Jul 13
- 2 min read

"Not only did the judge fail to check the source, they copied the homework without verifying whether it was done by a sentient being or Skynet on mushrooms!"
In a plot twist that makes Idiocracy feel like a documentary, a Georgia trial judge just cited cases that don't exist anywhere except in an AI's hallucinations. It looks like they were copied straight from a party's filing. You read that right. The bench, once hallowed ground for logic and law, is now taking legal advice from Microsoft Word's fever dream.
Let's break this down: A judge (that's right, a judge!) used made-up citations in an order. The kind of error that would get a first-year associate publicly flogged in a partner's office was committed by someone in a black robe, wielding real authority.
And why? Because apparently nobody thought to run the citations through Casetext, Lexis, Westlaw, vLex, CoCounsel, Harvey or my favorite - Clearbrief, or literally any of the dozen platforms built precisely to avoid this kind of clown show.
There are more safeguards available today than in a TSA line, and they still managed to miss all of them. So either someone's running their courtroom tech off a refurbished Dell from 2009, or (brace yourself) they went cheap. On legal tech. In a court of law.
Now for the cherry on top: the court "may have borrowed" these hallucinated citations from one of the parties' filings. Not only did the judge fail to check the source, they copied the homework without verifying whether it was done by a sentient being or Skynet on mushrooms!
This isn't a funny little oopsie. This is systemic rot, wrapped in budget cuts, smeared with laziness, and deep-fried in bad judgment. It undermines trust in the judiciary, rewards incompetence, and lets legal Luddites think this is all just a phase. This is exactly what those seeking to sabotage this new, innovative technology hope for in their wildest dreams!
If you're practicing law in 2025 and still relying on AI tools like they’re magic 8-balls, you're part of the problem. Vet your outputs. Use real platforms. Stop pretending free tools are “good enough” for serious legal work.
And judges? If you're citing ChatGPT without checking the case law, maybe it's time to swap the robe for a student's uniform and learn what you should and should not be doing.
Because next time, it won’t just be embarrassing. It’ll be precedent.
Frederick Shelton is a Market Advisor and Consultant to law firms, legal MSOs and funds on subjects that include legal AI, ABS models, MSOs and M&A. He can be reached at fs@sheltonsteele.com