Ghosts at the Gate: A Call for Vigilance Against AI-Generated Case Hallucinations

By: Christina M. Frohock*

Introduction

As generative artificial intelligence programs become ubiquitous in modern life, case hallucinations are appearing in the legal profession. Courts have sanctioned attorneys for citing judicial opinions that do not exist, with penalties reaching thousands of dollars under Federal Rule of Civil Procedure 11. This Article proposes that the need for vigilance against AI-generated case hallucinations is even more urgent than it appears.

Apparitions in the law are nothing new, dating back to the glory days of paper maps. Cartographers drew phantom settlements as copyright traps to protect their maps and ensnare lazy competitors. No one expected the plot twist that followed: reflecting the gravitas of its appearance in a respected map, the phantom copyright-trap town of Agloe, New York, arose in real life. A similar fate may await phantom opinions. Without ascribing to AI any nefarious or mischievous intent to lay traps for lazy lawyers, this Article argues that hallucinated case citations may take on a life of their own, gaining traction and respect through repetition and reliance.

Attorneys have a duty of competence, always. They must confirm that cases are valid—and that they exist—before citing them. Attorneys also have a more profound duty: to protect the body of law. Members of the bar should treat the corpus juris as a shared common good that we are all obligated to keep pristine and healthy. Hallucinated cases lurk as ghosts at the gate, and attorneys must serve as gatekeepers.

*Professor of Legal Writing and Lecturer in Law at the University of Miami School of Law. J.D., magna cum laude, New York University School of Law; M.A. Philosophy, University of Michigan; B.A. Philosophy, University of North Carolina. I am grateful to Cecilia Silver for her support and feedback and to the Penn State Law Review students for their insights and edits.