AI Hallucination Reaches the Pennsylvania Superior Court

One not-so-great litigation trend in 2025 has been lawyers, and even a few judges, citing or relying on imaginary legal decisions created by generative artificial intelligence tools. This happens when someone asks one of these AI programs for legal authority supporting an argument and the program makes up, or “hallucinates,” case law that does not exist.

In late October, this phenomenon struck the Pennsylvania Superior Court in Saber v. Navy Federal Credit Union. This case arose in the context of an auto purchase. A credit union loaned a customer about $42,000 so he could buy a Jeep Grand Cherokee. Shortly after taking the loan, the customer claimed it was void. He filed suit against the credit union, asking for clear title to the Jeep. The trial court denied relief, and the customer appealed to the Superior Court. The customer pursued his appeal pro se, meaning he was not represented by a lawyer.

In his appellate brief, the customer contended the trial court’s decision conflicted “with established precedent, including D’Happart v. First Commonwealth Bank, 110 A.3d 167 (Pa. Super. 2015) and 291 A.3d 1026 (Pa. Super. 2023).” But there was one problem: no case with either citation exists. The customer also referenced “several other cases” that “do not exist.” That led the court to surmise the customer had used “totally fabricated court decisions” that were the hallucinations of a generative AI program.

When confronted with this situation, some courts issue warnings or sanctions. So what did the Superior Court do in Saber? Rather than warn or sanction the customer (who, as noted, was pro se), the court found appellate waiver. Under Pennsylvania appellate precedent (and that of most other appellate courts), parties must provide developed arguments in their briefs, including appropriate citations to “pertinent” legal authorities. In Saber, with the “non-existent” and “nonsensical” fake citations removed, the customer’s brief lacked citation to any relevant legal authority, so the court held that he had waived his issue. As the court explained:

The use of [generative AI] to draft legal filings (including by pro se litigants), without verification of the accuracy of the content so produced, may lead to misstatements and/or misrepresentations of legal authority. Such [generative AI]-generated misstatements and/or misrepresentations are not “pertinent” authority. Additionally, such misstatements and/or misrepresentations, if further disseminated, would undermine the sense of accuracy and reliability of the law they purport to reference.

Beyond waiver risk, the Saber decision highlights the special problem that fake citations pose for online research services such as Westlaw and Lexis. On one hand, these “citations” appear in the briefs and court decisions the services reproduce, so Westlaw and Lexis have little choice but to include the fictional citations’ text. On the other hand, because the cases do not exist, Westlaw and Lexis cannot link the citations to real cases. Westlaw has addressed this predicament with disclaimers like the one that accompanies the Saber opinion:

Editor’s Note: This decision contains discussion of citation references that are incorrect or do not actually exist. These invalid citations appeared in the original court opinion and have been preserved as written since they are part of the official record. Any links to these invalid citations have been removed.

The Superior Court’s Saber decision is just the latest of many cautionary tales about the risks of using generative AI to prepare legal filings. It can lead to warnings and sanctions. It can also lead to a waiver finding, as in Saber. Until these programs become more dependable, the best approach is to have a human check every legal authority cited in a court filing to confirm that it actually exists and has been accurately portrayed.
