Anthropic counsel has apologized for a citation ‘hallucination’ in expert testimony submitted as part of a copyright battle with music publishers. Photo Credit: Igor Omilaev
Time to lay off the use of AI in legal documents? Amid a high-stakes copyright battle with music publishers, Anthropic attorneys have apologized for an apparent citation “hallucination,” pinning the blame mainly on Claude.
We broke down the citation crisis after counsel for the music publisher plaintiffs formally voiced related concerns to the court. Anthropic data scientist and expert witness Olivia Chen, the publishers maintained in so many words, had seemingly referenced a non-existent academic paper.
Unsurprisingly, the serious allegation prompted the presiding judge to order an explanation from Anthropic. And that explanation arrived in the form of a declaration from Latham & Watkins associate Ivana Dukanovic.
The way Dukanovic tells the story, an internal investigation confirmed “that this was an honest citation mistake and not a fabrication of authority.”
Running with the point, the Anthropic attorney indicated that the relevant American Statistician citation “includes an erroneous author and title, while providing a correct link to, and correctly identifying the publication, volume, page numbers, and year of publication of, the article referenced.”
So what happened? Well, according to the same declaration, Claude took some liberties when citing not just the American Statistician article, but other sources used in Chen’s testimony.
“After the Latham & Watkins team identified the source as potential additional support for Ms. Chen’s testimony,” Dukanovic penned, “I asked Claude.ai to provide a properly formatted legal citation for that source using the link to the correct article.
“Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors,” she continued.
Claude is also said to have introduced “additional wording errors” into other citations. Though so-called AI hallucinations aren’t exactly rare – including in legal settings – the situation certainly draws attention to the law firm’s review approach.
“During the production and cite-checking process for Ms. Chen’s declaration,” Dukanovic weighed in here, “the Latham & Watkins team reviewing and editing the declaration checked that the substance of the cited document supported the proposition in the declaration, and also corrected the volume and page numbers in the citation, but did not notice the incorrect title and authors, despite clicking on the link provided in the footnote and reviewing the article.”
These remarks may raise more questions than they answer. Chief among them: If one has to make all sorts of corrections to AI-powered legal citations, wouldn’t it be preferable to tackle the process without consulting a chatbot at all?
And at the risk of rubbing salt in the imaginary-citation wound, it’s safe to say the reviewing team’s performance left something to be desired.
But as the (incorrectly) cited article actually exists, the “embarrassing and unintentional mistake” doesn’t mean “Chen’s opinion was influenced by false or fabricated information,” per the text.
“We have implemented procedures, including multiple levels of additional review, to work to ensure that this does not occur again,” added Dukanovic.
DMN asked Claude about the episode, and even it advised against using LLMs for legal citations.
“Regarding citation hallucinations more generally – this is a known limitation of large language models like myself,” Claude responded. “When asked to provide citations, if I don’t have perfect recall of specific sources, I might generate what seem like plausible citations based on my training patterns rather than accurate bibliographic information.
“For any situation requiring accurate citations, the best practice would be to use dedicated academic search tools and databases rather than relying on an AI system to recall specific publication details from memory,” Claude continued.