The use of artificial intelligence in the courtroom is having a moment.
Fugees rapper Pras has accused his former trial attorney David Kenner of using an artificial intelligence program to write a closing argument that proved to be “damaging” to his case, according to a court document obtained by The Times.
The motion, filed Tuesday in a federal courthouse in Washington, D.C., is Pras’ bid for a new trial after he was found guilty earlier this year in a political conspiracy case. He now faces up to 20 years in prison. On the day of the filing, Pras reunited with Fugees members Wyclef Jean and Lauryn Hill for the Miseducation of Lauryn Hill 25th anniversary tour, performing in his native New Jersey. In the legal filing, the rapper’s new attorney, Peter Zeidenberg, cited a host of alleged errors committed by Kenner, including accusations that he “failed to properly prepare for trial” and didn’t understand the basic facts of the case.
But beyond Kenner, Pras’ motion also pointed fingers at the AI software that was used to write his closing argument: “The AI program failed Kenner, and Kenner failed Michel.”
While the use of AI expands across various industries, debate has swirled around how the technology fits into, and may change, human labor and culture. And now, with Pras’ legal filing, the technology’s use in the legal and criminal justice systems is under added scrutiny.
By zooming in on AI’s role in Pras’ case, do the rapper’s new attorneys have the right culprit? Or did the AI do exactly what it was trained to do — help human attorneys be faster and more efficient?
Pras’ beef with Kenner and AI
In April, a federal jury found Pras, whose legal name is Prakazrel Michel, guilty on all counts of conspiracy, concealment of material facts, making false entries in records, witness tampering and serving as an unregistered agent of a foreign power. The charges included a scheme to illegally contribute to then-President Obama’s 2012 reelection campaign.
But the verdict was avoidable if Pras had had adequate legal representation, the rapper’s new attorneys argued in their recent motion for a new trial.
Kenner is a Los Angeles-based attorney who has defended other hip-hop celebrities, including mogul Suge Knight and West Coast rap icon Snoop Dogg. Amid legal controversies in the 1990s and early 2000s involving Knight’s Death Row Records, Kenner built a reputation for being aggressive and well-prepared in the courtroom.
However, Pras said the opposite was true during his federal trial.
Pras’ new legal team, which includes civil-rights attorney and former Democratic Sen. Doug Jones of Alabama, compiled a laundry list of Kenner’s alleged missteps: failing to familiarize himself with the charges against his client, outsourcing trial preparation to contract attorneys, not preparing himself for trial, missing objections during trial and not preparing Pras for his testimony and cross-examination.
And atop that list is Kenner’s use of EyeLevel, a generative AI tool designed to help attorneys prepare for cases.
“Kenner used an experimental AI program to write his closing argument, which made frivolous arguments, conflated the schemes, and failed to highlight key weaknesses in the Government’s case,” the motion said. It called the AI-generated argument “damaging to the defense” and “deficient, unhelpful, and a missed opportunity.”
The motion also blamed the AI for another closing-statement misfire, in which Kenner quoted a lyric, “Every single day, every time I pray, I will be missing you,” and misattributed it to Pras’ solo hit “Ghetto Supastar (That Is What You Are).” The lyric is actually from the 1997 hit “I’ll Be Missing You” by Diddy and Faith Evans.
Pras further accused Kenner in the court filing of having financial ties to EyeLevel, alleging that he hid the partnership and bragged about using the technology for monetary gain. The motion referenced an EyeLevel press release published after the case, touting the moment as “the first use of generative AI in a federal trial.” In the press release, Kenner called the technology “an absolute game changer” and said it “turned hours or days of legal work into seconds.” Pras’ attorneys argued that Kenner’s comments are proof of his financial interest in the company.
Kenner did not respond to The Times’ request for comment.
‘They’re putting AI on trial’
Neil Katz, EyeLevel’s co-founder and chief operating officer, has dismissed Pras’ allegations as “fictional writing.”
“Had they used our AI, they might’ve filed something that told the truth,” Katz told The Times in an interview Thursday. He added that Pras’ motion mischaracterized the function of his AI tech.
Pras’ motion labeled EyeLevel as “experimental” and faulted it for various errors in the closing argument. However, EyeLevel’s AI is unlike well-known generative AI tools such as ChatGPT, which draw information from vast swaths of data across the internet and have become known for their tendency to produce incorrect answers.
Instead, Katz said, his company’s AI works only from the narrow sets of information that humans feed it — in this case, relevant court documents such as trial transcripts. Describing the AI as “a tool,” Katz said that during Pras’ trial, it was used to beef up a draft of the closing argument but did not write the entire thing. Human lawyers, he said, must ultimately use their own discretion.
“The human attorneys used them to improve their remarks,” Katz said. “They ultimately were the final arbiters of what to say in court.”
Katz called EyeLevel “battle-tested” and said the AI is being used by 100 companies, including large clients such as Exxon and IBM. “Pras’ team is putting AI on trial,” Katz said. “We think overall, in time, AI will win that case broadly.”
Katz’s long-term hopes for the technology, however, are focused on low-income clients; he believes it can help level the legal playing field. The AI’s ability to do complex research quickly and surface key questions and other ideas within a case, he said, could give an individual attorney the same firepower as a wealthy legal team, only cheaper.
“One lawyer can act like five or six or seven,” Katz said. “The beneficiaries will be the least well-off Americans.”
Will lawyers continue to use AI?
Even amid the scrutiny drawn by Pras’ case, more and more people in the legal field are putting AI to work.
Sharon Nelson, president of Sensei Enterprises, a digital forensics, cybersecurity and information technology firm, told the Associated Press that many firms are already using similar AI technology and that surveys indicate more than 50% of lawyers expect to be doing so within the next year.
“It’s gone much faster than we thought,” she said. “The problem is, if you don’t work with it, you’re going to be left behind.”
However, legal governing bodies, such as the American Bar Assn., still don’t have any guidelines on the use of AI in cases. A new task force is studying the issue, an ABA spokeswoman told AP.
Even so, the ABA has long advocated for the use of AI, touting the benefits of earlier forms of the technology as early as 2017 as a means of reducing “attorney stress and frustration.” The association said AI can be used to save time on document review, proofreading and legal research and added, “Work produced by intelligent software — which doesn’t get tired, bored or distracted — can be truly error-free.”
As AI best practices continue to develop alongside the tech’s advancement, a recent Bloomberg Law article suggested that attorneys refer to what is already codified: the ABA’s rules for professional conduct, such as its opinions on outsourcing work.
And John Villasenor, a professor of engineering and public policy at UCLA, echoed what Katz argued: Humans must still play a major role when using AI. He cautioned in an interview with AP that “attorneys that use AI should make sure they very carefully fact-check anything they are going to use.”
“A good attorney coming up with closing arguments will be mindful of basic goals of the case,” he said, “but also of the specific ways in which the trial has played out.”