AI CHAOS: Top lawyer’s fake citations spark legal uproar as AI errors spread across professions
B-Metro Reporter
VETERAN lawyer and politician Welshman Ncube has landed in hot soup after his law firm submitted fake case law generated by Artificial Intelligence to the Supreme Court of Zimbabwe, exposing the growing dangers of unsupervised AI use in critical professions.
In the case Pulserate Investments (Pvt) Ltd v Andrew Zuze and Others (SC202/25), Ncube’s firm, Mathonsi Ncube Law Chambers, represented the applicant. But what should have been a routine court filing turned into a nightmare when it emerged that 12 citations in the Heads of Argument were either non-existent or irrelevant: AI hallucinations smuggled into official court documents.
In a grovelling letter dated 3 July 2025, Ncube took full responsibility, stating the authorities were compiled by a junior researcher who relied on an AI tool and failed to verify the information.
“I take full and unequivocal responsibility. I never imagined a graduate researcher would present made-up legal references,” wrote Ncube, calling the incident a “catastrophic lapse in professional judgment.”
The legal community was stunned, not just by the error, but by who made it. Ncube is not a greenhorn. He’s a senior legal mind with decades of courtroom experience — and if AI can trip him up, who is safe?
The incident adds to a worrying global trend of AI-related blunders in the legal sector. In South Africa, multiple lawyers have been hauled before the Legal Practice Council for submitting court papers filled with fictional case law produced by generative AI tools. In one case, a junior advocate used a chatbot called Legal Genius to generate references — only for a judge to discover the cases didn’t exist.
South African law expert Refilwe Motsoeneng said the message from the courts is now clear: AI is not an excuse.
“Judges are warning lawyers: if you cite it, you must verify it. AI hallucinations are no longer funny. They can ruin careers.”
Prominent academic Dr Noni ChiExtra, known for her bold views on AI ethics, said Ncube’s case is a red flag.
“We are entering a phase where professionals look like they know what they are doing, but they don’t. And it’s because of AI,” she said. “We are going to need people to clean up the mess AI is causing, in law, medicine, and engineering, everywhere.”
Dr ChiExtra warned that the future won’t reward those blindly using AI but those who can audit and correct its failures.
“This is just the beginning. We’re seeing doctors misdiagnosing patients because they trust AI-generated reports. Engineers are designing unsafe structures based on flawed AI simulations. The damage is creeping in, and we need forensic experts to trace it,” she added.
In Zimbabwe, legal minds are calling for a review of how AI is used in legal research. One retired judge said: “We may need a Law Society directive soon. Today it’s Ncube. Tomorrow it could be every law firm in Harare or Bulawayo.”
Ncube, in his apology, said the incident was personally humiliating and stressed there was no intent to mislead. He said the propositions cited were trite and not dependent on case law, but acknowledged that written arguments hold serious weight.
“The heads of argument are not just side notes. Judges rely on them heavily. There’s no room for fiction,” a legal analyst told B-Metro.
While the Supreme Court is yet to rule on the consequences, the reputational damage is already done. The case continues, but Ncube’s AI embarrassment is now part of Zimbabwe’s legal history.
As courts, hospitals and boardrooms wrestle with the unintended consequences of AI, one thing is evident: AI might speed things up, but a single unchecked mistake can cost a career.