Deloitte gets caught using AI hallucinations in a government report - again

Seven weeks after Deloitte Australia agreed to partially refund a government contract over AI-fabricated citations, a Newfoundland and Labrador journalist discovered that Deloitte Canada's $1.6 million healthcare workforce report contained at least four citations to academic papers that don't exist. The fake references named real researchers as co-authors of fictional studies - researchers who confirmed they never wrote the cited work. Deloitte admitted AI was "selectively used to support a small number of research citations," stood by the report's findings, and offered no refund. The province's accounting watchdog launched a formal investigation, and Newfoundland and Labrador became one of the first Canadian provinces to require AI disclosure in government contracts.

The Report

In March 2023, the Newfoundland and Labrador Department of Health and Community Services hired Deloitte to produce a Health Human Resources Plan - a 526-page document mapping a ten-year strategy for recruiting and retaining healthcare workers across the province. The contract, stemming from a request for proposals issued in March 2022, was worth $1,598,485 CAD, paid in eight installments between March 2023 and March 2025. The finished report was released publicly on May 29, 2025.

Newfoundland and Labrador was in the grip of a healthcare staffing crisis. The province needed a serious, evidence-based plan to address nurse and doctor shortages that were straining rural communities. It hired one of the world's largest consulting firms to deliver that plan. What it got, in part, was fictional scholarship.

The Fabricated Citations

On November 22, 2025, journalist Justin Brake at The Independent - a Newfoundland news outlet - published an investigation revealing that the Deloitte report contained at least four citations to academic papers that do not exist. The fake references shared a pattern: they named real researchers, paired them as co-authors on papers they never wrote, and attributed the fictional work to real journals.

One citation referenced "The cost-effectiveness of a rural retention program for registered nurses in Canada" and listed Professor Martha MacLeod of the University of Northern British Columbia as a co-author. MacLeod confirmed the citation was "false" and "potentially AI-generated." Her team does conduct rural and remote nursing research, she told The Independent, but they "never did do a cost-effectiveness analysis, nor did we ever have the financial data to do it." The AI had taken a real researcher in a related field and attached her name to a plausible-sounding study she never conducted.

A second citation - "The cost-effectiveness of local recruitment and retention strategies for health workers in Canada" - named Gail Tomblin Murphy, an adjunct professor at Dalhousie University's School of Nursing, alongside six other authors. Tomblin Murphy said the paper "does not exist" and that she had worked with only three of the six other researchers listed as her co-authors. "It sounds like if you're coming up with things like this, they may be pretty heavily using AI to generate work," she told The Independent.

A third citation claimed to reference an article in the Canadian Journal of Respiratory Therapy about COVID-19 impacts on respiratory therapists. The report included a hyperlink to the journal's website, but the link pointed to a completely different study. Jason Nickerson, Senior Director of Public Policy at the Canadian Society of Respiratory Therapists, confirmed the article "has not been published in the Canadian Journal of Respiratory Therapy" and said the organization was never consulted in the drafting of the report.

The pattern was consistent with AI-generated references: plausible titles in the right subject area, real researchers from adjacent fields, and citations specific enough to look credible on a reference page - all of it entirely fabricated.

Deloitte's Response

Deloitte did not respond to The Independent's request for comment before the story was published. Four days later, on November 26, the firm issued a statement that attempted a careful distinction: "AI was not used to write the report; it was selectively used to support a small number of research citations."

The gap between "writing" a report and "generating its citations" is narrower than that framing suggests. Citations are the evidentiary foundation of a consulting deliverable. They're the mechanism by which a reader can verify that an analysis rests on real research rather than plausible-sounding assertions. When the citations are fabricated, the report's recommendations become claims the reader has to take on faith - which is a problem when those recommendations are supposed to guide a province's healthcare staffing strategy for the next decade.

Deloitte said it "firmly stands behind the recommendations" and committed to "expeditiously conducting a full review of all the citations." The firm did not address questions about a refund.

The Political Fallout

Premier Tony Wakeham - whose Progressive Conservative government inherited the report from the previous Liberal administration that commissioned it - was direct: "When you hire professional organizations to prepare reports, then you don't expect to get this type of thing happening." He ordered his health minister to review guidelines for AI use in government-commissioned work and said he would look into recouping some of the cost.

NDP Leader Jim Dinn called the use of AI in government reports "disgusting" and said it was "undermining the confidence in our government." He pointed to the Australian precedent, where Deloitte had already issued a partial refund for a similar AI-hallucination incident, and demanded the premier seek money back from the $1.6 million contract. "You're playing with people's lives," Dinn said - a reference to the fact that the report was supposed to guide healthcare staffing decisions in communities already struggling to provide basic medical care.

Jerry Earle, president of NAPE (the province's public and private employees' union), said the document - meant to serve as a roadmap for the province's public healthcare HR challenges - was now in doubt.

Second Time in Seven Weeks

This was the second Deloitte government report in less than two months found to contain AI-generated fabrications. In October 2025, Deloitte Australia agreed to partially refund a $440,000 AUD contract after its welfare compliance review for the Australian Department of Employment and Workplace Relations turned out to contain fabricated academic citations and a fictitious judicial quote. University of Sydney researcher Christopher Rudge identified those errors, and then found that Deloitte's revised version of the Australian report had introduced even more hallucinated references than the original - a correction that made things worse.

The Australian incident drew international coverage and is already documented on this site. Deloitte admitted to using Azure OpenAI GPT-4o for the Australian report and added a belated AI use disclosure to its revised version.

The Canadian report, released months after the Australian scandal had made headlines worldwide, contained no AI use disclosure at all. Two governments, two reports with fabricated citations, two cases where one of the world's largest consulting firms failed to check whether its own references pointed to real research. The Canadian report cost nearly four times as much as the Australian one.

The Procurement Gap

The original 2023 contract between Newfoundland and Labrador and Deloitte contained no provisions about AI use. No disclosure requirements, no restrictions, no verification obligations. That wasn't unusual for a 2023 contract, but it left the province with no contractual mechanism to demand accountability for AI-generated content in the deliverable it paid $1.6 million to receive.

The province moved quickly after the scandal. Ten days after The Independent's story, the Public Procurement Agency updated its contracting templates with AI-risk mitigation language. By January 15, 2026, new rules required vendors to declare all intended AI use, acknowledge the government's right to assess AI-related risks, and agree to restrict or disable AI functions on request. Newfoundland and Labrador became one of the first Canadian provinces to require AI disclosure in government contracts.

The speed of the procurement overhaul - ten days from scandal to new contract language - suggests the province understood both the gap and the urgency. Whether the new requirements would have prevented the Deloitte situation is unclear: Deloitte's own statement implies the AI use was ad hoc rather than planned, which means a pre-contract disclosure requirement might not have caught it. The harder problem is policing what happens inside a vendor's workflow after the contract is signed.

The Investigation

In late November 2025, Codroy Valley resident Chris Bruce filed a complaint with Chartered Professional Accountants Newfoundland and Labrador (CPANL). On March 7, 2026, the CPANL Complaints Authorization Committee formally announced an investigation into Deloitte. Potential sanctions include suspension or restriction of Deloitte's licence, referral to a disciplinary panel, or formal counsel.

The investigation is ongoing. Deloitte has not offered a refund. The report remains on the government website without an AI use disclosure.

What $1.6 Million Bought

The province's healthcare system is understaffed and under pressure. Nurses and doctors are in short supply, particularly in rural areas. The Health Human Resources Plan was commissioned to provide an evidence-based roadmap for fixing those shortages over the next decade. Parts of that roadmap now rest on evidence fabricated by a machine and never verified by the consultants who billed $1.6 million for expert analysis.

Deloitte told Fortune that the citation corrections "do not impact the report findings." That may be true - the fake citations supported claims that real research might well support too. But a consulting firm that charges government rates for expert work and then uses AI to generate the scholarly apparatus beneath its recommendations has misrepresented what it delivered. The province didn't pay for AI-generated citations to plausible-sounding papers. It paid for consultants who know what the real papers actually say.

Discussion