Commonwealth Bank reverses AI voice bot layoffs
Commonwealth Bank of Australia replaced 45 call-centre agents with an AI voice bot in July 2025, then apologised and rehired the staff, admitting the rollout had hurt service levels: call queues blew out, managers had to get back on the phones, and the Finance Sector Union filed a dispute with the Fair Work Commission.
The Commonwealth Bank of Australia is the largest company on the Australian Securities Exchange, with more than 18 million customers, over 55,000 employees, and an index weighting of 11.5 percent. In the 2024-25 fiscal year, CBA posted a record profit of $9.8 billion. It is, by any measure, a company that can afford to take its time with major operational changes.
In July 2025, CBA announced that 45 customer service positions would be eliminated. The bank had introduced an AI-powered "voice-bot" to handle incoming customer calls, and leadership determined that the human roles were no longer necessary. "Our investment in technology, including AI, is making it easier and faster for customers to get help, especially in our call centres," a CBA spokesperson said at the time.
The 45 layoffs were part of a broader restructuring. CBA also cut 304 of its 38,000 Australian staff in favor of 110 new positions in India. The bank was trimming costs while reporting billions in profit. None of this went unnoticed.
The Union Response
The Finance Sector Union reacted immediately and without diplomatic hedging. FSU national secretary Julia Angrisano said: "Just when we think CBA can't sink any lower, they start cutting jobs because of AI on top of sneakily offshoring work to India."
The FSU's critique had two layers. The first was the job losses themselves - 45 people told their work would now be done by software. The second was the context: CBA was eliminating Australian positions and adding cheaper ones in India while sitting on nearly $10 billion in annual profit. The AI narrative provided convenient cover for a cost-cutting exercise. The union was not buying it.
FSU members began collecting accounts from CBA employees about the impact of AI, automation, and offshoring on their workloads and job security. The union filed a dispute with the Fair Work Commission, Australia's workplace relations tribunal. A hearing was scheduled.
The AI Did Not Actually Work
Here is where the story transitions from labor dispute to operational farce. CBA had told the 45 workers their jobs were redundant because the AI voice-bot could handle the call volume. The AI voice-bot could not, in fact, handle the call volume.
FSU members reported that call volumes were increasing, not decreasing, after the AI deployment. The voice-bot had been presented to management as a tool that would reduce the need for human agents. In practice, it appears to have failed to resolve customer calls (routing them back to humans), frustrated customers into calling again, or both.
The situation deteriorated to the point where CBA management asked team leaders - people whose job is to manage and supervise, not to take customer calls - to get back on the phones. The bank also offered overtime to remaining staff to cover the gap. These are not the actions of an organization whose AI deployment is going well. They are the actions of an organization scrambling to maintain service levels after removing capacity it still needed.
The Reversal
Under pressure from the FSU, the Fair Work Commission dispute, and the evident reality that the voice-bot was not a replacement for 45 human agents, CBA reversed the decision in August 2025.
The bank's statement was notable for its directness, by corporate standards: "CBA's initial assessment that the 45 roles were not required did not adequately consider all relevant business considerations, and this error meant the roles were not redundant." The word "error" is doing significant work in that sentence. CBA was acknowledging that it had told 45 people their jobs were gone based on an assessment that turned out to be wrong.
CBA apologized to the affected employees. Staff were offered the option to return to their positions, be redeployed elsewhere in the bank, or take a voluntary exit payment and leave. The bank said it would review its internal processes "to improve our approach going forward."
CEO Matt Comyn acknowledged the mistake, though he offered a broader hedge in an ABC News interview, saying it was "difficult to predict the impact of AI on jobs in the long term." This is true in the abstract. It is less compelling as a defense when your company just fired 45 people based on a prediction about AI's impact on jobs that turned out to be wrong within weeks.
The AGM Confrontation
The reversal did not end the matter. At CBA's annual general meeting in October 2025, where leadership reported the bank's nearly $10 billion profit to shareholders, FSU members and CBA staff confronted executives over the AI push.
FSU national assistant secretary Nicole McPherson framed the reversal bluntly: "CBA's backflips and admissions show pressure and panic, not principle. Workers deserve better."
The union's position was that CBA had been forced into the reversal not because leadership recognized the AI deployment was premature, but because the union applied enough pressure through the Fair Work Commission to make the reversal less painful than the fight. The apology came after the dispute filing, not before. The review of internal processes came after the public embarrassment, not before.
CBA subsequently said the review had been "completed and has been resolved," an outcome recorded in a memorandum from the Fair Work Commission to both sides.
The Assessment Failure
The most revealing part of this story is the speed at which the incorrect assessment became apparent. CBA announced the layoffs in July. By August, call queues were exploding, managers were back on the phones, and the bank was reversing course. That's roughly one month between "the AI can handle this" and "the AI cannot handle this."
This timeline raises a question: how thoroughly was the AI voice-bot tested before 45 people were told their jobs were being eliminated? CBA had claimed the chatbot was diverting 2,000 calls per week from contact centres. That's a real number, but diverting a call is not the same as resolving a call. An AI voice-bot can divert a call by answering it and then failing to help the customer, who then calls back and waits in the human queue. The volume metric looks good on a dashboard. The customer experience does not.
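The gap between diversion and resolution is easy to see with a little arithmetic. As a sketch (all figures below are hypothetical except the 2,000 diverted calls per week that CBA cited), the calls that still land on human agents depend heavily on how many "diverted" customers end up calling back:

```python
# Hypothetical illustration: a "calls diverted" metric can mask poor
# resolution when customers who fail with the bot call back into the
# human queue. Only the 2,000 diverted calls/week is from the article;
# every other number here is invented for the example.

def effective_human_load(weekly_calls, diverted, bot_resolution_rate):
    """Calls that still reach human agents each week.

    Diverted calls the bot fails to resolve are assumed to return
    as repeat calls to the human queue.
    """
    callbacks = diverted * (1 - bot_resolution_rate)
    return (weekly_calls - diverted) + callbacks

weekly_calls = 10_000  # hypothetical total inbound volume
diverted = 2_000       # figure CBA cited for the chatbot

# If the bot resolves 90% of what it diverts, the humans see ~8,200
# calls — close to what the dashboard's "2,000 diverted" implies.
print(effective_human_load(weekly_calls, diverted, 0.9))   # 8200.0

# If it resolves only 20%, the humans see ~9,600 — almost the
# original volume, with angrier customers and the same dashboard.
print(effective_human_load(weekly_calls, diverted, 0.2))   # 9600.0
```

Under these assumed numbers, the diversion metric is identical in both scenarios; only the resolution rate separates a working deployment from a failing one.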
The bank's own statement used language that suggested the failure was in the business assessment rather than the technology: the roles "were not redundant." This framing implied that the AI worked but the organization still needed the humans for reasons not fully captured in the initial analysis. The FSU's account, based on member reports, suggested something simpler: the AI didn't handle the volume, customers waited longer, and the humans were needed because the work was still there.
A Familiar Pattern
CBA's experience follows a pattern that was becoming recognizable by mid-2025. A company deploys AI to automate customer-facing work, announces headcount reductions, discovers the AI does not actually replace the humans, and reverses course after service quality drops or public pressure mounts.
Klarna, the Swedish fintech, went through a version of this cycle. It announced in early 2024 that its AI assistant was doing the work of 700 human agents. By mid-2025, CEO Sebastian Siemiatkowski admitted to Bloomberg that the AI-only approach had produced "lower quality" service, and the company began rehiring humans.
The consistent theme across these cases is that the decision to cut staff is made before the AI has proven it can actually do the job at the required level. The headcount reduction is treated as a fait accompli, with the AI deployment as the justification, rather than waiting for the deployment to demonstrate sustained performance before adjusting staffing levels. The result is a predictable sequence: announce, deploy, discover the AI isn't ready, scramble, reverse, apologize.
For CBA, the cost of this sequence included overtime payments, manager time spent answering phones instead of managing, a Fair Work Commission dispute, an AGM confrontation, and the reputational damage of publicly admitting you fired 45 people based on a faulty assessment. The bank could have avoided all of it by running the voice-bot alongside the existing team for a quarter and measuring actual performance before making staffing decisions. It chose the faster path.
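The "run it alongside the team for a quarter" approach amounts to a simple gating rule: treat roles as redundant only after the bot has sustained its target performance over a full evaluation window, not after a few good weeks. A minimal sketch, with entirely hypothetical thresholds and data:

```python
# Hypothetical staffing gate: reduce headcount only after the bot has
# met a resolution target every week for a full evaluation window.
# The target, window length, and sample data are all invented.

def can_reduce_staff(weekly_resolution_rates, target=0.85, weeks_required=12):
    """True only if the most recent `weeks_required` weeks all meet `target`."""
    if len(weekly_resolution_rates) < weeks_required:
        return False
    recent = weekly_resolution_rates[-weeks_required:]
    return all(rate >= target for rate in recent)

# A few strong weeks are not enough evidence:
print(can_reduce_staff([0.90, 0.92, 0.88]))       # False
# A full quarter at target would be:
print(can_reduce_staff([0.87] * 12))              # True
# One bad week inside the window blocks the decision:
print(can_reduce_staff([0.87] * 11 + [0.60]))     # False
```

The design choice is conservative by construction: the rule can only delay a cut, never accelerate one, which is exactly the property the July announcement lacked.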