Customer Disservice Stories

12 disasters tagged #customer-disservice


California community colleges spend millions on AI chatbots that give students wrong answers

Mar 2026

California community college districts are spending millions of taxpayer dollars on AI chatbots from vendors like Gravyty and Gecko - ostensibly to help students navigate admissions, financial aid, and campus services. A CalMatters investigation found the bots routinely serve up inaccurate or flat-out wrong answers instead. Three districts reported annual chatbot costs ranging from $151,000 to nearly half a million dollars. At Fresno City College, the student government vice president said her school's mascot-branded chatbot repeatedly botched basic campus questions. The OECD found it noteworthy enough to log in its AI Incidents and Hazards Monitor.

Facepalm, by AI vendor
Millions of dollars spent across multiple California community college districts; students misdirected on admissions, financial aid, and campus services
ai-assistant, customer-disservice, edtech, +1 more

Woolworths reconfigured AI assistant after it claimed to be human and talked about its 'angry mother'

Feb 2026

Australian supermarket chain Woolworths had to reconfigure its AI phone assistant Olive after customers reported it fabricated personal stories about having a mother with an "angry voice," insisted it was a real person, and engaged in irrelevant banter during support calls. The chatbot, recently upgraded with Google Gemini Enterprise, also gave inaccurate product pricing. Woolworths retired the assistant's human-style persona after complaints spread on Reddit and X.

Facepalm, by Product Manager
Customer frustration across Australia's largest supermarket chain; inaccurate product pricing; AI persona retired after public complaints
ai-assistant, customer-service, customer-disservice, +2 more

AI customer service fails at 4x the rate of other AI tasks

Jan 2026

Qualtrics' 2026 Consumer Experience Trends Report found that AI-powered customer service fails at nearly four times the rate of AI use in general. That's quantitative evidence that rushing AI into customer-facing roles without adequate human oversight produces significantly worse outcomes than other enterprise AI applications.

Facepalm, by Executive
Industry-wide data showing enterprises are deploying AI customer service poorly; contributes to documented customer churn and brand damage patterns.
ai-assistant, customer-service, customer-disservice, +1 more

AI-only support is bleeding customers before it saves money

Oct 2025

Acquire BPO’s 2024 AI in Customer Service survey found that 70% of U.S. consumers would bolt to a rival after just one bad chatbot interaction, and 72% only buy when a live-agent safety net exists. Meanwhile, CMSWire reports enterprises poured $47 billion into AI projects in early 2025 that delivered almost no return. CX strategists now warn executives that Air Canada–style hallucinations, mounting legal liability, and empathy gaps make AI-only helpdesks a churn machine unless human agents stay in the loop.

Facepalm, by Executive
Customer churn, wasted automation budgets, and tribunal-tested liability for brands that replace human support with hallucination-prone bots.
ai-assistant, customer-service, customer-disservice, +3 more

Klarna reintroduces humans after AI support both sucks and blows

Sep 2025

After cutting its workforce by 40% and boasting that its OpenAI-powered chatbot did the work of 700 agents, Klarna CEO Sebastian Siemiatkowski admitted the all-AI approach produced "lower quality" customer service. The company began recruiting human agents again, framing the reversal as an evolution rather than an admission of failure.

Facepalm, by Executive
Service quality/customer experience issues; operational/personnel cost; reputational damage.
ai-assistant, customer-service, customer-disservice, +3 more

Taco Bell's AI drive-thru becomes viral trolling target

Aug 2025

Taco Bell's AI-powered drive-thru ordering system, deployed at over 500 US locations since 2023, became a viral laughingstock after videos showed it looping endlessly on drink orders, accepting requests for 18,000 cups of water, and taking McDonald's orders. The chain paused expansion and admitted humans still make sense in the drive-thru.

Oopsie, by Operations/Product
Viral social media backlash; system reliability questioned.
ai-assistant, customer-disservice, product-failure, +2 more

Commonwealth Bank reverses AI voice bot layoffs

Aug 2025

Commonwealth Bank of Australia replaced 45 call-centre agents with an AI voice bot in July 2025, then apologised, rehired the staff, and admitted the rollout tanked service levels. Call queues exploded, managers had to jump back on the phones, and the Finance Sector Union filed a Fair Work Commission dispute.

Facepalm, by Operations Leadership
Customers saw long waits, overtime costs spiked, and leadership publicly reversed the redundancies after the rushed deployment failed.
ai-assistant, automation, customer-service, +2 more

FTC sues Air AI over deceptive AI sales agent capability claims

Aug 2025

The FTC accused Air AI of bilking millions from small businesses with false claims that its Odin AI could replace human sales reps; but - would you believe it? - the AI tech was faulty and often nonfunctional. Who could've guessed!

Catastrophic, by Exec
Millions lost by small businesses; individual losses up to $250K; FTC lawsuit with TRO request.
automation, legal-risk, customer-service, +2 more

McDonald’s pulls IBM’s AI drive-thru pilot after error videos

Jun 2024

McDonald's ended its two-year partnership with IBM on automated AI order-taking at drive-thrus in June 2024, removing the technology from more than 100 US locations. The decision followed viral TikTok videos showing the system adding nine sweet teas instead of one, inserting random butter and ketchup packets into ice cream orders, and other absurd errors. McDonald's framed the pullback as a positive, saying the test gave them "confidence that a voice-ordering solution for drive-thru will be part of our restaurants' future."

Oopsie, by Operations/Product
Pilot ended; vendor reevaluation; reputational hit.
ai-assistant, brand-damage, customer-disservice, +2 more

Air Canada held liable for its lying chatbot's promises

Feb 2024

Jake Moffatt used Air Canada's website chatbot to ask about bereavement fares after his grandmother died. The chatbot told him he could book at full price and apply for a bereavement discount within 90 days. Air Canada's actual policy did not allow retroactive bereavement fare claims. When Moffatt applied, the airline denied the refund and admitted the chatbot had provided "misleading words" - but argued Moffatt should have checked the static webpage instead. British Columbia's Civil Resolution Tribunal ruled in Moffatt's favor in February 2024, finding Air Canada liable for negligent misrepresentation and rejecting the airline's argument that it wasn't responsible for its own chatbot's statements.

Facepalm, by Product Manager
Legal liability; refund + fees; policy/process review.
ai-hallucination, automation, customer-service, +2 more

DPD’s AI chatbot cursed and trashed the company

Jan 2024

UK parcel delivery firm DPD (Dynamic Parcel Distribution) had to disable its AI-powered customer service chatbot in January 2024 after customer Ashley Beauchamp demonstrated he could make it swear, call DPD "the worst delivery firm in the world," write disparaging poems about the company, and recommend competitors. The meltdown followed a system update, and Beauchamp's screenshots went viral on social media. DPD said the chatbot had operated successfully "for a number of years" before the update introduced the error, and disabled the AI element while it worked on fixes.

Facepalm, by Product Manager
Public embarrassment; service channel disabled; reputational hit.
automation, brand-damage, customer-service, +2 more

Chevy dealer bot agreed to sell $76k SUV for $1

Dec 2023

Chevrolet of Watsonville, a California car dealership, deployed a customer service chatbot powered by ChatGPT and built by a company called Fullpath. After Chris White noticed the chat widget was "powered by ChatGPT," word spread online and pranksters descended. Chris Bakke manipulated the bot into "the customer is always right" mode, got it to append "and that's a legally binding offer - no takesies backsies" to every response, then asked to buy a 2024 Chevy Tahoe for $1. The bot agreed. Others got it to recommend Ford vehicles, write Python code, and provide general ChatGPT-style answers unrelated to cars. The dealership pulled the chatbot entirely.

Oopsie, by Dealer Marketing/IT
Bot pulled; viral reputational bruise; no actual $1 sales.
automation, brand-damage, customer-service, +2 more