McDonald’s pulls IBM’s AI drive‑thru pilot after error videos

McDonald's ended its two-year partnership with IBM on automated AI order-taking at drive-thrus in June 2024, removing the technology from more than 100 US locations. The decision followed viral TikTok videos showing the system adding nine sweet teas instead of one, inserting random butter and ketchup packets into ice cream orders, and other absurd errors. McDonald's framed the pullback as a positive, saying the test gave them "confidence that a voice-ordering solution for drive-thru will be part of our restaurants' future."

Incident Details

Severity: Oopsie
Company: McDonald's
Perpetrator: Operations/Product
Incident Date: June 2024
Blast Radius: Pilot ended; vendor reevaluation; reputational hit.

McDonald's operates about 27,000 drive-thru locations worldwide. The drive-thru is the company's most important ordering channel - more than 70% of US McDonald's revenue comes through the window. Speed and accuracy at the drive-thru directly affect revenue, customer satisfaction, and labor costs. So when McDonald's announced a partnership with IBM in 2021 to test automated order-taking (AOT) at drive-thrus using AI-powered speech recognition, the ambition was clear: if you could replace or augment the human taking orders through a speaker with an AI system that understood natural language, you could reduce labor costs, increase speed, and potentially improve accuracy.

The test ran for about two years across more than 100 US locations. In June 2024, McDonald's pulled the plug.

What the AI was supposed to do

The IBM-developed system used speech recognition and natural language processing to take customer orders through the drive-thru speaker. A customer would pull up, speak their order, and the AI would interpret it, display it on a confirmation screen, and send it to the kitchen. The system was designed to handle the full complexity of McDonald's menu - customizations, combo meals, special requests, and the ambient noise of a drive-thru lane.

Drive-thru ordering is a deceptively difficult speech recognition problem. The audio environment includes engine noise, wind, multiple speakers in a car, toddlers screaming in the backseat, music from the car stereo, and the ambient sounds of a fast-food kitchen bleeding through the speaker. Customers speak at different speeds, with different accents, at varying volumes, and often change their minds mid-sentence. They use shorthand ("a number two with a Coke"), reference menu items by nickname, and expect the system to understand context ("make that large").

Human order-takers handle this by asking follow-up questions, using context clues, and drawing on experience with common ordering patterns. They can also recognize when someone is not ordering at all - when a passenger is talking to a child, when there is a side conversation in the car, or when a customer is still deciding.

The AI struggled with all of these scenarios.

The viral videos

What ended the IBM partnership was not a technical assessment or a cost-benefit analysis presented in a boardroom. It was TikTok.

Customers began posting videos of their interactions with McDonald's AI ordering system, and the results were entertaining in the way that only corporate AI failures can be. TikTok user Ren Adams shared a video of attempting to order a hash brown, a sweet tea, and a Coke. The AI interpreted her order and added nine sweet teas. She abandoned the order.

In another video, TikTok user Madilynn Cameron ordered water and vanilla ice cream. The AI added two sides of butter and four ketchup packets to her order. A screenshot of her checkout screen confirmed the additions. "McDonald's, I'm done," she said.

Other videos showed the AI adding hundreds of chicken nuggets to orders, interpreting background conversation as menu items, and generating orders that bore no resemblance to what the customer actually said. The videos accumulated millions of views. The comment sections were filled with similar stories from other customers.

McDonald's did not comment on specific incidents. The company's response to the viral content was silence until the June 2024 announcement that the IBM partnership was ending.

The corporate framing

McDonald's press communications about ending the partnership were a masterclass in saying something positive about a failure. Mason Smoot, the company's SVP and chief restaurant officer, sent an internal email about the decision. The public statement read: "Our work with IBM has given us the confidence that a voice-ordering solution for drive-thru will be part of our restaurants' future."

Read that again: testing a system that produced absurd orders, became a viral joke, and had to be removed from every location where it was deployed gave McDonald's "confidence." The statement continued that the company would "continue evaluations to make an informed decision on a future voice ordering solution by the end of the year."

IBM's statement was similarly upbeat: "IBM developed automated order taker technologies with McDonald's to support the emerging use of voice-activated AI in restaurant drive-thrus." Both companies framed the end of the partnership as a natural conclusion of a successful test phase rather than what it visibly was: the cancellation of a technology that was not working.

Why drive-thru AI is hard

The gap between the AI's capabilities and the requirements of drive-thru ordering was not about individual bugs. It was structural.

Speech recognition systems work by converting audio to text and then interpreting that text. Each step introduces errors. In a drive-thru, speech-to-text accuracy is lower than in a quiet room because of environmental noise. The natural language understanding (NLU) step then has to map the recognized text to menu items, accounting for synonyms, nicknames, and ambiguous references. When the speech-to-text layer makes an error, the NLU layer has to work with garbled input.
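The second stage of that pipeline - mapping a possibly garbled transcript to menu items - can be sketched in a few lines. This is a toy illustration, not the IBM system: the menu dictionary, the nicknames, the `match_item` helper, and the similarity cutoff are all invented for the example.

```python
import difflib

# Toy menu mapping -- the item names and nicknames here are
# illustrative, not McDonald's actual catalogue.
MENU = {
    "sweet tea": "Sweet Tea",
    "coke": "Coca-Cola",
    "coca cola": "Coca-Cola",
    "hash brown": "Hash Browns",
    "number two": "Combo #2",
}

def match_item(recognized: str):
    """Map a recognized phrase to a menu item, tolerating ASR noise."""
    phrase = recognized.lower().strip()
    if phrase in MENU:
        return MENU[phrase]
    # Fuzzy fallback: a slightly garbled transcript ("sweat tea")
    # should still resolve, while low-similarity input -- background
    # speech, radio noise -- is rejected rather than guessed at.
    close = difflib.get_close_matches(phrase, MENU, n=1, cutoff=0.8)
    return MENU[close[0]] if close else None
```

With a cutoff of 0.8, "sweat tea" still resolves to Sweet Tea while an unrelated phrase like "radio chatter" returns None. The hard part in production is that real ASR errors and overheard conversation do not fall so cleanly on either side of a single threshold.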

The nine-sweet-tea incident is instructive. The customer said "a sweet tea" once. The system registered it nine times. This suggests either an echo or feedback loop in the audio processing, or the system interpreting ambient noise as repeated requests. The butter-and-ketchup additions suggest the system was interpreting background audio - a passenger's voice, car noise, or cross-talk from the kitchen - as order items.
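If the nine-sweet-tea failure really was an echo or feedback loop, one plausible guardrail is an echo window: treat the same item heard again within a second or two as replayed audio rather than a new request. The sketch below is speculative - the `OrderBuilder` class, the two-second window, and the injectable clock are invented for illustration and say nothing about how the IBM system actually worked.

```python
import time

# Hypothetical guardrail, not IBM's implementation: an identical
# item heard again within this window is assumed to be audio
# feedback replaying the same utterance, and is dropped.
ECHO_WINDOW_SECONDS = 2.0

class OrderBuilder:
    def __init__(self, clock=time.monotonic):
        self.clock = clock       # injectable for testing
        self.items = []          # list of (item, timestamp) pairs

    def add(self, item: str) -> bool:
        """Add an item; return False if it looks like an echo."""
        now = self.clock()
        for prev_item, ts in self.items:
            if prev_item == item and now - ts < ECHO_WINDOW_SECONDS:
                return False     # likely an echo; drop it
        self.items.append((item, now))
        return True
```

The tradeoff is obvious: a customer who genuinely wants two sweet teas in quick succession gets debounced, which is why real systems lean on an explicit quantity in the utterance ("two sweet teas") and a confirmation screen rather than timing heuristics alone.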

These are hard problems. They are not unsolvable, but they require a level of audio processing, noise cancellation, speaker isolation, and contextual understanding that the 2021-2024 state of the art did not reliably deliver in the specific environment of a fast-food drive-thru lane.

Scale and the 100-location constraint

The test was limited to just over 100 of McDonald's 27,000 drive-thru locations - less than 0.4% of the fleet. Even at this small scale, the failure rate was visible enough to generate sustained viral content and national news coverage. Scaling the same technology to thousands of locations, across different regional accents, dialects, and noise environments, would have amplified the problems.

The 100-location test was the correct approach for evaluating new technology. The problem was that the results of the evaluation were clear - the technology was not ready - and the decision to end the test came after the failures had already become a public relations issue.

The industry context

McDonald's was not alone in testing AI at the drive-thru. Chipotle opened more than 500 digital drive-thru "Chipotlane" restaurants and tested a robotic kitchen assistant called Chippy for making tortilla chips. Taco Bell and Pizza Hut, both owned by Yum! Brands, were investing in AI for ordering and kitchen operations. Labor is one of the fast-food industry's largest expenses, and the promise of reducing it through automation drove investment across the sector.

The McDonald's-IBM failure did not kill industry interest in drive-thru AI. But it demonstrated that the gap between controlled demonstrations and real-world deployment was wider than the marketing suggested. A system that works in a lab or a quiet test kitchen is a different product from one that works when a family of five is arguing about what to order while idling behind a pickup truck with a loud exhaust.

What happened after

McDonald's said it would evaluate other voice ordering solutions and make a decision by the end of 2024. The company had an existing partnership with Google Cloud and was working with other technology vendors. The IBM system was removed from all 100+ locations.

The removal was quiet. The viral videos were not. The lasting impression for millions of people who saw the TikTok clips was that McDonald's tried to replace humans with AI and the AI could not tell the difference between one sweet tea and nine. That impression will follow any future AI ordering initiative McDonald's pursues, and any vendor pitching the technology will have to explain how their system is different from the one that could not handle a drive-thru window.

Discussion