FTC sues Air AI over deceptive AI sales agent capability claims
The FTC accused Air AI of bilking millions of dollars from small businesses with false claims that its Odin AI could replace human sales reps; the technology, it turned out, was faulty and often nonfunctional. Who could have guessed?
On August 25, 2025, the Federal Trade Commission filed suit against Air AI Technologies, Inc. in the U.S. District Court for the District of Arizona. The complaint named the Delaware-incorporated company (which also operated as "Scale 13"), several affiliated entities including Apex Holdings Group LLC and Apex Scaling LLC, and individual defendants Ryan Paul O'Donnell and Thomas Matthew Lancer.
The FTC's case was straightforward: Air AI had made big promises about what its conversational AI product could do, charged small businesses significant sums of money based on those promises, and the technology did not work as advertised. The agency asked the court for a temporary restraining order and equitable relief while it pursued a preliminary injunction.
The Product
Air AI's flagship offering was a conversational AI system called "Odin." The pitch was that Odin could replace human sales representatives entirely. According to Air AI's marketing, the technology could conduct "long phone conversations that sound like a REAL human, with infinite memory, perfect recall, and can autonomously take actions across 5,000 plus applications." The AI allegedly required no ramp-up time, no training, no management, and no motivation. It would simply be better than a human employee at selling, day one, every day.
The company sold access to Odin through an "Air AI Access Card" product. Beyond the base product, Air AI also offered a "Licensing Business Opportunity" - essentially, customers could purchase a license to resell "the world's most sophisticated conversational AI" to other businesses. This layered structure is familiar to anyone who has studied business opportunity schemes: the product itself and, on top of it, the opportunity to sell the product to others.
The Earnings Claims
Air AI's marketing did not stop at technology claims. The company told prospective customers they could make significant money. According to the FTC complaint, marketing materials claimed consumers were making a million dollars using Air AI's services, or that purchasers would earn back tens of thousands of dollars within 30 days.
These earnings claims were unsubstantiated, the FTC alleged. Most consumers who purchased Air AI's products did not make the promised profits. Some lost substantial amounts. The FTC said individual losses reached as high as $250,000, with some entrepreneurs and small business owners going into debt to cover what they'd spent on a product that wasn't delivering returns.
The combination - bold technology claims plus aggressive earnings promises - is a well-documented pattern in enforcement actions. The technology claim gets the customer interested. The earnings claim gets them to open their wallet. When neither materializes, the customer is left with a product that doesn't work and a bill they can't justify.
The Technology Didn't Work
This is the part of the story where the FTC complaint stops being about marketing and starts being about the actual product. According to the agency, even when Air AI's conversational AI was available to customers, it "does not function as advertised."
The specifics were damning. The AI was "faulty, often not able to perform basic functions like placing outbound calls to businesses, scheduling appointments, taking down email addresses, or responding accurately to questions." These are not edge cases or advanced capabilities that a reasonable person might understand as aspirational. Placing a phone call, writing down an email address, and scheduling an appointment are the absolute minimum of what a sales tool needs to do. Odin could not reliably do any of them.
The FTC complaint described an additional layer of failure. Rather than being the autonomous, zero-effort solution Air AI had promised, getting the conversational AI to function at all required "a substantial time commitment where consumers must pre-script answers for every potential question, making it nearly impossible to use." In other words, the "AI" that was supposed to replace human sales representatives required a human to manually write out responses to every possible customer question in advance. That's not artificial intelligence augmenting human work. That's a human doing all the work of anticipating a conversation and the AI (poorly) reading a script.
The FTC's Strategy
The Air AI case was the FTC's fourth "AI-washing" enforcement action in 2025, and its twelfth such case since it began the campaign in 2024. The term "AI-washing" refers to companies making deceptive claims about AI capabilities - either exaggerating what their technology can do, or claiming to use AI when the product is something else entirely.
The case was also the FTC's seventh AI-washing case involving the sale of business opportunities or coaching services. This sub-category of AI fraud targets entrepreneurs and small business owners with the dual promise of cutting-edge technology and income potential. The structure often leans more toward a business opportunity scheme that happens to use "AI" as its hook than toward a legitimate technology product.
DLA Piper's analysis of the case noted that it may be the first consumer protection action to allege deception regarding two specific claims: the marketing of agentic AI (autonomous AI systems that operate with minimal human oversight), and claims about AI replacing human employees. Both claims have become standard marketing language across the AI industry. The Air AI case was a signal that the FTC considered "our AI replaces humans" to be a claim that requires substantiation, not just an aspiration.
The timing was notable for political reasons as well. The White House's AI Action Plan had directed the FTC to review investigations started under the previous administration "to ensure they do not advance theories that unduly burden innovation." The FTC proceeded with the Air AI case anyway. Perkins Coie's analysis observed that the agency's AI-washing enforcement did not appear to treat the White House directive as a constraint, particularly for cases like Air AI where the legal theories were straightforward false advertising rather than novel regulatory interpretations.
The Same-Day Context
On the same day the FTC announced the Air AI case, it also announced a proposed settlement in a separate AI-washing case against Click Profit LLC. That case involved business opportunity sellers who had promised to create and operate online stores using an "AI-backed bot." The proposed settlement banned the defendants from selling any business opportunities and imposed partially suspended judgments totaling $20.9 million ($7.3 million against one set of defendants, $13.6 million against another).
The FTC was sending two messages on the same day: we are filing new cases (Air AI), and we are resolving existing ones with serious penalties (Click Profit). For companies selling AI products to small businesses with inflated capability and earnings claims, the enforcement pipeline was active in both directions.
The Pattern
Air AI represents a specific category of AI failure, distinct from a chatbot hallucinating or an image generator violating copyright: a company selling AI capabilities that did not exist, to customers who could not afford to lose the money, with earnings promises that could not be fulfilled.
The $250,000 maximum individual loss figure in the FTC complaint is striking in context. The customers buying Air AI's products were not enterprise clients with dedicated procurement teams and legal departments reviewing vendor claims. They were entrepreneurs and small business owners, often operating without technical expertise to evaluate whether an AI product could actually do what the marketing said. The person spending $250,000 on an "AI sales agent" that can't place a phone call or record an email address is someone who trusted marketing claims that the FTC says were false.
The FTC's Section 5 authority covers unfair or deceptive business practices. The Telemarketing Sales Rule and Business Opportunity Rule provide additional frameworks for exactly the kind of business-opportunity-plus-technology structure Air AI used. The legal tools for this enforcement were not new. What was new was the AI framing - the idea that a magical AI agent would do the hard work of sales for you, cheaper and better than a person, with no setup required.
Air AI promised an autonomous sales agent that never needed to sleep, eat, or be motivated. What it delivered was software that couldn't place a phone call, paired with a manual scripting requirement that turned the "AI" into an expensive text-to-speech reader. The FTC's case was not about whether AI can replace salespeople. It was about what happens when a company says it can, charges money for it, and it can't.