AI “Biden” robocalls told voters to stay home; fines and charges followed


Two days before New Hampshire's January 2024 presidential primary, between 5,000 and 25,000 voters received robocalls featuring an AI-cloned version of President Biden's voice, complete with his trademark "what a bunch of malarkey" catchphrase. The calls urged Democrats to "save your vote" for November and skip the primary - a blatant lie, since voting in a primary doesn't prevent voting in the general election. Political consultant Steve Kramer, who was working for Dean Phillips' campaign, commissioned the deepfake audio from a New Orleans magician using AI voice-cloning tools. The FCC levied a $6 million fine against Kramer, Lingo Telecom settled for $1 million, and Kramer faced criminal voter suppression charges in New Hampshire.

Incident Details

Severity: Facepalm
Company: Lingo Telecom / Steve Kramer
Perpetrator: Political consultant
Incident Date:
Blast Radius: Voter confusion; enforcement actions; national scrutiny of AI voice clones.

The Call

On January 21, 2024 - two days before New Hampshire's presidential primary election - thousands of registered voters received a phone call. The voice on the line sounded like President Joe Biden. It used his cadence, his vocal patterns, and one of his signature phrases: "What a bunch of malarkey."

The message told recipients that voting in the primary would prevent them from voting in the November general election. It encouraged them to "save your vote" for November instead. The implication was clear: stay home on primary day.

The claim was completely false. Voting in a primary election in no way affects a person's ability to vote in the general election. The two are separate processes. The call was voter suppression - dressed up in the voice of the sitting president.

To make the deception more convincing, the calls were spoofed to appear as though they came from the personal cellphone number of Kathy Sullivan, a former chair of the New Hampshire Democratic Party who was involved in running Granite for America, a super PAC supporting the Biden write-in campaign. Biden had kept his name off the New Hampshire ballot in deference to South Carolina's new position as the first Democratic primary state, so his supporters were organizing a write-in effort. The spoofed caller ID made it look like the call came from the heart of that campaign.

Tracing the Source

New Hampshire's Department of Justice launched an investigation almost immediately. Attorney General John Formella's office traced the calls to Life Corporation, a Texas-based telecom company that had placed them, and to Lingo Telecom, which had transmitted them. But these were just the delivery mechanisms. The question was who had ordered the calls.

The trail led to Steve Kramer, a Democratic political consultant based in Texas. Kramer had been working for the presidential campaign of Minnesota Congressman Dean Phillips, who was running a long-shot primary challenge against Biden. Phillips' campaign had paid Kramer $260,000 in December 2023 and January 2024 for help getting on the ballot in New York and Pennsylvania.

Kramer admitted to commissioning the robocalls. He later told reporters he had come up with the idea himself, without the Phillips campaign's knowledge or approval. He said he wanted to draw attention to the dangers of AI in elections. This explanation was met with skepticism given that the calls were designed to suppress votes for Biden - Phillips' opponent - and went to thousands of voters without any disclosure that they were fake.

The Phillips campaign moved fast to distance itself. Spokesperson Katie Dolan issued a statement saying Kramer had acted "of his own volition" and that the campaign was "disgusted" to learn of his involvement. Phillips had premised his candidacy on the importance of democratic competition, which made the association with voter suppression via deepfake particularly damaging.

The Magician

The deepfake audio itself was created by Paul Carpenter, a New Orleans street magician who told NBC News and the AP that Kramer had hired him for the job. Carpenter said he met Kramer through a mutual acquaintance and that they were staying at the same house in New Orleans when Kramer asked him to use AI to clone Biden's voice from a script.

Carpenter told reporters he had been making social media content for about 20 years and that he used AI voice-cloning tools to generate the audio. He said Kramer paid him $150 through Venmo (the payment came from an account with the same name as Kramer's father). Carpenter told the AP he believed Kramer was working for the Biden campaign and didn't know the audio would be used for voter suppression. He said Kramer instructed him to delete the Biden script and related emails after the calls became public, and that he complied.

The connection between a seasoned political consultant, a street magician, and an AI voice-cloning tool illustrated how accessible the technology for election interference had become. Manufacturing convincing deepfake audio of the President of the United States cost $150 and took a few hours.

The FCC Response

The Federal Communications Commission acted on multiple fronts. First, in February 2024 - weeks after the New Hampshire incident - the FCC issued a declaratory ruling confirming that AI-generated voices in robocalls qualify as "artificial" under the Telephone Consumer Protection Act (TCPA). This ruling gave the FCC explicit authority to pursue enforcement actions against deepfake robocalls, closing a gap that had existed because the TCPA was written decades before AI voice cloning existed.

In May 2024, the FCC proposed a $6 million fine against Kramer for the illegal robocall and caller-ID spoofing campaign. The fine was formally adopted on September 26, 2024. FCC Chairwoman Jessica Rosenworcel framed it in terms of both consumer protection and election integrity: "Every one of us deserves to know that the voice on the line is exactly who they claim to be. If AI is being used, that should be made clear to any consumer, citizen, and voter who encounters it."

Lingo Telecom, the carrier that transmitted the calls, reached a separate settlement with the FCC in August 2024, agreeing to pay $1 million and revise its business practices. The company's liability stemmed from its role in delivering the spoofed calls without adequate vetting of the content or caller identity.

Criminal Charges

New Hampshire's attorney general brought criminal charges against Kramer for voter suppression. The charges were filed under state law prohibiting interference with voters' rights to cast ballots. New Hampshire AG Formella said the enforcement action sent "a strong message that election interference and deceptive technology will not be tolerated."

Kramer was among the first individuals in the United States to face both federal fines and state criminal charges specifically for using AI-generated content to interfere with an election. The case became a reference point for regulators, lawmakers, and election security officials across the country.

The Technology Problem

The New Hampshire robocall demonstrated how AI voice-cloning tools had matured to the point where a convincing presidential deepfake could be produced with minimal cost, minimal expertise, and minimal time. Carpenter, the magician who made the audio, was not an AI specialist. He had social media content creation experience but no deep technical background in machine learning or audio synthesis. The tools did the heavy lifting.

AI voice-cloning services were readily available online in 2024. Many offered free tiers or low-cost plans that could generate convincing voice clones from just a few seconds of reference audio. For a public figure like the President, whose voice appears in thousands of hours of publicly available recordings, the training material was abundant and free.

The cost barrier for election interference via deepfake audio was functionally zero. The $150 Kramer paid Carpenter was less than the cost of a decent pair of headphones. The technology that makes this possible was developed primarily for legitimate uses - voice assistants, audiobook narration, accessibility tools, entertainment - but the dual-use nature of voice cloning meant that the same tools could produce election disinformation with equal ease.

Regulatory Aftermath

The New Hampshire incident accelerated regulatory attention to AI in elections. The FCC's February 2024 declaratory ruling on AI voices was a direct response to the robocall. Several states introduced or advanced legislation specifically addressing deepfakes in political communications. The Federal Election Commission received petitions to update campaign advertising rules to require disclosure of AI-generated content.

The case also prompted discussion about the responsibilities of AI companies whose tools are used for illegal purposes. ElevenLabs, a prominent AI voice-cloning company, was identified as the likely source of the cloning technology used in the robocall (though the exact tool Carpenter used was not definitively confirmed in public reporting). Voice-cloning companies generally include terms of service that prohibit using their tools for fraud or deception, but enforcement of those terms relies on self-reporting or post-hoc detection.

For New Hampshire voters, the immediate effect was confusion. Sullivan, whose phone number was spoofed for the calls, said she had never heard of Kramer before the incident and had received no apology from the Phillips campaign. The write-in effort for Biden succeeded - he won the Democratic primary despite not being on the ballot - but the robocalls added an element of manufactured distrust to an already complex primary process.

The case set early precedent for how U.S. regulators would handle AI-generated election interference: a combination of existing telecom law (TCPA violations, spoofing rules), new regulatory interpretations (the FCC's AI voice ruling), and state criminal statutes. The $6 million FCC fine against Kramer, combined with the $1 million Lingo Telecom settlement and criminal charges, established that the consequences for AI-powered voter suppression would be severe - at least when the perpetrator got caught.

Discussion