Cody Enterprise reporter resigned after AI fabricated quotes from real people
The Cody Enterprise was forced into public apologies and corrections in August 2024 after reporter Aaron Pelczar resigned amid evidence that an AI tool he had used to help write stories had inserted fabricated quotations. A competing reporter at the Powell Tribune spotted robotic phrasing, suspiciously polished source quotes, and one article that bizarrely ended by explaining the inverted pyramid style of news writing. The resulting review found seven stories containing invented or altered quotes attributed to six people, including Wyoming Gov. Mark Gordon. The paper removed many of the quotes, issued corrections, and adopted an AI policy and detection process after learning, a little late, that generative text tools are not interchangeable with reporting.
Local newspaper scandals usually involve familiar forms of decay: budgets cut too hard, a copy desk gutted, a public records fight lost because no one had time. The Cody Enterprise managed something newer in August 2024. One of its reporters resigned after a competitor documented that several published quotes appeared to have been fabricated by a generative AI tool he was using to help write stories.
The details are almost too on the nose. Robotic phrasing. Quotes that sounded polished but somehow wrong. Public officials who were surprised to learn what they had supposedly said. And then the giveaway that pushed the whole thing from suspicious to ridiculous: a local parade story that ended by explaining the inverted pyramid, as if the article had been generated by a system that had read a journalism textbook and become briefly excited to share what it learned.
That sentence did not merely look odd. It exposed the underlying problem. The tool was not gathering information. It was assembling a newspaper-shaped object.
How the story unraveled
The reporting that broke the scandal came from CJ Baker of the Powell Tribune, a competing paper that regularly reads the Enterprise because that is what local newspapers do when they are still alive enough to care what the other paper printed this week. Baker noticed that some quotes in Cody Enterprise stories felt off. They sounded generic in a way real people usually do not. Some lines carried the smooth, faintly synthetic cadence common to chatbot output. Then the parade story, about comedian Larry the Cable Guy serving as grand marshal, ended with a short lecture on article structure, which is not something a human reporter typically tacks onto local event coverage unless they have suffered a sudden concussion.
Baker dug further. According to his reporting, seven people told the Tribune they had been quoted in Pelczar's stories even though they had not spoken to him. That group included Wyoming Gov. Mark Gordon, whose staff said one quote attributed to the governor was wholly fabricated and another was partly fabricated.
When confronted, Pelczar did not offer a robust defense of the quotes. In the Powell Tribune account, he said he had never intentionally tried to misquote anyone and conceded that the disputed lines may have been generated by an AI program he was using to help write articles. He resigned shortly after the confrontation.
The paper later reviewed his work and, according to the AP, found seven stories containing AI-generated quotes from six people. The publisher and editor apologized, removed or corrected many of the problem quotes, and admitted the newsroom had allowed AI to put words into stories that were never actually spoken.
Why fabricated quotes are a uniquely bad failure mode
Journalism can survive clumsy writing. It can survive a bad headline. It can even survive the occasional factual correction, because mistakes happen and credible outlets correct them. Fabricated quotes are different. A quote is supposed to signal that someone actually said the words attached to their name. Once readers learn a newsroom published lines that no source ever uttered, the publication stops looking sloppy and starts looking counterfeit.
That is what makes this incident more serious than "a reporter used AI badly." The problem was not merely that Pelczar had rough drafts polished by software or asked a chatbot to help summarize information. The problem was that the system appears to have introduced invented statements into published reporting and those statements cleared whatever editorial review the paper had in place.
A fabricated quote from a governor is embarrassing enough. Fabricated quotes from ordinary local people are worse in a quieter way. Local journalism depends on communities believing that if their name appears in the paper, someone actually called them, met them, or heard them speak at a meeting. Once that expectation breaks, every quoted source becomes a little suspect.
The newsroom response
The Cody Enterprise response was what many small outlets do when technology failure lands in public: apologize, correct, and improvise policy after the damage. AP reported that the paper did not have an AI policy at the time because management considered it obvious that journalists should not use generative AI to write stories. That is one way to describe it. Another is that the newsroom had a rule so obvious it did not bother writing it down, enforcing it, or building a process to detect violations before publication.
After the scandal surfaced, the paper's publisher described AI as an advanced form of plagiarism and said the outlet had implemented a way to recognize AI-generated stories. Wyoming Public Media also reported that the publisher confirmed AI had been used to misquote people and that some sort of detection system was now in place.
The response makes sense as crisis cleanup, but it also exposes the weakness of the original setup. A newsroom does not get to treat AI misuse as unthinkable and then act shocked when a struggling reporter reaches for the obvious shortcut. Generative tools were already everywhere by mid-2024. If management had not decided how those tools could and could not be used, then management had already made a decision. It had chosen ambiguity.
Why local news is especially vulnerable
This kind of failure was probably more likely in a small local newsroom than at a big national paper with layers of editors, standards desks, and media reporters waiting to pounce. Local papers run lean. They cover a wide range of beats with tiny staffs. Newer reporters may arrive without much training. Deadlines are constant and the pressure to produce quickly is real. Generative AI is pitched directly at that weakness: faster draft, faster rewrite, fewer calls, fewer notes, cleaner prose.
The sales pitch is especially seductive when the subject matter looks simple. A parade announcement. A court case. A local sentencing. A wildlife citation. Feed the source material into a chatbot, ask for a clean article, maybe have it add a quote or two that "sounds right," and suddenly the work that used to require actual reporting looks compressible into prompt engineering plus a quick edit.
That is not reporting. It is counterfeit reporting. The Cody Enterprise discovered the difference the hard way.
The episode also shows why AI failures in journalism are not confined to giant publishers making giant strategic bets. Sometimes the whole scandal is one reporter, one weak editorial chain, and one competitor paying close enough attention to notice that the prose has started talking like a workshop handout. Local journalism still has one advantage over slop at scale: the people quoted in the stories often know one another, and they know when they were never called.
The real damage
Pelczar resigned. That was the immediate consequence. The deeper consequence was trust leakage. Every correction tells readers to revisit what they thought they knew. Every apology invites them to wonder how many other lines slipped through. Every statement that "we now have a policy" quietly admits there was not one before.
The Cody Enterprise was not running a grand AI transformation initiative. That almost makes the incident more useful as a cautionary tale. The failure did not require a newsroom-wide automation mandate or a chief innovation officer with a deck full of promises. It only required access to generative text, weak verification, and a publication process willing to treat polished sentences as evidence that reporting had occurred.
That is the central lesson. Journalism is not the article-shaped thing at the end. It is the reporting process that makes the article trustworthy. Generative AI can imitate the shape. It can imitate the tone. In Cody, it apparently imitated the quotes too. What it could not do was the one part that mattered: talk to the people whose names it used.