Nota shut down its AI local news network after it was caught copying local reporters


Nota launched an 11-site local news network in 2025 with the usual "underserved communities" rhetoric and the less-usual decision to let AI-assisted workflows repurpose other people's reporting. By early April 2026, Axios Richmond and Poynter had documented widespread plagiarism, including lifted quotes, paraphrased reporting, and reused photos from local outlets. Nota fired one editor, took down the network, and signaled the sites were likely gone for good. The promised fix for news deserts lasted about as long as it took actual local reporters to notice their work had been stolen.

Incident Details

Severity: Facepalm
Company: Nota
Perpetrator: Publisher
Incident Date:
Blast Radius: Eleven local news sites shut down; copied work traced to at least 29 outlets and 53 journalists; public credibility collapse for Nota's local-news experiment

The Pitch Was Familiar

Nota's local news network arrived with the sort of language the AI industry now produces almost as reliably as the models themselves. It would serve underserved communities. It would address news deserts. It would bring bilingual reporting and civic information to counties that lacked robust local coverage. Eleven sites launched across multiple states under the Nota News banner, and the product story sounded tidy: automate the boring parts, fill the gaps, help democracy.

The problem was that local reporters in those supposedly underserved communities were still very much alive, still very much reporting, and still very much able to recognize their own work when someone laundered it through an AI workflow and hit publish.

By early April 2026, Axios Richmond and Poynter had shown that Nota's network was not just relying on automation for formatting or summarization. It was publishing stories and photos lifted from working local journalists without attribution. According to Poynter's reporting, the copied material appeared in more than 70 stories dating back to October and drew from at least 29 outlets and 53 journalists. That is not a one-off editorial miss. That is a content supply chain.

How It Unraveled

The reporting started close to the ground, which is fitting. Axios Richmond looked at Nota's Henrico and Chesterfield sites and reported complaints from Henrico Citizen publisher Tom Lappas that stories looked like stolen versions of his reporting and that staff photos had been used without permission. Once people started pulling on that thread, the sweater came apart fast.

Nota's CEO Josh Brandau told Axios the company had immediately pulled problematic work after learning of the concerns. Poynter then widened the picture. The issue was not one copied article or one rogue caption. Poynter documented copied quotes, copied phrasing, copied reporting structure, and copied photos spread across the network. Some of the borrowed material came from organizations that were themselves Nota clients, which adds a nice extra layer of business-model humiliation.

The company reacted in the standard sequence for this genre of failure. First the pieces came down. Then one editor, Jorge Rodriguez, was fired. Then the five sites he worked on were removed. Then the other six disappeared too. By the time Axios published the shutdown piece on April 3, the entire 11-site network had been pulled offline and Brandau was indicating the experiment was probably finished.

It is hard to overstate how quickly the grand project of civic AI journalism turned into "we fired the guy and took down the sites."

The Important Detail Was the Workflow

One of the most useful pieces of reporting in the Nota story was Poynter's description of how the system was supposed to work versus how it actually worked. Nota said the articles were meant to be generated from publicly available civic information such as press releases and local government meeting videos. That story sounds defensible, if dull. Many newsrooms rewrite meeting notes and press releases every day.

What Poynter found was different. Rodriguez said he repurposed stories from local outlets using Nota's AI tools and published the rewritten versions under his own byline. Axios also noted that Nota had publicly claimed each story was fact-checked and written by editorial staff, while a demo on the company's site showed an internal tool that could draft stories.

That gap matters. This was not just a plagiarism scandal with an AI logo attached after the fact. The product itself appears to have been designed around a workflow where AI accelerated production and editors, under light staffing and wide coverage expectations, could turn existing local reporting into new site copy. Once a system is built to scale that way, plagiarism stops being an ethical exception and starts becoming an operational temptation.

Two part-time editors across eleven sites is not an editorial structure. It is a stress test for how much corner-cutting a content pipeline can absorb before it detonates in public.

News Deserts Are Not Empty Lots

One reason the Nota story lands so cleanly is that it exposes a recurring fiction in AI-for-journalism pitch decks: the assumption that a "news desert" is an empty field where nobody has done any reporting, so whatever the machine emits must count as additive coverage.

Real local news ecosystems are messier than that. Some counties have one thin but functioning newsroom. Some have tiny independents. Some have regional TV stations, newsletters, or niche civic sites doing selective but real work. Those organizations are often underfunded, and that underfunding makes them vulnerable to being strip-mined by better-capitalized automation firms that describe the process as innovation rather than theft.

Nota ran directly into that reality. The company did not arrive in a vacuum. It arrived in places where reporters, editors, and photographers were already doing the hard part. Then its system appears to have treated that hard part as reusable input.

That is the core Vibe Graveyard move in journalism: mistake the existence of text for permission to industrialize it.

Why the Shutdown Matters More Than the Apology

A lot of newsroom AI controversies end with a note, a correction, a promise to revise internal policy, and the hope that everyone moves on. Nota's story is stronger because the whole network effectively collapsed. The public consequence was not merely a statement about standards. It was removal of the product itself.

The shutdown is also a cleaner signal than most company apologies. Firms can talk endlessly about guardrails, review processes, and learning opportunities. Taking down eleven sites is the company admitting the system cannot be trusted in its current form. You do not disappear your own network if the issue is a couple of isolated copy mistakes. You do that when the credibility problem is structural.

The broader lesson is ugly but simple. A company can frame AI local news as a civic rescue mission, but if its economics depend on a tiny staff using automation to cover many markets at once, the pressure to reuse existing reporting is not an accident. It is built into the promise. Scale without reporters has to get its words from somewhere.

The Local Journalism Version of Content Theft at Scale

Plenty of AI media scandals involve obvious nonsense: fake authors, invented quotes, hallucinated statistics, botched explainers. Nota's version was more parasitic. Instead of making up a fake article from scratch, the system seems to have taken working journalism as substrate and run it through a blender. That can look more coherent than pure hallucination, which is exactly why it is dangerous. The output sounds like journalism because, in part, it is journalism. Just not theirs.

That is also why the backlash was so immediate. Local reporters know the details of their own stories. They know their phrasing, their photos, their reporting trails, and the weird little facts that only appear because someone took a call, watched a county meeting, or showed up in person. When another outlet suddenly publishes suspiciously similar material under a fresh byline, the machine camouflage is not especially convincing.

Nota did not discover a new model for local reporting. It discovered that local reporters can still identify plagiarism when the plagiarism arrives wrapped in automation rhetoric. Then it shut the whole thing down.
