Sports Illustrated: Fake-Looking Authors and AI Content Backlash

Futurism reported in November 2023 that Sports Illustrated had published product reviews under fake author names such as "Drew Ortiz" and "Sora Tanaka," whose headshots were traced to AI-generated portrait marketplaces. When questioned, SI deleted the profiles without explanation. The articles came from third-party content partner AdVon Commerce. SI said AdVon used pen names without authorization and terminated the partnership. The SI union demanded answers. Within weeks, the Arena Group - the company that operated SI under license - fired CEO Ross Levinsohn and three other executives.

The Sleuths at Futurism

On November 27, 2023, Maggie Harrison Dupre at Futurism published an investigation with a simple premise: who were the authors behind Sports Illustrated's product review articles? The answer turned out to be: nobody.

Futurism had noticed that SI's website carried product reviews - "best hiking boots," "best coolers," that sort of thing - authored by people with polished headshots and chirpy bios but no discernible existence outside the SI website. One author, listed as "Drew Ortiz," had a profile portrait that Futurism traced to a website selling AI-generated headshots. His bio read: "Drew has spent much of his life outdoors, and is excited to guide you through his never-ending list of the best products to keep you from falling to the perils of nature." Another profile, "Sora Tanaka," claimed she was an experienced product reviewer. Neither could be found anywhere else on the internet - no LinkedIn profiles, no social media, no prior bylines. Nothing.

When Futurism contacted Sports Illustrated for comment, the magazine didn't offer an explanation. Instead, all the author profiles with AI-generated headshots were quietly deleted from the site. As Futurism's headline put it: "We asked them about it - and they deleted everything."

An unnamed person at the magazine told Futurism that AI had been used in creating some of the content as well - "no matter how much they say that it's not."

AdVon Commerce

Sports Illustrated responded with a statement blaming a third-party content partner called AdVon Commerce. SI said that AdVon had produced the articles and assured the magazine they were written and edited by humans. The pen names and AI headshots, SI claimed, were AdVon's doing - "actions we don't condone." SI said it was removing the content while conducting an internal investigation and had ended its partnership with AdVon.

AdVon Commerce was not an unknown quantity. In October 2023, a month before the SI scandal broke, writers and editors at Reviewed - a product recommendation site owned by Gannett under the USA Today umbrella - had raised similar alarms. Staff suspected that management had published product reviews written by AI under the names of authors who didn't appear to exist. Gannett denied the accusations. The Washington Post reported that mysterious bylines had appeared on the Reviewed site from writers whose identities couldn't be verified, and whose headshots had the telltale smoothness of AI-generated portraits. This earlier incident received less attention, but Futurism later connected the dots, noting that AdVon Commerce supplied content to several media outlets.

A May 2024 Salon report described AdVon's reach as extending to other publications as well, with the company supplying affiliate marketing content to outlets that may not have known - or cared to investigate - how the content was actually produced.

The Arena Group

Sports Illustrated in 2023 was not the weekly powerhouse it had been under Time Inc. The magazine was operated by the Arena Group, a digital media company that licensed the SI brand. The Arena Group had already been dealing with financial instability, staff departures, and questions about editorial direction. The AI author scandal hit a publication that was poorly positioned to absorb the blow.

The Sports Illustrated Union issued a statement saying it was "horrified" by the Futurism report. The union demanded "answers and transparency from Arena Group management about what exactly has been published under the SI name" and called on the company to "commit to adhering to basic journalistic standards, including not publishing computer-written stories by fake people."

Tom Rosenstiel, a University of Maryland journalism ethics professor, told the AP that nothing was inherently wrong with media companies experimenting with AI, but "the mistake is in trying to hide it, and in doing it poorly." He added: "If you want to be in the truth-telling business, which journalists claim they do, you shouldn't tell lies. A secret is a form of lying."

Jeff Jarvis, author of a book on the magazine industry, noted that SI's "ambitions were grand" in its heyday. The distance between that legacy and fake author profiles with AI headshots and auto-generated product copy was considerable.

Heads Rolled

The fallout moved fast. On December 11, 2023, the Arena Group fired three senior executives - its operations president, chief content officer, and one other - in connection with the scandal. The next day, December 12, the board terminated CEO Ross Levinsohn. Manoj Bhargava, an Arena Group board member and the founder of 5-hour Energy, was named interim CEO.

The Levinsohn firing was not solely attributable to the AI content scandal; the Arena Group had been dealing with a range of management and financial issues. But the timing was unmistakable. Futurism published its investigation on November 27. Sixteen days later, the CEO was out.

A Pattern, Not an Anomaly

The SI incident landed in a concentrated period of AI-in-journalism scandals. In January 2023, CNET had been discovered using AI to write financial explainer articles under the byline "CNET Money Staff," with the AI involvement disclosed only in a small note that readers could find by clicking the byline. In the summer of 2023, Gannett paused an experiment publishing AI-generated high school sports articles under the byline "LedeAI" after the outputs were filled with errors and odd phrasing. Then came the Reviewed situation under AdVon, followed by SI.

Each case followed a similar arc: a publisher outsourced or automated content production, used real or fake bylines to obscure the process, got caught by reporters or their own staff, and scrambled to contain the damage. The consistent element was not the technology itself but the decision to conceal its use.

For Sports Illustrated, the particular sting was reputational. This was a publication built over decades on the quality of its writing - on the profiles by Frank Deford, the prose of Gary Smith, the photography that defined how Americans saw sport. Publishing product reviews under fake names with AI-generated headshots was not the same species of journalism. The scandal didn't just raise questions about AI; it raised questions about what the SI brand had become.

The content was commerce copy slapped onto a legacy masthead. When the masking slipped, what showed through was a cost-cutting content pipeline that either didn't know or didn't care how the sausage was made. And the people who noticed first - Futurism's reporters, SI's own staff union - were the ones who had to demand that someone do something about it.
