
Wyoming reporter resigns after using AI to fabricate quotes

By Aimee Ortiz, New York Times

A novice reporter for the Cody Enterprise in Wyoming resigned this month after he was caught using generative artificial intelligence to help write his stories, resulting in numerous fabricated quotes, according to editors at the paper and published reports.

Aaron Pelczar left the newspaper Aug. 2 after the Powell Tribune, a competing newspaper, presented him with evidence of made-up quotes in several stories.

CJ Baker, the Powell Tribune staff writer who broke the news, said he regularly read competitors like the Enterprise as a way of “keeping tabs” on what’s going on in his area.

After Pelczar started at the Enterprise in June, Baker and his colleagues “kind of noticed there were some weird patterns and phrases that were in his reporting,” like awkward text that seemed like an attempt to sum up the story. Baker, a veteran reporter of 15 years, said that the situation escalated after a late July story by Pelczar had direct quotes that sounded as if they were from a news release rather than spoken aloud in court.

Baker said that he then began digging, and government agencies that were quoted said they did not know where the quotes came from. Then there was a rather odd quote attributed to the governor, but in his capacity as a rancher rather than as the state's chief executive.

“It was at that point that I basically went back and started checking on quotes that appeared in this reporter’s stories that had not appeared in other publications or in press releases or elsewhere on the web,” Baker said in an interview.

He then found seven people who said they had never spoken with Pelczar.

In an editorial Monday, Chris Bacon, the editor of the Cody Enterprise, apologized for failing to catch the errors, writing that “it matters not that the false quotes were the apparent error of a hurried rookie reporter that trusted AI. It was my job.”

“I apologize, reader, that AI was allowed to put words that were never spoken into stories,” Bacon wrote.

John Elchert, the chief operating officer of Blackbird, the parent company of the Cody Enterprise, said that the newspaper took action as soon as it knew what had happened.

The newspaper launched an investigation into Pelczar’s work, and he resigned, Elchert said, calling it “the right decision.” The newspaper also reached out to sources who may have been misquoted, and ran retractions.

Blackbird does not have a companywide policy on AI, Elchert said, but “we’ve discussed it, and while it’s not a written policy, our policy is that we do not use AI in our newsrooms. Period.”

“We’re going to be more diligent,” Elchert said, adding that the paper owed it to readers and sources. “It’s so important that they trust their local newspaper, and we want to make sure we continue to earn that trust.”

Attempts to reach Pelczar on Wednesday were unsuccessful.

In an Aug. 7 acknowledgment of the plagiarism, Megan Barton, the publisher of the Enterprise, wrote that AI “is the new, advanced form of plagiarism and in the field of media and writing, plagiarism is something every media outlet has had to correct at some point or another.”

Barton noted that the paper now had a system to catch AI-generated stories.

Alex Mahadevan, the director of Poynter Institute’s digital media literacy project MediaWise, said in an interview that research showed that audiences are “not very trusting of journalists using AI,” and that such lapses “just furthers the hesitancy that people have about journalism right now.”

Though such cases are rare, there is a long history of journalists whose careers ended after they were caught fabricating stories. More recently, the industry has grappled with the emergence of AI. Last year, Sports Illustrated came under fire for product reviews published under fake author names with fake author biographies.

Mahadevan said he believed that AI could help reporters with “boring” news-gathering tasks, like sifting through PDFs, freeing up time for interviews.

“There’s just no way that these tools can replace journalists,” he said. “But in terms of maintaining the trust with the audience, it’s about transparency.”

Newsrooms must be open about how they use AI, he said, and have ethics policies in place.

“If a journalist churns out seven articles in the time that it usually takes them to write one, then you might ask them, ‘Hey, what’s your source for this?’ Or, you know, ‘How did you get this quote?’” he said.

A dead giveaway for AI text, Mahadevan said, is a “weird summary” sentence at the end of a story, similar to something in a school essay.

Such tools, he said, “are spitting out, what is in essence, an SAT-level five-paragraph essay.”

This article originally appeared in The New York Times.