A Blog by Jonathan Low

 

Sep 5, 2023

Gannett Papers Stop Using AI To Write Articles - Because They Were So Awful

Gannett started using AI to write local sports news because, after all, it only involves reporting scores and team names held together by well-worn cliches. It isn't that important - and what could go wrong?

Turns out, a lot. Starting with the revelation that Gannett ignored its own guidelines about using AI, which insist on human journalistic review before printing. Just as a number of lawyers have learned the hard way, AI is a tool for use by professionals, not a time-saving, cost-cutting panacea to be employed without oversight. JL

Timothy Geigner reports in Techdirt:

Gannett has to admit that its attempt at injecting AI-written articles into local sports coverage was a failure. In the case of several of these attempts, the problem went viral and everyone had a good laugh at how terrible the output was. Ethical guidelines state that AI content has to be verified by humans before being used in reporting, but it’s unclear whether that step was taken. An AI-written sports story failed to generate team names, publishing “[[WINNING_TEAM_MASCOT]]” and “[[LOSING_TEAM_MASCOT]].” AI is a tool for journalists to use, not one that can do the job of journalism.

There may come a time when journalists around the world are left to point at massive datacenters housing AI journo-bots that have perfectly replicated what human journalists can do, screaming “Dey took ‘er jerbs!” like a South Park episode, but today is not that day. And frankly, it doesn’t feel particularly close to being that day. Over the past few months, as AI platforms have exploded in number and notoriety and genuinely interesting ways of using those tools have emerged, we have written a number of posts on attempts to have bots write journalistic articles, only to find the results sub-par in the extreme.

The world of sports journalism has always been considered the kid brother to the big boy and girl journalists. So perhaps you won’t think it as big a deal when a company like Gannett has to admit that its attempt at injecting AI-written articles into local sports coverage was a failure, but it’s ultimately all the same problem. And in the case of several of these attempts, the problem went viral and everyone had a good laugh at how terrible the output was.

If you’re not much of a sports fan, or don’t read any sports journalism, allow me to highlight the issues in that brief post. First, it sounds as though it was written by a robot. That was drunk. Or possibly high. Or perhaps had played football itself and taken one too many hits to its primary server. It’s devoid of any specifics, such as named players or descriptions of any plays, particularly important scoring plays. Also, scoring 6 points in the final quarter of a football game and losing is not a “spirited fourth-quarter performance.” And “close encounter of the athletic kind”? What the actual hell?

But in case you thought that these publications would have a policy for these articles being reviewed by actual human meat-sacks, or that the above example is as bad as it could get, allow me to disabuse you of both notions with a single article that was written by LedeAI for the Columbus Dispatch.

The Dispatch’s ethical guidelines state that AI content has to be verified by humans before being used in reporting, but it’s unclear whether that step was taken. Another AI-written sports story in the Dispatch initially failed to generate team names, publishing “[[WINNING_TEAM_MASCOT]]” and “[[LOSING_TEAM_MASCOT]].” The Dispatch has since updated AI-generated stories to correct errors.

So, no, clearly these papers aren’t doing any serious form of human-checking of these AI-written posts. And that’s a pretty big goddamned problem in a world in which the credibility of journalists and journalism is under such constant assault. Outsourcing journalism to AI that writes like a 7-year-old thumbing through a thesaurus isn’t going to inspire a great deal of confidence in journalism.

Look, there is likely a place for AI in journalism. But it just isn’t, as of now, in a state in which it can replace journalists. That seems to be the misunderstanding here. AI is a tool for journalists to use, not one that can do the job of journalism.
