Issue: April 2025

AI-balling journalism’s future

Artwork by Julio Shiiki

The Los Angeles Times has launched an online counterpoint to its op-ed pages called Insights, an artificial intelligence tool that, on command, ranks where an editorial sits on the political spectrum. Its “bias meter” is complemented by a summary of the piece, suggestions for alternative views, and links to related content. However, even with the newspaper owner’s pledge that AI “supports our journalistic mission”, Insights has already generated controversy.

Some of the newspaper’s readers canceled their subscriptions after AI-generated commentary appeared to defend a hate group. Insights, which operates without human journalistic oversight, has also been criticized by the newspaper’s union. The Times maintains that its editorials are not intended to be echo chambers, but the paper removed the Insights function from the piece in question, while the service remains available for editorials in other sections.

University of Tokyo Professor Yutaka Matsuo, an expert on AI who spoke at the FCCJ in March, said technological advances would have a significant impact on the media. “One point that’s clear is the need for deeper insights and opinion pieces,” he said. “That’s the opportunity … not to rush the news.

“Media has an important role in society, as newspaper reporters in Japan see the benefit of reducing communication costs by defining what knowledge should be conveyed.”

The AI push is happening across all media. Gannett, the U.S. newspaper publisher, posts AI-generated content on its local websites through a tool called Expresso, which is “designed to draft polished articles from community announcements”. Unlike the Los Angeles Times, Gannett says all AI content is vetted by humans, but it has drawn criticism over AI-generated score recaps and other features.

Many of the objections to AI have centered on not knowing whether it has been used. AI technologies have been used to generate headlines and content, transcribe audio, optimize images, send news notifications, crunch data, and conduct basic searches. A recent Thomson Reuters survey of more than 200 trained journalists found cautious optimism about AI’s potential. More than 80% of those surveyed already employed AI in some way. However, a nearly identical percentage worked in offices without established AI policies, and 58% said they were self-taught in AI.

Has a Wild West emerged, with AI tools debuting before rules and best practices have been codified? Somewhat, yes, although training is available at institutions such as the Knight Center for Journalism in the Americas, which will launch a free Prompt Engineering 101 online course on April 14. Nonetheless, much remains a matter of trial and error.

I turned to the Society of Professional Journalists homepage, which features a list of AI tools that I will use to finish this article. I first signed up for the free services offered by Lex.page, which gave me the following analysis:

Headline: The current headline, “AI-balling Journalism’s Future”, is somewhat unclear and may not effectively capture the article’s focus. “AI-balling” is an unfamiliar term that could be confusing for readers. It’s not immediately clear what it means in the context of journalism.

Introduction: The opening paragraph provides a good overview of the key issue being explored - the LA Times' use of an AI "bias meter" and the controversy it has stirred up. However, the introduction could be strengthened by providing more context on the broader trend of AI in journalism and how it is transforming the media landscape.

Body: The body of the piece does a good job of exploring different ways AI is being used in journalism, from AI-generated content to data analysis. However, the transitions between sections could be smoother.

Conclusion: The conclusion is a bit abrupt, simply noting the author will use AI tools to finish the article. A stronger conclusion could summarize the key takeaways and implications of the piece, highlighting the central tension between the opportunities and risks of AI in journalism.

This is the constructive chastisement that Number 1 Shimbun contributors hope to avoid receiving from the editor, Justin McCurry, who trawls through our copy in search of spelling, grammatical, syntactical, factual, and thematic errors. I doubt he would be so blunt, even if I’m sure he occasionally curses a poorly crafted sentence.

I then turned to the free Notion AI, which basically changed my headline to: “AI Reshaping Journalism’s Future”.

That was too kind, so I opted for one last bruising, this time from QuillBot, which offers tools to paraphrase and summarize text, check grammar, and detect plagiarism and AI use. Most of its services require a paid upgrade, but its grammar checker initially gave me a grade of 83/100, with three main suggestions:

Change “deeper” to “more profound”; add a comma after “AI policies”; and put commas inside quotation marks.

I ran the text through again, and my grade fell to 65/100, with 16 recommendations, some of which definitely did not follow the AP or Reuters stylebooks. Do I embrace all of Justin’s edits? Of course not, but at least there is consistency, and a human who can explain them. QuillBot couldn’t even agree with itself.

The SPJ website has numerous links to AI tools for writing, editing, and chat, as well as content about ethics and how to leverage technology in the newsroom. To someone who entered journalism in the era of clippings and film, the future of the industry, with both its opportunities and risks, is now on clearer display.

AI is seen as democratizing journalism, but there is anxiety about creating parallel analog and digital newsrooms within the same organization. For more, I asked the Murrow chatbot the following question: Is journalism in danger from AI? (I have edited the answer.)

AI technology is rapidly advancing, (so) it's important to consider both its potential and ways it can be used to support journalism … AI can automate certain tasks, such as data analysis and fact-checking, which can help journalists work more efficiently. However, AI might replace some jobs, especially those involving routine tasks … AI lacks the human qualities essential for journalism, such as creativity, critical thinking, and ethical judgment. AI also cannot replace the trust that readers place in human journalists to report fairly and accurately.

I suspect Edward R. Murrow, the site’s namesake, would agree.


Dan Sloan is president of the FCCJ. He joined the club in 1994 and previously served as president in 2004 and 2005-06. He reported for Knight-Ridder and Reuters for nearly two decades.