Issue:
July 2025
AI is driving rapid changes to journalism, but reporters and editors still have the upper hand

“Journalism has so closely merged with computer science that they are effectively one undertaking.” Those words came from Jelani Cobb, dean of the Columbia Journalism School, during an appearance at the FCCJ in June. “This is one of the most astoundingly innovative moments in our field,” he added. “People are doing journalism in all kinds of ways.”
That sounds hopeful, but Cobb and other media watchers also see the potential pitfalls of artificial intelligence (AI).
A Pew Research survey in April put journalists among the top three careers seen at risk from AI, along with cashiers and factory workers.
“I am aware of the threats and possibilities posed by AI,” Cobb said, noting how AI can handle gigantic data sets but may also make some newsroom roles superfluous. “Will it get rid of jobs? Yes. Will it create new positions? Likely.”
In 2023 and 2024, U.S. and UK newspapers, broadcasters, and digital media shed around 12,000 jobs, according to Press Gazette. The cuts were not directly tied to AI, but they were symptomatic of the vocational anxiety gripping the industry.
Cobb said journalism education must train reporters to embrace and take advantage of these new technologies to ensure their value in the newsroom. Columbia and other schools, he added, were prepared for ChatGPT and other AI developments.
“We didn’t have to stock up on expertise. When ChatGPT debuted and everything changed, we already had people in our data journalism program … various parts of the journalism school were working deeply on AI and its utility.”
AI-assisted work for newsrooms includes writing headlines, articles and captions, creating infographics, analyzing and transcribing interviews, managing content and archives, polling, posting reader discussion prompts and chatbot summaries, personalizing news aggregation, translating, and creating AI images, voices, and services such as deepfake detectors, fact-checkers, news quizzes, and podcasts.
Still, critics say ethical standards for AI use vary widely and are far from universally adopted.
Some contend that AI tools will lead to greater productivity, but historian and author Carl Benedikt Frey of Oxford University noted in a recent Financial Times article that research productivity actually declined from 2% annually in the 1990s to 0.8% in the last decade. His message: AI mainly boosts efficiency, not creativity.
Frey said the main advantage humans have over AI in the newsroom is their ability to formulate original ideas and concepts. “AI continues to rely heavily on vast amounts of data for effective training, making it less capable when historical precedents or data are limited,” he said in an email reply to questions from the Number 1 Shimbun.
“Human journalists, therefore, maintain a comparative advantage when covering events unfolding in real time or when conducting investigative journalism that involves gathering original information.”
“Humans remain superior at generating novel ideas. Thus, rather than employing AI primarily to scale up content production, journalists should focus on creating genuinely original content. That also makes them less likely to be replaced by AI.”
Several public and private initiatives have been launched to help journalists add to their skillsets and master the use of AI in the newsroom.
Polis, the journalism think tank at the London School of Economics, launched Journalism AI to promote responsible AI use and democratize access. The program is supported by the Google News Initiative and has already trained more than 3,000 journalists.
Poynter introduced an AI toolkit for journalists created by MediaWise in partnership with the Associated Press and supported by Microsoft. The venture’s aim is to build awareness of AI resources while focusing on tech literacy and ethics. The guide says media should communicate how AI has been employed in the journalistic process, such as transcription or analysis, as well as detailing its human component, which would include verification of AI-related content.
In a March seminar on AI and the Future of News held by the Reuters Institute and the University of Oxford, panelists noted that AI was the latest effort by tech companies to acquire and control social media audiences. This, they said, had led to greater collaboration with news organizations, but also to direct competition, including in Japan.
Last month, the Japan Newspaper Publishers and Editors Association called on AI companies for the second straight year to comply with existing rules on intellectual property (IP) rights, and for the government to introduce laws and regulations to better protect content creators and IP. It contended that AI companies were “freeriding on their news content,” even though the firms had been asked to stop aggregating, using and repackaging news reports.
According to the Asahi Shimbun, many Japanese news companies now explicitly prohibit generative AI from using their website content, a response to the spread of Retrieval-Augmented Generation systems and other AI technologies that use search to scrape content and generate answers to user questions. Google rolled out such services in 2023.
Japanese news organizations say that while the use of AI-generated responses is on the rise, their own original content is being bypassed, with user access limited by so-called zero-click searches. AI-generated search results can also contain incorrect information.
What is the outlook for journalism and AI? In a Poynter podcast in June, Alex Mahadevan, director of the U.S.-based MediaWise and co-author of its AI ethics guide, was optimistic. “I see a lot of promise in AI. It can help us personalize videos, newsletters … it can help turn anyone into a data journalist,” he said.
“Bureaus in rural areas are closing everywhere, newspapers are not covering communities that they used to. AI can help do that. It can hold the powerful to account.”
Dan Sloan is president of the FCCJ