Dave MacKinnon, vice president of product management at Clear-Com, has a different view. “AI has incredible potential to improve how we work, especially in journalism,” he believes. “It’s not about replacing people, but about making journalists more powerful; it’s a ‘force multiplier’. When used thoughtfully, AI can support creativity and efficiency in ways previously unimaginable.”

From ink to algorithm

Should AI have a place in broadcast journalism and, if so, what place is that? All four interviewees believe the answer, in some capacity, is yes, and they also largely agree on how it should be used. For instance, Pfitzner argues that AI can “automate tedious tasks, freeing up journalists for creative and investigative work.” More specifically, he continues, “AI could help draft new forms of content like summaries, newsletters or audio scripts. It can also analyse large volumes of source material to identify promising stories; this includes datasets, PDFs, document translation, as well as monitoring social media, government websites and financial reports.”

MacKinnon adds that AI is best used “as an assistant, doing heavy lifting behind the scenes, handling repetitive tasks like transcription or analysing large amounts of data quickly, so reporters and producers can focus more on the big stories.”

Soso echoes much of what’s been said, naming ‘research, article ideation, fact checking and data analysis’ as possible applications of AI, while Samake sees it more as a jumping-off point, helping strapped-for-time journalists generate or summarise ideas. AI can be particularly beneficial for smaller newsrooms that are understaffed or low on budget.

Despite AI’s potential positives, maintaining credibility is the highest priority for journalists, with Pfitzner suggesting that “using AI for creative content generation carries inherent risks around eroding trust.” Other concerns, such as transparency and plagiarism, also came up in conversation, with Soso arguing that “using ChatGPT to generate full articles to then pass off as your own is not something that should be encouraged.” He suggests regulation to prevent this.

Meanwhile, Samake argues that AI’s use should always be flagged. She believes the technology “should never be used to pass as a genuine piece of media. For example, to generate a video or photo and claim that it is ‘real’. This leads to misinformation and distrust.” In a world where many already lack trust in the media, this risk is unaffordable.
MacKinnon neatly summarises the issue: “AI has a role to play, but it’s all about balance. It can be a great tool, but it should never replace the human element that makes journalism what it is. It’s the journalists and producers who bring the heart, context and credibility to the final product.”

Friend or foe?

With generative AI on the rise, many creatives (including writers and reporters) fear job displacement and copyright infringement. “There’s a reason why it was such a topic of contention during the 2023 Hollywood labour disputes, specifically the WGA strike,” states Soso. There’s a growing sentiment that the journalism industry, currently rife with mass layoffs, is also under siege. “The practice of scraping content without proper compensation poses a serious existential threat to creators, with the potential to undermine their livelihoods and