» While studios can save time and money by using AI for repetitive tasks, the real investment should go into training up their teams and making sure human oversight is part of every step «
losing relevance with audiences,” MacKinnon concludes. “If others are using AI to speed up workflows and make them more engaging, those who don’t will struggle to keep up.”

While many companies are racing towards AI, some are purposefully opting out. Pfitzner sees this as a potentially sound business strategy: “Avoiding AI could serve as a unique quality marker in an AI-saturated market – like the appeal of handcrafted goods in today’s automated world.”

Rules, regulation and responsibility

In the grand scheme of things, it’s still early days for AI. ChatGPT launched in 2022, but widespread adoption of AI-based technology is only just getting into full swing. “The industry is still figuring it out,” admits MacKinnon. “There are conversations happening, but transparency is key, and audiences should know when AI is involved. It’s up to us to make sure it’s enhancing the content experience for audiences and not simply replacing human decision-making in journalism.”

Samake alludes to ‘some guidelines’ but doesn’t necessarily see them being followed. “On social media platforms like Facebook, I don’t always see statements that an image has been created artificially (when I know that it has). That is incredibly dangerous,” she says.

Since our conversation, Meta has announced that it would end its fact-checking program – an undeniable step backwards and an invitation for rampant misinformation. While Meta’s decision is a disturbing one, the bigger picture needn’t be so bleak. “Across industries, focus on ethical AI is gaining momentum, supported by initiatives such as the EU AI Act,” describes Pfitzner. “By prioritising transparency, accountability and innovation, early adopters of responsible AI practices are setting a powerful example. Aligning with these principles fosters trust and ensures that AI continues to serve as a force for good.”

Integrity is irreplaceable

Most newsrooms are privately funded, so staying in business is the first, most basic goal. Once they’ve avoided bankruptcy, companies can then deliver on their mission – for newsrooms, this is usually to provide high-quality, trustworthy journalism. There are two key ways to do this: hire talented humans and follow ethical workplace practices.

“Studios and companies can strike a balance between profitability and journalistic integrity by prioritising responsible AI usage,” suggests Pfitzner. This involves installing ‘robust cybersecurity measures’ and ensuring ‘transparency about AI practices’. He also encourages companies to advocate for fair compensation and to participate in industry-wide discussions. As AI evolves, updated guidelines will have to follow.

Samake believes that ‘talented, passionate writers and journalists’ are the backbone of any newsroom. “Every reporter learns about ethics in school or on the job,” she claims. MacKinnon agrees: “The key is keeping people at the centre of it all.” He continues: “While studios can save time and money by using AI for repetitive tasks, the real investment should go into training up their teams and making sure human oversight is part of every step.”

“At the end of the day, journalism is about trust – something that’s been heavily eroded in the past few years,” he admits, “and we don’t want to compromise it more with any missteps while leveraging AI.”