FEED Autumn 2023 Web

IS IT OPTIMAL? AI is a useful tool for automating smaller tasks, but isn't the solution to all problems

Artificial intelligence is everywhere, whether we like it or not, with programs such as ChatGPT catching the attention of media, businesses and individuals alike. As it becomes increasingly commonplace, organisations should consider how to leverage its capabilities, from saving money to boosting workflow efficiency. "If you're not talking to your teams about AI and the tools they have access to, you should be," states Jenn Jarvis, product manager of editorial workflow at Ross Video. "How can AI help us work better, faster and more accurately?" Jarvis asks. But there are a few things to consider before implementing AI.

IDENTIFYING THE PROBLEM

"When a customer comes to me and says, 'I want to use AI in my workflow,' my question is, 'What is the problem you're trying to solve?' Because AI may be part of that solution and it may not," explains Jarvis. AI can be a useful tool for many, but it's important to consider why it needs to be used at all. It's easy to get caught up in the next best thing and be tempted by the prospect that AI will solve all your problems. Jarvis suggests using AI to complement current processes, rather than replacing them altogether. It often comes down to time management – if businesses remove the small, more tedious tasks, they can then focus on creating better content. For broadcast journalists, this extra time could prove immensely valuable.

AI AS A BUSINESS DECISION

AI is as much a financial decision as a strategic one. As Jarvis points out, "AI isn't free. We've been led to believe we just all get access to it, that's not how it works – there's a business around it." Whether to use AI will come down to a cost analysis, and each organisation is a separate case. Jarvis believes business leaders should ask: "How much time and money are we saving by using this technology? And then weigh that against the cost."

A GAME CHANGER

While it may seem like it, the concept of AI is not new. "Some news agencies have been using AI to write stories for almost a decade," explains Jarvis. That being said, the broadcast industry is no stranger to change, and the rate is only getting faster. "Most of our customers used to do massive workflow overhauls once a decade. Now, it's not unreasonable to have pretty significant workflow changes every 12 months," Jarvis says. With change comes anxiety and excitement over new tools, faster workflows, better results and bigger returns. But Jarvis urges the industry to take a step back from the buzzwords and do some critical thinking. "I don't think as an industry we're reflective enough about how we can do our jobs better. That's really what it comes down to – taking a hard look at the ways we work and looking for inefficiencies in our workflow."

THE ROLE OF THE HUMAN

Most of what AI does is finite, according to Jarvis – it can easily be judged as right or wrong. For example, audio transcription services can replace the laborious process of note-taking, but users will notice if a word or phrase doesn't seem quite right. In this instance, AI can be useful and time-saving, but its limitations are obvious. AI lacks editorial judgement, opinions and original thought – all essential qualities of a good journalist. There are certain types of content, like the score of a game or financial information, that won't be impacted if written by AI. But longer features, essays and more subjective pieces require a relationship with either the publisher or writer.

UNCHARTED TERRITORY

Consumers want to know precisely where their information is coming from. Since AI is being used more frequently to create original content, some key risks arise: namely, trust, transparency and misinformation. "As we start incorporating AI into work, we're clear with our audience and viewers to maintain that level of trust as far as where and how we're using it," explains Jarvis. When Ross Video integrated ChatGPT into Inception News, it created a sort of paper trail to flag where the software had been implemented. This way, consumers would know exactly which content had been touched by AI, allowing them to apply extra scrutiny as needed. ChatGPT – still in its 'investigative phase' according to Jarvis – has also been known to produce factually incorrect information. If businesses choose to rely on AI, they still need to take legal responsibility for the content they're sharing. Human fact-checkers will be more necessary than ever.

WATCH ME! Make great news stories with Ross Video's end-to-end newsroom technology

