
automatically,” says Dawes. “By doing this, we were able to extract video clips that were then used by the digital production team in their live shows.” The resulting tool was based on the YOLO neural network and runs on the Darknet open-source machine-learning framework. It was able to automatically detect and locate animals in each scene using object recognition.

“We’ve trained a network to recognise birds and mammals, and it can run just fast enough to find and then track animals in real time on live video,” Dawes says. “We store the data related to the timing and content of the events, and use this as the basis for a timeline we provide to members of the production team. They can use the timeline to navigate through the activity on a particular camera’s video.

“We also provide clipped-up videos of the event. One is a small preview to allow for easy reviewing, and a second is recorded at original quality with a few seconds’ extra video either side of the activity. This can be immediately downloaded to be viewed, shared and imported into an editing package.”

A similar workflow was used for Autumnwatch, which aired in October, with more AI-assisted production likely for Springwatch 2021.

WE INVESTIGATED HOW WE COULD APPLY TECHNOLOGY TO PERFORM TASKS AUTOMATICALLY
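To give a flavour of how a detector like this hangs together, here is a minimal sketch, not the BBC’s production code: it assumes a YOLO model loaded through OpenCV’s Darknet importer, with placeholder config, weights, class labels and stream URL, and it simply logs timestamped detections as raw material for the kind of activity timeline Dawes describes.

```python
# Illustrative sketch only: YOLO-style detection on a live feed via OpenCV's
# Darknet importer, logging timestamped events. File names, class labels and
# the stream URL are hypothetical placeholders.
import time
import cv2
import numpy as np

CFG, WEIGHTS = "yolo-wildlife.cfg", "yolo-wildlife.weights"  # placeholder model files
CLASSES = ["bird", "fox", "badger", "deer"]                  # placeholder labels
CONF_THRESHOLD = 0.5

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_layers = net.getUnconnectedOutLayersNames()

def detect(frame):
    """Return (label, confidence, box) tuples for one video frame."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    h, w = frame.shape[:2]
    hits = []
    for output in net.forward(out_layers):
        for row in output:
            scores = row[5:]
            class_id = int(np.argmax(scores))
            conf = float(scores[class_id])
            if conf >= CONF_THRESHOLD:
                cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
                box = (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))
                hits.append((CLASSES[class_id], conf, box))
    return hits

# Walk the live feed and log timestamped detections for a timeline.
events = []
cap = cv2.VideoCapture("rtsp://camera-feed")  # placeholder stream URL
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for label, conf, box in detect(frame):
        events.append({"time": time.time(), "label": label, "confidence": conf, "box": box})
cap.release()
```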

COVER YOUR TRACKS: For Springwatch, the cameras used the YOLO neural network and machine-learning technology to detect and track wildlife in a scene
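The clipping step Dawes mentions can be approximated with standard tools. The sketch below is illustrative only and assumes the camera feed is also being recorded to disk and that ffmpeg is installed; the padding, file names and encoder settings are placeholders, not the production team’s actual values.

```python
# Illustrative sketch only: cut a preview clip and an original-quality clip
# around a detected event, with a few seconds' padding either side. Assumes
# ffmpeg is on the PATH; all names and settings are hypothetical.
import subprocess

PADDING = 5.0  # seconds of extra video either side of the detected activity

def cut_clips(recording, event_start, event_end, name):
    """Write a low-res preview and an original-quality clip around one event."""
    start = max(0.0, event_start - PADDING)
    duration = (event_end - event_start) + 2 * PADDING

    # Original-quality clip: copy the streams without re-encoding.
    subprocess.run([
        "ffmpeg", "-y", "-ss", str(start), "-i", recording,
        "-t", str(duration), "-c", "copy", f"{name}_full.mp4",
    ], check=True)

    # Small preview: re-encode and scale down for quick review and sharing.
    subprocess.run([
        "ffmpeg", "-y", "-ss", str(start), "-i", recording,
        "-t", str(duration), "-vf", "scale=640:-2",
        "-c:v", "libx264", "-preset", "veryfast", "-an", f"{name}_preview.mp4",
    ], check=True)

cut_clips("camera1.mp4", event_start=120.0, event_end=135.0, name="badger_0001")
```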

