
soundscape as the mixer intended, and create the downmix accordingly to meet the listening requirements of the consumer equipment. In this way, formats like Dolby Atmos can work for 3D soundbars as well as for stereo or binaural playback; it is the metadata that contains all the information needed to produce every mix.

GETTING PERSONAL

Objects unlock opportunities for broadcasters to create more immersive experiences, and the embedded metadata also gives end users the ability to personalise their listening experience. It enables viewers to change the contribution of enabled objects such as crowd noise or commentary, and while we are not quite there yet, broadcasters are close. AI is being trialled to create multiple audio mixes in real time, and the BBC has been testing personalised mixes from events like Eurovision, in which Edwards was also involved.

“At Eurovision, metadata enabled the BBC to generate an immersive mix from the feeds I was sending them, but also to offer a choice of commentary. It was an opportunity to look at real-time metadata manipulation, where consumers can choose how they listen to the show. In the future, this will extend to other languages.

“Viewers will be able to select not only which audio they want to listen to but also where that commentary sits in the sound field. You could choose to have the commentator as if they were sat alongside you, in the front and centre as normal, or even behind you. You could even turn it off. It goes beyond just immersive; it’s taking what’s being generated within the immersive environment and using it in a different way.”
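To make the mechanics concrete, the sketch below shows, in simplified Python, how an object-based renderer can derive different mixes from the same objects purely by reading their metadata, including honouring a viewer’s choice to turn the commentary down or off. The AudioObject fields, the constant-power panner and the user_gains_db parameter are illustrative assumptions only; they are not the ADM specification or any broadcaster’s actual renderer.

```python
import numpy as np

# Minimal sketch of object-based rendering: each object is a mono signal
# plus metadata. Real systems (ADM, Dolby Atmos) carry far richer metadata
# and use proper 3D panners; constant-power stereo panning stands in here.

class AudioObject:
    def __init__(self, name, samples, azimuth_deg=0.0, gain_db=0.0, user_adjustable=False):
        self.name = name                        # e.g. "commentary", "crowd"
        self.samples = samples                  # mono float array
        self.azimuth_deg = azimuth_deg          # -90 (hard left) .. +90 (hard right)
        self.gain_db = gain_db                  # mixer-authored level
        self.user_adjustable = user_adjustable  # flag: may the viewer change it?

def render_stereo(objects, user_gains_db=None):
    """Downmix objects to stereo, applying any viewer personalisation."""
    user_gains_db = user_gains_db or {}
    length = max(len(o.samples) for o in objects)
    out = np.zeros((2, length))
    for obj in objects:
        gain_db = obj.gain_db
        if obj.user_adjustable and obj.name in user_gains_db:
            gain_db += user_gains_db[obj.name]   # e.g. turn commentary down
        lin = 10 ** (gain_db / 20)
        # Constant-power pan: map azimuth to left/right weights.
        pan = (obj.azimuth_deg + 90) / 180 * (np.pi / 2)
        left, right = np.cos(pan), np.sin(pan)
        out[0, :len(obj.samples)] += obj.samples * lin * left
        out[1, :len(obj.samples)] += obj.samples * lin * right
    return out

# The same objects yield different mixes purely from metadata choices.
fs = 48_000
t = np.arange(fs) / fs
crowd = AudioObject("crowd", 0.1 * np.random.randn(fs), azimuth_deg=0, gain_db=-6)
commentary = AudioObject("commentary", 0.5 * np.sin(2 * np.pi * 220 * t),
                         azimuth_deg=0, gain_db=0, user_adjustable=True)

default_mix = render_stereo([crowd, commentary])
quiet_commentary = render_stereo([crowd, commentary], user_gains_db={"commentary": -12})
commentary_off = render_stereo([crowd, commentary], user_gains_db={"commentary": -120})
```

The point of the sketch is that the mix is not fixed at production: the same objects plus metadata can yield the default mix, a quieter-commentary mix or a commentary-free mix at playback time.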

GOING OVER THE TOP

Ultimately, as much as consumers are waking up to the benefits, we’re still waiting for the tipping point. Consumer buy-in is there, delivery methods like the Audio Definition Model (ADM) are proven, and consumer equipment that enables spatial listening is everywhere. Content providers are also looking at ways to increase the value proposition for their customers, with Netflix already offering its stereo customers programming that creates spatialised experiences. Since June 2022, the streaming platform has been using Sennheiser’s Ambeo 2-Channel Spatial Audio renderer on more than 700 presentations to create what Sennheiser calls an ‘enhanced two-channel mix’ from an immersive signal. According to Edwards, it is in streaming that immersive audio is most likely to find a home.

“While Sky’s standard production format for delivery is 5.1, and Dolby Atmos for premier sports, ITV has no requirement for any programming to be anything more than stereo, and other terrestrial broadcasters who have experimented with multichannel formats have pulled away because of delivery issues,” he suggests. “However, the Eurovision 5.1 multichannel mix is available on the BBC’s streaming platforms and on YouTube in that same format. The market for this technology isn’t generally terrestrial; it is being led by the streaming services.

“It’s going to come from the commercialisation of the product. Rather like Apple has done with its music service, somebody will take ownership and say we’re going to be a channel that does UHD and immersive delivery, much as Sky have done for the Premier League, because those two things will go side by side.

“Broadcasters have got to have a business plan for it. In the meantime, we’re all happy to experiment, to dabble creatively and learn about it, because at some stage somebody in a suit will make it a selling point. And then all of a sudden they’ll ask where the content is.

“And we can all go, yeah, we can do that – and it becomes the next big thing.”
