ROUND TABLE
virtual elements are added. It also gives the presenters confidence when they interact with virtual elements. Overall, the creative advantages outweigh the technical challenges, which are now largely known and tested. As a result, we are seeing more and more productions investigating these technologies to enhance the audience experience. Sport is a perfect test space, where a lot of on-screen graphics are used for game information and data. Building new and creative ways for the pundits to interact with virtual and XR elements, and to present them to the audience, has been creatively engaging.

DEF: What challenges and opportunities do you foresee with the integration of virtual production, XR and real-time rendering into broadcast settings?

QJ: Opportunities are expanding as VP and real-time rendering technologies become increasingly accessible and affordable. Each year, live rendering quality improves significantly without a major increase in performance costs. We’re also seeing ICVFX workflows becoming more refined, making them adaptable to a broader range of use cases. This opens the door to more interactive and immersive environments, offering flexibility as well as the ability to make adjustments on the fly. That said, there are still some challenges. These systems remain expensive and complex, requiring careful integration from pre-production onward. Pre-production planning is crucial, which isn’t always easy in live and broadcast settings. As more tech layers are added, the risk of bugs, latency and performance issues increases. Plus, the steep learning curve for training on the various systems and pipelines can be a real hurdle for teams to overcome.
FINE TUNING: Blackmagic Design’s innovations include an AI-based Voice Isolation tool in DaVinci Resolve Studio, able to analyse audio clips and separate unwanted background noise from human voices
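For a sense of how this kind of source separation works, here is a minimal, generic spectral-gating sketch in Python: it estimates a noise profile from a quiet stretch of the clip, then attenuates time-frequency bins that sit near that noise floor. It is only an illustration of the broad idea, not Blackmagic’s AI-based Voice Isolation method; the file name, the half-second noise window and the threshold are arbitrary assumptions.

# Generic spectral gating sketch, NOT Blackmagic's Voice Isolation algorithm
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft, istft

rate, audio = wavfile.read('clip.wav')            # hypothetical input file
audio = audio.astype(np.float32)
if audio.ndim > 1:                                # fold stereo to mono
    audio = audio.mean(axis=1)
audio /= max(np.abs(audio).max(), 1.0)            # normalise to roughly [-1, 1]

f, t, spec = stft(audio, fs=rate, nperseg=1024)   # hop size defaults to 512

# Assume the first 0.5 seconds contain only background noise
noise_frames = max(int(0.5 * rate / 512), 1)
noise_profile = np.abs(spec[:, :noise_frames]).mean(axis=1, keepdims=True)

# Keep bins that rise well above the noise floor; mute the rest
gain = (np.abs(spec) > 2.0 * noise_profile).astype(np.float32)
_, cleaned = istft(spec * gain, fs=rate, nperseg=1024)

wavfile.write('clip_cleaned.wav', rate, cleaned.astype(np.float32))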
PC: ARRI’s Solutions team has developed an array of tools to address the challenges of ICVFX and workflow efficiency, focusing on colour accuracy and metadata handling for VP. Key innovations include ARRI Color Management for Virtual Production, the Live Link Metadata Plug-in for Unreal Engine and ARRI Digital Twins for previsualisation and techvis, which simulate their colour science accurately. Plus, SkyPanel X features different light engines.

One of the biggest challenges in VP is maintaining colour consistency when filming content from LED walls. ARRI’s Color Management for Virtual Production addresses this by enabling precise LED wall calibration, minimising the post-production work needed to recover colour information. This maximises shooting efficiency and gives creative teams confidence in achieving high-quality images on set. The ARRI Digital Twin allows VP set-ups to be previsualised and fine-tuned before physical production. By linking real-world and virtual lighting systems, it guarantees seamless synchronisation and reduces both pre- and post-production costs. ARRI’s Live Link Metadata Plug-in for Unreal Engine further enhances production efficiency by streaming real-time camera and lens metadata directly into Unreal Engine, enabling dynamic, real-time adjustments to virtual environments. This tool reduces manual data entry and enables smoother collaboration between on-set teams and post-production.

DEF: What are the biggest innovations in display technologies that are influencing the future of broadcast environments?

PC: ARRI developed REVEAL to significantly enhance image quality, capture colours more accurately and closer to the human visual perspective, and maintain the same colour tones for SDR and HDR, while at the same time significantly increasing workflow efficiency. ARRI’s solution separates the creative render transform from the display render transform, acknowledging that today’s displays cannot fully capture the range of colours and detail our cameras can. This flexibility allows us to future-proof our systems, ensuring that as display technology improves, the captured images will retain their artistic integrity. Our cameras also support simultaneous SDR and HDR output, enabling broadcasters to cater to various display capabilities without compromising image quality.

QJ: One of the most exciting innovations right now is GhostFrame technology, which lets each camera capture different content from the same LED wall, opening up possibilities in broadcast. On the LED panel front, RGB cyan and RGB cyan-amber configurations are being tested to improve colour rendering, both for in-camera displays and as lighting sources, delivering a more accurate visual experience. We’re also seeing rapid progress in deployable LED
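As a rough illustration of the render-transform split PC describes above, the sketch below applies one shared creative adjustment to scene-referred pixel values, then derives SDR and HDR deliverables through separate display transforms: a Rec.709 OETF and an SMPTE ST 2084 (PQ) curve. It is a simplified, generic model, not ARRI’s REVEAL colour science; the exposure value and the 203-nit mapping for reference white are assumptions made for the demo.

import numpy as np

def creative_transform(scene_linear, exposure=1.0):
    # Shared creative 'look' applied once, independent of the target display
    return np.clip(scene_linear * exposure, 0.0, None)

def rec709_oetf(linear):
    # BT.709 OETF for the SDR deliverable (input clipped to 0-1)
    l = np.clip(linear, 0.0, 1.0)
    return np.where(l < 0.018, 4.5 * l, 1.099 * np.power(l, 0.45) - 0.099)

def pq_encode(nits):
    # SMPTE ST 2084 (PQ) inverse EOTF for the HDR deliverable
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y ** m1) / (1.0 + c3 * y ** m1)) ** m2

# Scene-referred demo values; 1.0 is treated as diffuse white
scene = np.array([0.0, 0.05, 0.18, 1.0, 4.0])
graded = creative_transform(scene, exposure=1.2)   # one creative render transform

sdr_signal = rec709_oetf(graded)                   # SDR display transform
hdr_signal = pq_encode(graded * 203.0)             # HDR display transform, 1.0 -> 203 nits

print('SDR:', np.round(sdr_signal, 3))
print('HDR:', np.round(hdr_signal, 3))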