INDUSTRY. VIRTUAL PRODUCTION ROUND TABLE
What’s the biggest change happening in the industry driving the next generation of virtual production technology?

TIM KANG: Two things. First, on-set image-based lighting (OS-IBL) build-outs will become the focus, as LED tile backdrops have now permeated production facilities worldwide. Volume walls cannot do this well. I saw this problem two years ago and foresaw that volume stages will build out accurate image-based lighting volumes using a mix of different fixtures arrayed across stage walls and ceilings beyond the LED wall itself. We are just at the cusp of this, especially since it won’t cost as much as LED walls, OS-IBL products like Quasar Science’s RR and R2 units keep coming out, and the techniques to control and drive them continue to democratise. Secondly, AI-based technology will reduce the need for overly large ‘brain bar’ or volume tech teams. As an example, Neural Radiance Fields (NeRF) might allow video plates to have the same interactive parallax that currently only 3D engine environments can supply.

DAVID LEVY: Education! Many more in the industry have now been exposed to, or have taken the initiative to educate themselves about, virtual production and the variations of the technology across pre-production, on-set work and post-production. Once the opportunities that virtual production offers across each of these stages are understood, productions can identify if and where it adds value or efficiency, and whether virtual production is the most effective solution for delivering the project. This informed adoption and utilisation will drive the next generations and iterations of virtual production.

CONOR MCGILL: Increasingly reliable and robust video-over-IP protocols such as SMPTE ST 2110 will greatly enhance the flexibility, scalability and performance of virtual production workflows. This is because the vast amounts of visual data necessary for photorealistic real-time environments and HDR video content can be handled much more economically and creatively over a unified fibre-optic or copper network than over traditional dedicated audio-visual signal transmission infrastructure.

CESAR CACERES: The video industry is continually evolving, and advancements in technology are creating new possibilities for user experiences, ranging from rendering quality to workflows. The implementation of calibrated RGBW in LED processing is expected to be a game changer for dynamic lighting quality. The extra emitter offers improved colour rendering, and having this fourth emitter correctly calibrated ensures that accuracy continues to meet expectations. Additionally, as panel components continue to improve, pixel pitches will get finer and refresh rates will increase. Integration with other technologies will improve the overall quality of virtual production processes and make them more efficient.

J.T. ROONEY: Client interest and adoption have significantly increased since our team began pioneering extended reality (XR) technology. Likewise, democratisation and education have brought broad changes to the virtual production landscape, particularly in the last year. It’s interesting to look back to when our team was first exploring this in 2018-2019. The ecosystem was quite different, heavily focused on the cinematic space, with films like Avatar and The Jungle Book, and on advanced previsualisation set-ups. In the last few years, the industry overall has expanded, and we now see VP present across the entire spectrum of productions, from short-form content creation, commercial projects and independent films all the way up to major film and TV studio work and live broadcast. Because of this incredible expansion and ever-growing demand, we are excited to have been able to position ourselves uniquely as VP specialists in multicamera broadcast, virtual set extension and mixed reality work. We receive emails and phone calls daily from various brands, organisations and productions looking to adopt this new workflow, and we honestly couldn’t be happier to be their partner on this new pathway.

OLAF SPERWER: Changing processes are driven by the question of which parts of the production life cycle can be made virtual. What makes sense? How do we merge traditional and virtual workflows? A good example is set construction. We are looking for a new, overarching approach to optimally merging physical and virtual sets. What will drive the next generation of virtual production technology will kick off when the organisation and workflows are truly implemented and
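Kang’s point about on-set image-based lighting, driving fixtures arrayed beyond the LED wall from the same environment content, can be pictured with a minimal pixel-mapping sketch. The sampling region, the normalised fixture position and the 8-bit output below are illustrative assumptions for this example, not Quasar Science’s actual control pipeline.

```python
import numpy as np

def fixture_rgb_from_environment(env_frame: np.ndarray,
                                 u: float, v: float,
                                 patch: int = 32) -> tuple[int, int, int]:
    """Average a patch of a linear-light environment frame around normalised
    coordinates (u, v) and return 8-bit RGB drive levels for one fixture
    mapped to that direction. env_frame: H x W x 3 floats in [0, 1]."""
    h, w, _ = env_frame.shape
    cx, cy = int(u * (w - 1)), int(v * (h - 1))
    half = patch // 2
    region = env_frame[max(cy - half, 0):cy + half,
                       max(cx - half, 0):cx + half]
    rgb = region.reshape(-1, 3).mean(axis=0)
    # Simple gamma for an 8-bit control signal; a real fixture would use a
    # calibrated transfer function instead.
    encoded = np.clip(rgb, 0.0, 1.0) ** (1.0 / 2.2)
    return tuple(int(round(c * 255)) for c in encoded)

# Example: a ceiling-mounted fixture mapped to the top of the frame.
frame = np.random.rand(1080, 1920, 3)  # stand-in for a video or HDR plate
print(fixture_rgb_from_environment(frame, u=0.5, v=0.05))
```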
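To put McGill’s ‘vast amounts of visual data’ into rough numbers, the back-of-envelope calculation below estimates the uncompressed active-video bandwidth of a single UHD 10-bit 4:2:2 stream as carried by SMPTE ST 2110-20. It ignores packet overhead and blanking, so treat it as an order-of-magnitude sketch rather than a network design.

```python
# Rough uncompressed bandwidth for one UHD video essence flow
# (ST 2110-20 carries active picture only; RTP/UDP/IP overhead ignored).
width, height, fps = 3840, 2160, 50
bits_per_pixel = 20            # 10-bit 4:2:2: 10 (Y) + 10 (Cb/Cr shared per pair)
frame_bits = width * height * bits_per_pixel
gbps = frame_bits * fps / 1e9
print(f"~{gbps:.1f} Gb/s per camera or render feed")   # roughly 8.3 Gb/s
# Several such feeds, plus HDR or 4:4:4 variants, quickly justify a unified
# 25/100GbE IP fabric instead of point-to-point dedicated video runs.
```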
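Caceres’ point about the fourth, white emitter can be made concrete with a simplified RGB-to-RGBW conversion. The white-extraction approach and the calibration gain below are illustrative assumptions for the sake of the example; they are not a description of Brompton’s actual LED processing.

```python
def rgb_to_rgbw(r: float, g: float, b: float,
                white_gain: float = 1.0) -> tuple[float, float, float, float]:
    """Naive white-extraction RGB->RGBW conversion (values in 0..1).

    white_gain stands in for a per-panel calibration factor: the common
    component sent to the white emitter is rescaled so that, once the
    emitter's measured output is accounted for, the mixed colour stays
    close to the intended target.
    """
    w = min(r, g, b)                  # common component handled by the white LED
    w_cal = min(w * white_gain, 1.0)  # apply the calibration scale factor
    return r - w, g - w, b - w, w_cal

# A warm near-white: most of the light moves to the calibrated white emitter.
print(rgb_to_rgbw(1.0, 0.9, 0.8, white_gain=0.95))
```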
THE INTERVIEWEES
TIM KANG
Quasar Science | Principal engineer, colour and imaging
DAVID LEVY
Arri | Global business development director
CONOR MCGILL
Pixera | Global business development manager
CESAR CACERES
Brompton Technology | Product lead
J.T. ROONEY
XR Studios | President
OLAF SPERWER
Roe Visual | Business development, virtual production & XR stages