DEFINITION January 2022 – Web

VIRTUAL PRODUCTION ROUND TABLE – INDUSTRY

short-form content and adverts – it’s no longer just for high-budget TV and film.

HUNT: It’s great that Epic (and others) have seen the value of focusing on training and developing features geared solely towards virtual production. Having their team available to guide you through the process and help solve any problems is invaluable.

KAESTNER: Getting images rendered in real time displayed within a camera frustum that’s also being tracked in real time is quite a technical challenge, and latency will always exist. Combine that with the difficulty of capturing these images with a physical camera, and you have plenty of areas that are constantly improving. Camera tracking systems are always getting more accurate, able to provide tracking data to the game engine faster. The game engine will benefit from more powerful render nodes, with better GPUs and data handling. LED walls and their firmware are regularly upgraded to allow for new protocols, or to accommodate new camera software. This means refresh rates can continue to increase, keeping camera shutter and display phase in sync across render nodes, on a display several meters tall and maybe more than 50 meters wide. Any tweak or technical advancement to these components in this delicate chain of command – at 24Hz or more – demands constant feedback loops and optimisation. In the world of VP, all components evolve all the time – exponentially.

HOCHMAN: Wow, this is quite a short question with a lot of answers! We’ve

of that rendered content to get distributed and displayed.

LEVY: We have observed that all the technology partners involved in defining workflows and providing hardware/software solutions (including Arri) are engaged in a very open dialogue with each other. This has allowed the industry to move much faster than before, and lets us develop in a more collaborative and efficient way.

PRAK: With LED volumes, latency is now being brought down to one frame. At the moment, nobody is able to drop below that figure.
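For context on that one-frame figure, a frame’s duration depends only on the frame rate, so "one frame of latency" means different absolute delays at different rates. A minimal sketch (illustrative only, not from the panel):

```python
# Illustrative sketch: what "one frame of latency" means in milliseconds
# at common production frame rates.
def frame_latency_ms(fps: float) -> float:
    """Duration of a single frame, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 25, 30, 50, 60):
    print(f"{fps} fps -> {frame_latency_ms(fps):.1f} ms per frame")
```

At the 24Hz cinema rate mentioned above, one frame is roughly 41.7ms – the window within which tracking, rendering and display all have to complete.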

synchronised LEDs forever, because that’s a requirement for making a display with thousands of tiles operate as a single entity. But camera tracking and rendering have always been separate systems, operated by different departments and involving entirely different workflows. Now that all of these technologies are being brought together, we’re seeing enormously fast-paced advancement, with all the interconnected equipment becoming an ecosystem. This is still in its infancy, and Epic Games/Unreal is pushing to be a big part of it, from the software/render side of things. Megapixel is working diligently to make Helios the core central infrastructure needed for all

JEREMY HOCHMAN CEO, Megapixel VR

DAVID LEVY Director of business development, global solutions, Arri Rental

MARINA PRAK Marketing manager, Roe Visual

DAN HAMILL Co-founder and commercial director, 80six

Hochman is an entrepreneur and designer who made tech history in 2002, when he co-founded Element Labs, the company that gave birth to the creative LED industry.

Levy comes from a creative background, and was lead camera and lighting specialist at Al Jazeera for 11 years, before joining Arri Rental in 2017.

With over 30 years in the entertainment industry, 20 of which were in marketing, Prak is currently responsible for growing Roe Visual’s brand in Europe and the Middle East.

Hamill co-founded 80six with a passion for providing spectacular visual events, utilising over 15 years of professional experience in production.

