LIVE Mar/Apr 2025 – Web

INDUSTRY ICONS


After being with Dark Bay, I left to join Woodblock Animation Studio as head of XR and real time.

Can you give us an example of where real time, VFX and XR come together in your daily life?

At Woodblock, I worked on our first project for Sphere. We created a spot for Aston Martin, displayed on the outside of the spherical LED screen in Las Vegas during the Formula 1 race weekend. The cars in the spot were rendered in Unreal, combined with Houdini cloud simulations. Since it was our first project for such a special screen, we debated a lot about how best to use this canvas for maximum effect, and even built a web viewer in Babylon.js that you could use to play the spot, fly around and jump to specific locations. A few projects for Sphere's exterior later, this simple viewer is still being used for internal review purposes – and with clients. It's easy to use and runs on almost any device. For a more recent project, we had to produce a lot of content for a live show and used VR extensively to review the shots and simulate how the show would look in the end. Since most software packages can talk to each other nowadays, you can do fun stuff like stream your viewport from your compositing tool or renderer via NDI directly to the virtual media plane in VR, and teleport around the venue to see it from new angles. In the early days of production, it's crucial to understand the limits and challenges as fast as possible. I see a lot of good use cases for XR tech, especially in live entertainment – be it a concert, theatre or an immersive art exhibition – to preview what you're doing by going into a virtual space and experiencing how it will look. In my opinion, as soon as the final product isn't going to be watched on a flat two-dimensional screen at arm's length, it's worth thinking about VR.

You've worked across quite a wide variety of mediums. Is there a particular area that you prefer working in?

I don't have a specific favourite, but I like the live components. During live shows, you can immediately see how different people are reacting to things, and they have this shared experience. It's more interesting with live content because things can go wrong and it's a bit nerve-racking – I think readers of your magazine will know and understand this. What I enjoy most is combining all my different interests and finding the links between these different mediums. For example, on live projects VR isn't always the final output medium, but it can be used as a tool to make the final show better. I use my skills from virtual production, VFX and VR for the big screen, and it all comes together in the end to make a new project. I don't have one favourite medium, but the biggest is probably VR – I think it still has so much potential. But I enjoy the combination of working across different mediums.

Do you have any favourite live projects you've worked on?

I would count 1899 as a live project, because our job was to make it work on set while 100 people were running around the studio. It was cool making Unreal work in that environment. It was super interesting to be at the production in the studio and play my part in making the show shine.

“During live events, you can immediately see how people are reacting to things, and they have this shared experience”

Stärk worked on Aston Martin’s spot at Sphere in Las Vegas during the city’s Formula 1 race
