Definition April 2025 - Newsletter

TECH VIRTUAL PRODUCTION

The most complex way to get an image onto a screen involves a full 3D graphic design effort. Since Unreal Engine isn’t ideal for modelling, assets are often created in software like Maya before being textured and lit to balance visual fidelity against server performance limits. Lighting the real space to match might require tools like Assimilate’s Live FX, which turns video images into lighting control data (a rough sketch of the idea appears below), then rigging lighting devices to suit. Calibrating lighting to match the screen image is another process that may still be somewhat manual. That description is inevitably incomplete, and it still sounds like a lot (because it is).

VP facilities are built from a disparate stack of equipment mostly inherited from other industries. PlayStations and Xboxes have created a vast market for 3D rendering devices, but the appetite from VP studios is probably too small for anyone to pay for the R&D on a convenient, single-purpose box which does it all, even if such a thing were possible to imagine.

This all sounds a bit grim, but we also know that a lot of productions are having a wonderful time shooting car interiors against LED video walls. Clearly, not every episode of this season’s new police procedural is shouldering the VFX workload of a nine-figure superhero movie. How do we wrangle such a pile of equipment on a smaller show? To a great extent, we don’t, because the world is realising that many applications of ICVFX only need a subset of the full arsenal.

Aliens didn’t use camera tracking, 3D rendering or even real-time colour correction. Interactive lighting involved waving flags in front of lights. Screen content came from a model unit on the adjacent stage, working under the gun to create backdrops around the main unit’s schedule (which makes it difficult to complain about the pre-production workload of preparing material for an LED wall).
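To make the video-to-lighting idea concrete, here is a minimal sketch of the core step, assuming frame capture and a DMX transmitter (sACN or Art-Net) already exist elsewhere in the pipeline. This shows the general technique, not Live FX’s actual interface; every name in it is hypothetical.

    # Hypothetical sketch: derive lighting control data from screen content.
    # Per frame, sample the wall pixels nearest a fixture, average them,
    # then map the result to 8-bit DMX channel levels.

    def average_colour(pixels):
        """Average an iterable of (r, g, b) tuples, 0-255 per channel."""
        r = g = b = n = 0
        for pr, pg, pb in pixels:
            r, g, b, n = r + pr, g + pg, b + pb, n + 1
        return (r // n, g // n, b // n) if n else (0, 0, 0)

    def rgb_to_dmx(rgb, intensity=1.0):
        """Scale an averaged colour and clamp to legal DMX levels."""
        return [max(0, min(255, round(c * intensity))) for c in rgb]

    # levels = rgb_to_dmx(average_colour(region_near_fixture))
    # ...then transmit the levels over sACN/Art-Net to the fixture's address.

Most of the real work lies in deciding which screen region feeds which fixture, which is part of why the calibration stage stays so manual.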

FROM A TO B: Signiant Media Shuttle has portals for accessing and transferring large files

What’s more, LED walls wouldn’t exist until decades after Aliens, so it relied on 35mm projection. Black picture areas were still a white screen, so the slightest stray light would destroy contrast (that’s what compromises the least-successful shots). It was in-camera compositing on hard mode, and it worked. That’s not to propose classic back projection as the right solution for 2025, but between those two extremes lies a huge range of options. Using a live-action plate rather than real-time rendering is standard procedure for convenient car interiors. A couple of early ICVFX experiments used video walls rented from live-events companies, without even synchronising the screen to the camera. That demands a crew who knows what they’re doing, to put it mildly. Fortunately, such crews are readily available.
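As an aside on what synchronisation buys you: visible banding appears when the camera’s exposure covers a non-integer number of panel refresh cycles and the two devices aren’t genlocked. A quick, illustrative check (the 3840Hz refresh rate below is a common panel spec, not a universal one):

    # Does a camera exposure cover a whole number of LED refresh cycles?
    # A non-integer result invites flicker or banding without genlock.

    def refresh_cycles_per_exposure(fps, shutter_angle_deg, refresh_hz):
        exposure_s = (shutter_angle_deg / 360.0) / fps
        return exposure_s * refresh_hz

    # 24 fps at a 180-degree shutter exposes for 1/48 s; at 3840 Hz that
    # is exactly 80 refresh cycles, so this combination is safe.
    print(refresh_cycles_per_exposure(24, 180, 3840))  # 80.0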

Even at the high end, time and experience have smoothed out some of the early wrinkles. In particular, all that equipment being separate makes it highly configurable. High-contrast LED walls will always be easier to light around than a white back projection screen. Hybrid approaches such as 2.5D backdrops, where flat images are projected onto approximate geometry (sketched at the end of this piece), can save time. There will always be skills to learn (shooting good plates is an art form of its own), and experienced professionals will concede that most set-ups rely on at least some manual adjustment.

Things may still change. AI promises to do some of the content-generation work, as it has promised so much (and sometimes delivered). What matters, though, is that the range of options which makes ICVFX complex also makes it flexible enough to cover many different scenarios. It’s probably that realisation, as much as any technology, which has made VP so much more approachable.
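As a footnote to the 2.5D backdrops mentioned above, the core trick is simply assigning texture coordinates by projecting the rough geometry through a virtual projector. Here is a hypothetical numpy sketch of that single step, with standard view/projection matrix conventions assumed rather than taken from any particular engine:

    # Hypothetical 2.5D projection-mapping step: give each vertex of the
    # approximate geometry the UV where it lands in the flat plate image.
    import numpy as np

    def projected_uvs(vertices, view, proj):
        """vertices: (N, 3) world-space points; view/proj: the virtual
        projector's 4x4 matrices. Returns (N, 2) UVs in [0, 1]."""
        n = len(vertices)
        homo = np.hstack([vertices, np.ones((n, 1))])  # homogeneous coords
        clip = homo @ (proj @ view).T                  # world -> clip space
        ndc = clip[:, :2] / clip[:, 3:4]               # perspective divide
        return ndc * 0.5 + 0.5                         # NDC [-1,1] -> UV [0,1]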


