WORKFLOW: VIRTUAL PRODUCTION
“The other name we have for the LED volume technique is in-camera compositing, which follows a pretty traditional VFX approach,” Griffith explains. “The layers of an Unreal-built world need to render to give parallax. On the virtual production stage, you’re still composing through the view of a camera’s lens, which is exactly what you’d do in post. Now, you’re just compositing in the very moment you shoot, rather than afterwards.

“When shooting a green screen with a 50mm lens, I would take it into a visual effects platform, recreate the view of that lens, then create the backgrounds using that view. We still do that when working with a volume, just, again, ahead of time.

“But in a traditional workflow, you’d spend a lot of time rendering the image. You’d have to emulate the camera and lens that were used; set up the digital environment; render it and give it to a compositor; put it through a compositing package; then render that out as an image. It goes through several different processes to get you to a final image. Instead, when you’ve done all that ahead of time and are rendering through Unreal, it just happens immediately. The image layering process is the same, but the time to product is immediate.”

HIGHER RESOLUTION
To wrap a shoot day with what may well be an all-but-finalised image is of great benefit, all the way down the production chain. The marketing department has footage to feed into a trailer or other promotional materials. Editors can work with a clear idea of the shot’s full composition. Talent can perform with a real reference on-screen, rather than in an empty green space. And directors see their vision brought to life immediately.
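The “image layering” Griffith describes is, at its heart, the standard over operation that compositors have always used; the volume simply performs it live. A minimal sketch of layering a foreground element over a background plate, per pixel (the function name and values here are illustrative, not any studio’s actual pipeline):

```python
# A minimal sketch of the "over" compositing operation underlying image
# layering: a foreground element (with an alpha matte) placed over a
# background plate. Pixels are floats in [0, 1]; names are illustrative only.

def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite one foreground pixel over one background pixel."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha) for f, b in zip(fg_rgb, bg_rgb))

# Example: a half-transparent red foreground over a blue background.
result = over((1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0))
print(result)  # (0.5, 0.0, 0.5)
```

On a green-screen shoot this blend happens in post, frame by frame; on an LED volume the background is already on the wall, so the camera performs the same layering optically at the moment of capture.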
WHERE THE MAGIC HAPPENS
Garden Studios’ virtual production volume comprises a 12x4m back wall with ceiling, but this can transform into an almost fully encompassing space, with wall extensions and LED panel totems.
The pre-production VFX work itself begins with pre-visualisation, which can serve as a beneficial step in a traditional workflow. Here, however, it’s essentially mandatory. “The director’s vision is essential, to know what we’re going to create for the set,” Griffith says. “Pre-visualisation, tech-visualisation and virtual location scouting all assist greatly. If you’re doing these stages in Unreal, the other advantage of this whole process is that we’re just taking those assets and making them higher and higher resolution. We start with a low-resolution image or asset, then add more detail, and by the time we finally get on-set, it’s just an upgraded version of the tech-vis and pre-vis work. If those assets need to be used in post-production VFX, we upgrade a little more and pass on to the next department.”

However, virtual production isn’t all about Unreal 3D builds. This exciting opportunity is widely publicised, due to its groundbreaking potential. But more photorealistic and slightly less demanding approaches may well offer the finest results. “We can apply a 2D environment to the volume, which is useful for any static camera shot where parallax isn’t needed,” Griffith states. “Then, there’s another process called 2.5D, in which we can take existing plate photography, or just non-moving still imagery, and – through photogrammetry – project it onto rough geometry. As long as the camera move is not too big, that dimensionalised plate will have enough parallax on-set to make it look like it was a full 3D build, without actually doing the months of work Unreal demands.

“Resolution also needs to be considered. In some cases, we’ve had to send 24 4K outputs to an LED wall. Your normal visual effects shot is just one 4K or 8K scene, but we have to do that 24 times over and render immediately. The amount of computing power required is massive. It’s hundreds of times more than you’d normally need.”

As such, highly photorealistic VFX scenes we’d usually see in the post-production space are, for now, virtually off limits. “Typical visual effects platforms can still render to a higher level of detail than what we can do in Unreal and virtual production. Video-game engines weren’t designed to do photorealistic 3D rendering. That’s the goal, but the hardware and software isn’t quite there yet. We’re close, and depth-of-field helps, but 2.5D is based off photographic images, which is why many find it the most successful use of the LED volume.”
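The scale Griffith mentions (24 separate 4K outputs, rendered in real time) can be put in rough numbers. A back-of-envelope sketch, assuming UHD 3840x2160 frames at 24 fps; these specific figures are our assumptions, not quoted from the interview:

```python
# Back-of-envelope pixel throughput for the scenario described: 24 separate
# 4K outputs rendered in real time, versus one offline 4K frame in a
# traditional VFX workflow. Frame size and rate are assumptions (UHD, 24 fps).

WIDTH, HEIGHT = 3840, 2160          # UHD "4K" frame
OUTPUTS = 24                        # simultaneous outputs to the LED wall
FPS = 24                            # real-time playback rate

pixels_per_frame = WIDTH * HEIGHT
wall_pixels_per_second = pixels_per_frame * OUTPUTS * FPS

print(f"{pixels_per_frame:,} pixels per 4K frame")
print(f"{wall_pixels_per_second:,} pixels/second across the wall")
# A traditional shot renders that single frame offline, over minutes or hours;
# here the same pixel count must be produced 24 times over, 24 times a second.
```

Under these assumptions the wall consumes roughly 4.8 billion rendered pixels every second, which is why Griffith describes the computing demand as hundreds of times beyond a conventional VFX render.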
MIND-BLOWING TECH
While static images with limited parallax can be spread across the entire volume at all times, much more demanding 3D worlds will often be displayed only when inside the camera’s frustum.
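The frustum test the caption refers to is conceptually simple: heavy 3D content is rendered only for the parts of the wall the camera can actually see. A toy sketch of such a visibility check, reduced to a single horizontal field-of-view angle in 2D; this is purely illustrative, not how any particular volume’s software is implemented:

```python
# A toy sketch of frustum-based culling: content is "rendered" only when it
# falls inside the camera's field of view. Simplified to a 2D horizontal FOV
# check; real volume software works with a full 3D camera frustum.
import math

def in_frustum(cam_pos, cam_dir_deg, fov_deg, point):
    """True if `point` lies within the camera's horizontal field of view."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    angle_to_point = math.degrees(math.atan2(dy, dx))
    # Wrap the angular difference into [-180, 180] before comparing.
    diff = (angle_to_point - cam_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

# Camera at the origin looking along +x with a 40-degree horizontal FOV.
print(in_frustum((0, 0), 0.0, 40.0, (10, 1)))   # True: nearly straight ahead
print(in_frustum((0, 0), 0.0, 40.0, (0, 10)))   # False: 90 degrees off-axis
```

Panels outside the frustum can fall back to cheap static imagery, which is what lets a volume show a demanding 3D world without rendering it across the entire wall at once.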
63. MAY 2022