TECHFEED Game Engines

but you’re still modelling, rigging and animating all your content in a third-party application like Maya, 3ds Max, Blender or Cinema 4D – you bring all that in, and then you basically shade it and light it.”

“Game engines are, as the name implies, toolboxes to build games. They are not turnkey solutions for creating media productions,” explains Brodersen.

“Game engines themselves are fairly cheap,” adds Canning. “It’s the talent and infrastructure that are expensive. If game engines can become accessible to a point where you don’t need a coder, you’ll see a wider user base. But even then, there will probably be levels to real-time content creators. Right now, a lot of people can buy Maya, but does that mean everyone is doing film-level VFX?”

“IF YOU’VE GOT BLENDER AND UNREAL, YOU’VE GOT AN AMAZING FREE SUITE RIGHT THERE”

A VIRTUAL GALAXY FAR, FAR AWAY

Pioneered over a decade ago on films like Avatar and Hugo, virtual production has been revolutionised by bringing game engines into the mix. Dispensing with the sequential pipeline of the past, filmmakers can now move CG cameras, rendered lighting and set assets in real time on a virtual stage, and render out final-frame animations.

DOUZE POINTS The Eurovision Song Contest 2019 used The Future Group’s virtual studio and augmented reality platform, Pixotope, for live broadcast graphics during the competition

“Moving a column in a palace in Unreal Engine is a drag and drop,” says Painting Practice’s Dan May. “Moving a real column would involve thousands of pounds, and lots of people shouting at you.”

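To give a flavour of how lightweight that manipulation is, the same move can also be scripted. The sketch below assumes Unreal’s Python editor scripting plugin is enabled and uses a hypothetical actor label – an illustration of the idea, not anyone’s production setup.

import unreal

# Hypothetical label for a set-dressing piece in the open level.
TARGET_LABEL = "Palace_Column_03"

# Scan the level for the actor, then nudge it -- the scripted
# equivalent of dragging the column to a new mark.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() == TARGET_LABEL:
        location = actor.get_actor_location()
        location.x += 200.0  # slide 200 Unreal units (2m) along X
        actor.set_actor_location(location, False, False)  # no sweep, no teleport
        unreal.log("Moved {} to {}".format(TARGET_LABEL, location))
        break

In a multi-user session, a change like this is seen by every connected operator, which is what makes live set edits during a shoot practical.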
Tools like multi-user editing in Unreal Engine allow potentially dozens of operators to work together on-set during a live shoot. An evolution of this approach was employed on Disney’s The Mandalorian, where UE’s nDisplay multiple-display technology powered the content rendered out on the ‘Volume’ – the name given to the show’s curved-screen virtual set. With real-time parallax, the LED screens blended with the real-world sets, provided set extensions and VFX during production, and even lit the set.

The Disney TV Animation series Baymax Dreams made use of tools in Unity such as the multitrack sequencer Timeline, the Cinemachine suite of smart cameras and the Post-Processing Stack to handle layout, lighting and compositing, along with Unity’s High Definition Render Pipeline (HDRP). These tools are not just being used for previs – the actual rendered frames from Unity are what appear in the final show.

In combination with game engines, consumer items like iPads, mobile phones

THE BEAUTIFUL GAME Mo-Sys and Epic Games developed an Unreal Engine interface to integrate photorealistic augmented reality into live production with almost no latency. The tracking system can be used on any camera, including ultra-long box lenses for sports.
