
TECHFEED Game Engines

industry-standard SDI video input/output support, genlock/synchronisation, time code support, camera tracking parsing and real-time chroma keyer,” adds Brodersen. “Pixotope also has proprietary technology for solving the compositing of graphics and video in a non-destructive and highly performant manner.”

Computer software company Disguise is also moving into this space, trialling integration with Unreal Engine in its latest systems and setting the stage for immersive live visuals from its legions of users.

The essential component in integrating game engines into live mixed-reality experiences is an effective control system. Experimental support in the UE 4.25 release allows connection from Unreal Engine to external controllers and devices that use the DMX protocol. With bidirectional communication and interaction, creatives will be able to control stage shows and lighting fixtures from Unreal, and previs the show in a virtual environment during the design phase (a rough sketch of DMX over the network follows below).

GEARING UP

Many content creation tools are low cost – Blender is free – while sites like the Epic Marketplace, KitBash3D and TurboSquid offer a host of free and low-cost models, assets and environments. “If you were starting out a studio and just wanted to do work, if you’ve got Blender and Unreal, you’ve got an amazing free suite right there,” says Dan May.
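As a rough illustration of the DMX-over-the-network idea mentioned above, the sketch below builds and sends a single Art-Net ArtDMX frame, one common way DMX lighting data travels over IP. It is a minimal sketch, not Unreal's own DMX plugin API, and the host address, universe and channel values are placeholder assumptions.

import socket
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build a minimal ArtDMX packet (Art-Net carries DMX512 data over UDP)."""
    header = b"Art-Net\x00"                    # protocol ID
    opcode = struct.pack("<H", 0x5000)         # OpDmx, little-endian
    protver = struct.pack(">H", 14)            # protocol version 14
    seq_phys = bytes([sequence & 0xFF, 0])     # sequence number, physical port
    sub_net = struct.pack("<H", universe)      # SubUni (low byte) + Net (high byte)
    length = struct.pack(">H", len(channels))  # number of DMX channel values (max 512)
    return header + opcode + protver + seq_phys + sub_net + length + channels

# Illustrative values: a 512-channel frame with channel 1 (a dimmer, say) at full.
frame = bytearray(512)
frame[0] = 255

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The fixture/controller address is a placeholder; 6454 is the standard Art-Net UDP port.
sock.sendto(artdmx_packet(universe=0, channels=bytes(frame)), ("192.168.1.50", 6454))

A lighting console or fixture on the receiving end would read channel one as a level; game-engine DMX support speaks this same kind of channel data, in both directions, which is what makes previsualising a show and driving the real rig from one place possible.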

After using it during the production of His Dark Materials, Painting Practice released the free Plan V visualisation software. Capable of multi-editor control, it allows users to experiment in real time with different lenses, cameras, animations, lighting and more, without extensive knowledge of 3D modelling software.

As well as recent link-ups with Arri, Mo-Sys and Ncam to support commercial virtual productions, On-Set Facilities (OSF) offers StormCloud, a VPN that allows production crews and talent to come together on virtual sets located in the cloud. OSF also recently joined the SRT Alliance to employ SRT, the open-source protocol for transporting low-latency, timecoded video from any location into Unreal Engine (a rough illustration appears below).

“We are approaching a time where the interactive and immersive capabilities of the game world will merge with the fidelity of the video/pre-rendered world delivered through thin clients, no matter where we are,” says Canning. “The combination of edge computing, 5G connectivity and real-time engines will make for a powerful and creative ecosystem.”

And all this virtual production technology might just become indispensable in the new era of social distancing and remote production. “With powerful tools and software in the cloud, and an enhanced ability for artists and technicians to collaborate around the world, it’s going to be even easier to create amazing content as time goes on,” concludes Canning.
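For readers curious what the SRT link mentioned above looks like in practice, here is a minimal, hedged sketch that pushes a local clip as a low-latency MPEG-TS stream over SRT using ffmpeg, driven from Python. It assumes an ffmpeg build compiled with libsrt; the file name, host and port are placeholders, and this is not OSF's actual StormCloud pipeline.

import subprocess

# Push a local file as an MPEG-TS stream over SRT to a remote listener.
# Assumes ffmpeg was built with libsrt; source clip, host and port are placeholders.
subprocess.run([
    "ffmpeg",
    "-re",                      # read the input at its native frame rate
    "-i", "camera_feed.mov",    # placeholder source clip
    "-c:v", "libx264",
    "-preset", "ultrafast",
    "-tune", "zerolatency",
    "-f", "mpegts",
    "srt://render-node.example.com:9000?mode=caller",
], check=True)

Any SRT listener on the far end, whether another ffmpeg process, a media server or an engine-side ingest tool, can then pick up the stream from port 9000.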


GAME THEORY: Unreal Engine offers a specialised Cinematic Viewport, which gives users the ability to preview cinematics in real time.

