
TECHFEED Production in the Cloud

Underpinning the entire HPA exercise was the concept of the ‘content lake’, where content resides in one centralised location for collaborators to access using any number of applications. The content lake architecture makes it easy to locate all of your assets, since they are all in one place, and to enforce security standards, with encryption at rest and in transit. You can also leverage local zones to access more bandwidth and lower-latency compute resources. But one of the most important elements is the ability for anyone, anywhere, to collaborate on content. This was shown off in the Lederhosen project, with footage captured at the HPA hotel location and then immediately colour corrected in a Technicolor office in Vancouver.

In filmmaking, technology and content have always been inextricably linked. The universal access provided by cloud workflows will ultimately start to shape the content itself. Zell sees all this connectivity and instant feedback leading to a kind of near-live filmmaking, where tasks usually relegated to post-production become part of the on-set activity.

“Sam Nicholson, the CEO of Stargate Studios, said to me, ‘We have to stop saying we’ll fix it in post. It’s a very lazy attitude towards the industry. We have to get it right in capture’,” Zell recalls. “By that he meant, we have to set the video formats in which we’re going to be capturing, the frame rates and resolutions, and set the time code between the audio capture and the video capture, and figure out how we’re going to synchronise all of these events in the metadata, to bring them forward into a production environment where we can automate some of these processes.

“We can then allow creators to be creative throughout the entire process because we’ve done the exercise up front, as opposed to stitching the content and audio together downstream.”
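As an illustration of the kind of up-front capture metadata Zell describes, the minimal sketch below (in Python, with hypothetical field names rather than any production schema) shows how an agreed format, frame rate and matching timecodes might be written into a sidecar file that travels with each clip into the content lake, so that picture and sound can be synchronised automatically downstream.

from dataclasses import dataclass, asdict
import json

@dataclass
class CaptureRecord:
    # Illustrative per-clip metadata captured on set (hypothetical schema).
    clip_id: str
    video_format: str      # codec/container agreed before the shoot
    resolution: str
    frame_rate: float
    video_timecode: str    # SMPTE timecode of the first video frame
    audio_timecode: str    # SMPTE timecode of the first audio sample block

def write_sidecar(record: CaptureRecord, path: str) -> None:
    # A JSON sidecar accompanies the media into the content lake,
    # so sync and conform can be automated rather than stitched later.
    with open(path, "w") as f:
        json.dump(asdict(record), f, indent=2)

write_sidecar(
    CaptureRecord("A003_C012", "ProRes 4444", "3840x2160", 23.976,
                  "01:02:03:10", "01:02:03:10"),
    "A003_C012.json",
)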

For The Lost Lederhosen, Stargate Studios’ on-set VFX system ThruView was set up on the presentation stage to create a scene that simulated being on a moving train. The three-camera shoot integrated a previously photographed moving plate that was tied together in the Unreal Engine, eliminating the need for a green screen process.

EVEN LIGHT BULBS HAVE DATA

Jack Wenzinger is a solutions architect in the M&E vertical for AWS who helps post-production facilities migrate their assets and teams into the cloud, and was also a contributor to the HPA Lederhosen project. He believes that the cloud is ready to go, at least in the Los Angeles post world. “The AWS LA Local Zone enables artists to scale up to thousands, if not hundreds of thousands, of GPUs to support their rendering projects,” he says.

But even Wenzinger was surprised at the ubiquity of useful data in a cloud-connected infrastructure. “In the content industry, we’re always talking about how we want to capture metadata because it’s an invaluable source of information, enabling people to search, find and manage content easily and quickly,” he says. “But I didn’t realise before we started working on The Lost Lederhosen that lighting and lenses also have metadata. I did not expect a light bulb to have data. It blew my mind. If you’re able to capture all of these aspects and bring them together, you’re then able to virtualise that scene a lot faster than before… Game engines will be a big part of production in the cloud.”
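To make that idea concrete, here is a minimal sketch (hypothetical values and field names, not data from the Lederhosen pipeline) of merging per-frame camera, lens and lighting metadata into a single record that a game-engine importer could use to rebuild the virtual set.

# Hypothetical per-frame metadata from three on-set sources.
camera = {"frame": 1001, "timecode": "01:02:03:10", "iso": 800}
lens = {"frame": 1001, "focal_length_mm": 35.0, "focus_distance_m": 2.4}
lighting = {"frame": 1001, "fixture": "key_1", "intensity_pct": 75, "cct_k": 5600}

def merge_frame(*sources):
    # Combine all sources for one frame into a single scene record,
    # which a game-engine importer could consume to virtualise the scene.
    merged = {}
    for src in sources:
        merged.update(src)
    return merged

scene_frame = merge_frame(camera, lens, lighting)
print(scene_frame)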


