Definition, May 2023

PRODUCTION. OUR UNIVERSE

Beyond the stars

A new Netflix series explores the fascinating relationship between Earth and the cosmos

WORDS. Robert Shepherd IMAGES. Netflix & Stephen Cooter

When Morgan Freeman lends his voice to something, it’s sure to be a hit. Having played Red, the narrator of The Shawshank Redemption (1994), he went on to become the voice behind several documentaries and TV series, including Cosmic Voyage (1996), Slavery and the Making of America (2004), March of the Penguins (2005) and Breaking the Taboo (2011). Freeman also hosted and narrated Through the Wormhole from 2010 to 2017. His latest project is Our Universe, a six-part nature documentary made by BBC Studios and Netflix. So, with one of the best-known voices in the business secured, the team was charged with making it visually spectacular.

Paul Silcox, VFX director at Lux Aeterna and VFX supervisor for the series, explains that the central technological challenge was processing huge amounts of space data into the VFX pipeline. The team worked with Durham University’s Institute for Computational Cosmology, processing simulation data to show the formation of Earth’s moon on screen. “It took four weeks to process a massive 30TB of information into the various formats needed for the show,” he describes. “With 100 million points per frame, 1.4 trillion data points were processed for just one scene. We had to change quite a few things about the way we work to manage that pipeline, including turning to the cloud and using AWS to scale up our rendering output. However, the result of upscaling our efforts means we now have a robust and powerful pipeline to work on even larger projects in the future.” To get the desired look, the team worked mostly in SideFX’s Houdini, alongside Maya and Nuke, for the 700 shots it produced for Our Universe.
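
The scale Silcox describes is easier to picture with some quick arithmetic. Below is a minimal Python sketch of that sizing, plus the kind of chunked, per-frame conversion such a pipeline implies; the 32-bit xyz layout, file paths and output format are assumptions for illustration, not details of the Lux Aeterna pipeline.

```python
import numpy as np

# Figures quoted in the article; the byte layout below is an assumption.
POINTS_PER_FRAME = 100_000_000        # "100 million points per frame"
TOTAL_POINTS = 1_400_000_000_000      # "1.4 trillion data points" for one scene
BYTES_PER_POINT = 3 * 4               # assume x, y, z stored as 32-bit floats

frames = TOTAL_POINTS // POINTS_PER_FRAME                  # ~14,000 frames
gb_per_frame = POINTS_PER_FRAME * BYTES_PER_POINT / 1e9    # ~1.2 GB, positions only
print(f"{frames:,} frames, ~{gb_per_frame:.1f} GB per frame (positions only)")


def convert_frame(src_path: str, dst_path: str, chunk: int = 5_000_000) -> None:
    """Stream one frame of raw float32 xyz points into a .npy file in fixed-size
    chunks, so a gigabyte-scale frame never has to sit in memory all at once.
    Illustrative only: a real pipeline would target a Houdini-readable format."""
    pts = np.memmap(src_path, dtype=np.float32, mode="r").reshape(-1, 3)
    out = np.lib.format.open_memmap(
        dst_path, mode="w+", dtype=np.float32, shape=pts.shape
    )
    for start in range(0, pts.shape[0], chunk):
        out[start:start + chunk] = pts[start:start + chunk]
    out.flush()
```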

The VFX team also developed in-house tools to deal with project-specific challenges. “ShotGrid and Deadline managed the render of all our VFX and have become a permanent part of our pipeline,” adds Rob Hifle, Lux Aeterna creative director. “It’s because of the power and flexibility of these tools, combined with the talent of our 25 artists, that we delivered the project on time and budget, keeping open lines of communication with the production companies to ensure everyone was satisfied with the final shots.”
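
For context on what driving Deadline from a script can look like, here is a minimal, hypothetical submission sketch: it queues a Houdini render via deadlinecommand using job-info and plugin-info files. The job name, frame range, ROP path and exact key names are illustrative and may vary with the Deadline version and plugin; none of this is taken from Lux Aeterna’s actual setup.

```python
import pathlib
import subprocess
import tempfile


def submit_houdini_job(hip_file: str, rop_path: str, frames: str) -> None:
    """Queue a Houdini render on a Deadline farm via deadlinecommand.
    Key names follow Deadline's manual job submission format; values here
    are placeholders rather than production settings."""
    tmp = pathlib.Path(tempfile.mkdtemp())

    # Job-level settings: which plugin renders it, what it's called, which frames.
    (tmp / "job_info.job").write_text("\n".join([
        "Plugin=Houdini",
        "Name=example_space_shot",   # placeholder job name
        f"Frames={frames}",          # e.g. "1001-1240"
        "ChunkSize=5",               # frames per task
    ]))

    # Plugin-level settings: the scene file and the output driver (ROP) to render.
    (tmp / "plugin_info.job").write_text("\n".join([
        f"SceneFile={hip_file}",     # e.g. "/jobs/shot_010/shot_010.hip"
        f"OutputDriver={rop_path}",  # e.g. "/out/mantra1"
    ]))

    # deadlinecommand ships with the Deadline client and queues the job on the farm.
    subprocess.run(
        ["deadlinecommand", str(tmp / "job_info.job"), str(tmp / "plugin_info.job")],
        check=True,
    )
```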

Early in development, director and producer Stephen Cooter had extensive conversations with showrunner Mike Davis and Netflix to define the look. “The show is pretty unique – combining natural history with VFX to tell animals’ stories in the context of

“It took four weeks to process 30TB of cosmological data into the various formats needed for the show”

