VIRTUAL PRODUCTION WATER SCENES
DEEPER DIVE: SHARK ATTACK 360, SEASON 2

Diving into the factual landscape of shark behaviour, the second instalment of Disney+ show When Sharks Attack 360 investigates why sharks bite humans. As the evidence mounts, the international team of experts analyse data in a VFX shark lab to understand in forensic detail why sharks attack.

For the docuseries, animation studio Little Shadow developed a hybrid VP workflow. Instead of using a traditional LED volume, it used a mix of custom-built systems and off-the-shelf tools to facilitate live green-screen keying, camera tracking and live monitoring.

“We blended live-action footage with CGI, allowing us to transform an underground theatre in Islington into a virtual 360 shark lab,” explains Simon Percy, director at Little Shadow. “Initially, we used a LiDAR scan to create a 3D model of the venue, which we then employed to plan and scale the project. Due to the venue’s layout and lack of soundproofing, we ran a 4K signal across 110m and four floors using BNC cable, which allowed us to keep most of the equipment separate from the set.”

The creation and integration of CGI assets, such as the shark models and virtual marine environments, were key to building the immersive underwater settings, which were then played back on-set using the VP box, providing immediate visual feedback.

Percy continues: “We built the flight case around a custom PC for Unreal Engine, a pair of Blackmagic HyperDeck
Studio recorders, the Ultimatte 12 4K for keying and a Videohub 20x20 for signal management. We also frame-synced our cameras to Unreal using the DeckLink 4K Pro. This approach proved both mobile and flexible, ensuring quick playback with real-time asset generation and comping adjustments on shoot days.”

A private Wi-Fi network connected the flight case to an on-set laptop, allowing the team to control it remotely, including live switching via an ATEM 1 M/E Constellation 4K.
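For readers curious about how the signal management side of such a flight case can be automated, the following is a minimal Python sketch assuming Blackmagic's plain-text Videohub Ethernet Protocol on TCP port 9990. The IP address and port indices are hypothetical placeholders, and Little Shadow may well have driven its Videohub 20x20 from the control software instead.

import socket

VIDEOHUB_IP = "192.168.10.50"   # hypothetical address of the Videohub 20x20
VIDEOHUB_PORT = 9990            # standard Videohub Ethernet Protocol port

def set_route(output_idx: int, input_idx: int) -> None:
    """Route one input to one output using the text-based Videohub protocol.

    Indices are zero-based, so physical port 1 is index 0.
    """
    with socket.create_connection((VIDEOHUB_IP, VIDEOHUB_PORT), timeout=5) as sock:
        # The hub dumps its current state on connect; a production tool would
        # parse it fully, but this sketch simply reads and discards one chunk.
        sock.recv(65536)
        # A routing block is terminated by a blank line.
        command = f"VIDEO OUTPUT ROUTING:\n{output_idx} {input_idx}\n\n"
        sock.sendall(command.encode("ascii"))
        reply = sock.recv(1024).decode("ascii")
        if not reply.startswith("ACK"):
            raise RuntimeError(f"Videohub rejected route: {reply!r}")

if __name__ == "__main__":
    # Example only: send camera 1 (input index 0) to output 4 (index 3)
    set_route(3, 0)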
“To bring the underwater scenes to life, we used a green screen and the Ultimatte, which allowed us to integrate the virtual 3D elements into the scenes using AR. This enabled the presenter to have a precise real-time interaction with the sharks. In combination with DaVinci Resolve’s AI functions such as Magic Mask for rotoscoping, we were able to blur the lines of where real and virtual production meet,” Percy adds.

Looking ahead, technological advancements in water and fluid physics simulations are moving quickly. “With the advent of powerful RTX GPUs from NVIDIA and tools like JangaFX’s Elemental suite, we can now simulate water dynamics in closer to real time – a process that would have previously taken days to complete. Blender’s capabilities for large ocean simulations, augmented by plug-ins like Physical Open Waters, hint at the possibilities for increasingly realistic and cost-effective water effects in the future of television production.”
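As a rough illustration of the native starting point Percy alludes to, here is a minimal Python sketch using Blender's built-in Ocean modifier through the bpy API. The Physical Open Waters plug-in itself is not shown, and every parameter value below is an arbitrary placeholder rather than anything used on the show.

import bpy

# Start from a simple plane; in GENERATE mode the Ocean modifier
# replaces its geometry with a simulated ocean grid.
bpy.ops.mesh.primitive_plane_add(size=100.0)
surface = bpy.context.active_object
surface.name = "OceanSurface"

# Add the Ocean modifier and dial in large, choppy open water.
ocean = surface.modifiers.new(name="Ocean", type='OCEAN')
ocean.geometry_mode = 'GENERATE'   # generate the grid from the modifier itself
ocean.resolution = 16              # render resolution; higher means finer waves
ocean.wave_scale = 2.0             # overall wave height
ocean.choppiness = 1.5             # sharper wave crests
ocean.wind_velocity = 12.0         # wind speed driving the wave spectrum
ocean.use_foam = True              # bake a foam mask for whitecaps
ocean.foam_coverage = 0.1

# Animate the surface by keyframing the modifier's time value across the shot.
scene = bpy.context.scene
for frame in (scene.frame_start, scene.frame_end):
    scene.frame_set(frame)
    ocean.time = frame / scene.render.fps
    ocean.keyframe_insert(data_path="time")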
MAKE A SPLASH Blending live-action footage with CGI for the virtual 360 shark lab allowed the presenter to interact with the sharks in real time