
FEATURE GAMES ENGINES

IMAGE The ADAM series of films has nothing to do with games; it is filmmaking.

to break it down, we started with casting them, just like in any production. Then we cyberscanned our actors using a process called photogrammetry. Essentially, computer software analyses tens to hundreds of photos and, using trigonometry, reconstructs an accurate 3D volume of the actor as a point cloud. Our artists then take that, along with many more reference photos, and construct the final digital replica of the actor, adding finer detail and eventually colour and texture. At the same time, we had a traditional costume designer designing and creating practical clothing for the cast. We would later use these practical costumes to derive high-resolution fabric scans and video reference for how the various textiles would move. Our digital tailor would then recreate these physical costumes in the computer, making sure they were a match, even down to the physical simulations. Then our look development artists would set up the shading networks so that the models respond to light correctly. At this point we have digital versions of the actors, in costumes, that essentially look real, but they don't move, and that's where the next set of people comes in. Our rigger creates a digital skeleton and puppeteering handles for the animation team to use when they work with the motion.
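Photogrammetry packages wrap this up with camera calibration, feature matching and bundle adjustment, but the trigonometric core Harvey describes can be shown in a few lines: given two calibrated views of the same feature, intersect the viewing rays to recover a 3D point. The sketch below uses OpenCV's two-view triangulation; the camera matrices and matched pixel coordinates are invented stand-ins for what a real solve would estimate automatically from tens to hundreds of photos.

```python
import numpy as np
import cv2

# Hypothetical shared intrinsics for both cameras (focal length and
# principal point, in pixels). A real photogrammetry solve estimates
# these from the photos themselves.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# 3x4 projection matrices: first camera at the origin, second camera
# translated 0.5 units sideways (an invented stereo baseline).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Matched 2D feature positions (pixels) of two points seen in both
# photos, as 2xN arrays: row 0 holds x coordinates, row 1 holds y.
# In practice these come from automatic feature detection and matching.
pts1 = np.array([[640.0, 890.0],
                 [360.0, 485.0]])
pts2 = np.array([[540.0, 765.0],
                 [360.0, 485.0]])

# Triangulate: returns 4xN homogeneous coordinates.
pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
cloud = (pts4d[:3] / pts4d[3]).T   # Nx3 Euclidean point cloud

print(cloud)  # recovers approximately (0, 0, 5) and (1, 0.5, 4)
```

Repeated across every matched feature in every photo pair, this is how the dense point cloud of the actor emerges.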
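The digital tailoring Harvey mentions relies on production cloth solvers well beyond the scope of a sketch, but the principle of matching scanned fabric reference comes down to tuning a particle-and-constraint simulation until it moves like the real textile on video. Purely as an illustration, and assuming nothing about the studio's actual solver, here is a toy mass-spring cloth with invented parameters:

```python
import numpy as np

# Toy cloth: a grid of particles joined by distance constraints and
# integrated with Verlet steps. Grid size, rest length and iteration
# count are illustrative knobs; a digital tailor tunes parameters like
# these against scanned fabric reference until the motion matches.
W, H = 10, 10                      # grid resolution
REST = 0.1                         # rest length between neighbours (m)
GRAVITY = np.array([0.0, -9.81, 0.0])
DT = 1.0 / 60.0
ITERS = 10                         # constraint relaxation passes per step

def idx(x, y):
    return y * W + x

# Particles start as a flat hanging sheet; the top row is pinned,
# like a swatch hung up for reference filming.
pos = np.array([[x * REST, -y * REST, 0.0] for y in range(H) for x in range(W)])
prev = pos.copy()
pins = {idx(x, 0): pos[idx(x, 0)].copy() for x in range(W)}

# Structural constraints between horizontal and vertical neighbours.
constraints = [(idx(x, y), idx(x + 1, y)) for y in range(H) for x in range(W - 1)]
constraints += [(idx(x, y), idx(x, y + 1)) for y in range(H - 1) for x in range(W)]

def step():
    global pos, prev
    # Verlet integration: next = 2*pos - prev + a*dt^2
    nxt = 2 * pos - prev + GRAVITY * DT * DT
    prev, pos = pos, nxt
    # Relax constraints so neighbours stay ~REST apart, re-pinning
    # the fixed particles after each pass.
    for _ in range(ITERS):
        for a, b in constraints:
            d = pos[b] - pos[a]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue
            corr = d * ((dist - REST) / dist) * 0.5
            pos[a] += corr
            pos[b] -= corr
        for i, p in pins.items():
            pos[i] = p

for _ in range(60):               # simulate one second
    step()
print(pos[idx(W // 2, H - 1)])    # bottom-centre particle has sagged
```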

Yes, we use motion capture, but it still requires an animator's eyes and hands to finalise the performance. And of course we also use facial performance capture, very similar to the photogrammetry described earlier, except that it's happening 60 times a second, capturing all the subtle performance nuances in the actors' faces. With a lot of clever work from our rigger again, these captured performances are grafted onto the final animation of the body. All of this work is pretty standard in high-end visual effects; the major difference is that once it's all done we cache it to a format called Alembic, which allows the high detail to be streamed into the real-time engine for final shot lighting.
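Alembic, the cache format named here, is an open interchange format whose key property is that it stores geometry as time-sampled data rather than live rigs and simulations, which is what makes it cheap to stream into a real-time engine for lighting. Its real API is a C++/Python library; the sketch below is only a conceptual stand-in (every name in it is invented) showing the bake-then-stream idea with plain NumPy archives:

```python
import numpy as np

FPS = 60  # the capture rate mentioned above

def bake_cache(path, frames):
    """Bake a list of per-frame vertex arrays (each Nx3) to one file.

    Conceptually this is what an Alembic export does: rig, simulation
    and capture data all collapse into plain sampled geometry, so
    playback needs no solver at all.
    """
    np.savez_compressed(path, **{f"f{i:05d}": v for i, v in enumerate(frames)})

def stream_cache(path):
    """Yield vertex arrays one frame at a time, as an engine would."""
    data = np.load(path)
    for key in sorted(data.files):
        yield data[key]

# Hypothetical baked performance: two seconds at 60 fps of a
# 1000-vertex mesh bobbing along one axis.
verts = np.random.rand(1000, 3)
frames = [verts + [0.0, 0.01 * np.sin(2 * np.pi * i / FPS), 0.0]
          for i in range(120)]
bake_cache("performance.npz", frames)

for frame_verts in stream_cache("performance.npz"):
    pass  # a real-time engine would upload these to the GPU each frame
```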

Def: How different is it making a movie to making visuals for, say, a video game? How much will Unity have to change to help traditional production be more cinematic?

CH: Well, I make movies, not games, so I might not be the best person to ask, but they are fundamentally different. There are many shared skills and techniques, but they are different beasts. How much will Unity have to change? I guess that depends on what you want to do with it! I mean, you can use it now, as it stands, for all-CG film production: we did. Obviously, there are lots of things I would like to see advanced and pushed further, both in workflows and simply in real-time rendering capabilities. The rendering of hair, for example, is problematic in real-time engines currently, but many people are working on solving that issue. But I should point out that there are lots of possible applications for real-time engines like Unity in film production: VR, AR, and even as a tool for live-action VFX-driven films, whether for pre- and post-viz or on set as virtual cameras or virtual characters, bringing the virtual into the physical production world. These are very exciting times.

Def: Do you have any other information that might help our readers who work in film production understand what is happening here?

CH: If you're really interested in the use of real-time engines in production: do your research. There is so much information available out there now, even though it's still new. Then start experimenting with it. Check out the Made with Unity blogs; they have lots of practical case studies there. Also take a look at companies like Magnopus or Digital Monarch Media and see how they are using the technology in other ways.

OUR DIGITAL TAILOR WOULD RECREATE PHYSICAL COSTUMES IN THE COMPUTER

ABOVE Chris Harvey, VFX Supervisor, Oats Studios.

https://oatsstudios.com
