DEFINITION March 2018


GAME ENGINES FEATURE

BELOW ADAM and other droids are released into the world. See the films on the Oats Studio website. RIGHT Everything in ADAM is digitised, including locations and motion.

Clothing was videoed and photographed for reference, and the actual physical patterns were sourced so the garments could be recreated and later simulated in digital tailoring software. Actors were scanned and their likenesses recreated so that, in effect, they would later ‘play’ digital versions of themselves in the films. They would also be motion captured for their body performances, and facial performance capture was done as well, so that their exact performances would be preserved in the final piece. On the motion capture stage we loaded up the now digitised real-world physical locations. We could then look through the monitor with a virtual camera and see the real location. This allowed us to recreate rises and falls in the ground, obstacles and props, so that the actors could directly interact with the virtual environment as if they were at the real physical location. We could choose our camera angles and ultimately film them just as we would on a physical set, even though it was all virtual and only seen through the monitor. The motion capture cameras would capture the movement of the virtual camera; it can switch lenses, zoom, dolly and pan just like any physical camera, and so felt very familiar to us. Essentially it allowed us to film as we would any other regular project. The one key
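The virtual-camera workflow described here — a tracked physical rig driving a camera that only exists in the engine, yet still supports lens changes, zoom and dolly moves — can be sketched roughly as follows. This is a minimal Python sketch with entirely hypothetical names, not any actual mocap or Unity API:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    # Pose streamed from the motion-capture volume (hypothetical units: metres, degrees).
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    pan: float = 0.0            # rotation about the vertical axis
    focal_length: float = 35.0  # the current virtual "lens", in mm

    def apply_tracked_pose(self, pose):
        """Copy the physical rig's tracked position and pan onto the virtual camera."""
        self.x, self.y, self.z, self.pan = pose

    def switch_lens(self, focal_mm):
        """Swap to a different virtual prime lens, e.g. 35 mm to 85 mm."""
        self.focal_length = focal_mm

    def zoom(self, factor):
        """Continuous zoom: scale the current focal length."""
        self.focal_length *= factor

    def dolly(self, dx, dy, dz):
        """Offset the camera on top of the tracked pose."""
        self.x += dx
        self.y += dy
        self.z += dz

# Operator walks the stage; each frame the tracked pose updates the virtual camera.
cam = VirtualCamera()
cam.apply_tracked_pose((1.0, 1.7, -3.0, 15.0))
cam.switch_lens(85.0)
cam.zoom(1.2)
```

The point of the sketch is only the shape of the workflow: the operator handles a real object, while lens and framing decisions happen purely in software.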

these things game engines. It’s clear they are no longer just for games. Secondly, a huge stumbling block to using them in other fields was how to get data into them. Great steps have been taken over the last year to remove that barrier, allowing artists to work in ways that are familiar to them and then transport their work into the engine via familiar tools, Alembic being one example. Also, within the engines themselves, features like Timeline in the case of Unity allow the engine to act essentially like a non-linear editor, except it’s not managing flattened video clips, it’s managing whole 3D scene files and worlds.

Def: Explain in detail the processes in play. Digital capture is followed by what tools to create lifelike drama? Is the process agnostic to the creative software that is available?

CH: To go into great detail would take far too long, so I will try to move through it quickly without missing any key components. And yes, in theory it’s software agnostic: one could use whatever software they prefer for the various individual steps (so long as it’s what the software is intended for). I guess our ADAM short films (the original was created by the Unity demo team in Europe) began with a very traditional and practical approach to film: scripts and concept

art, location scouting and costume design. All of these were approached in a typical physical way; it’s what we know and it works, so why change it. In fact, we thought that coming at it from this direction might help give the all-CG animated end result a more ‘grounded’ feeling. Each of these physical components would then go through its own ‘digitisation’ process. Locations were photographed tens of thousands of times, and these photos were translated (through a technique called photogrammetry) into exact, detailed digital replicas. Concept art was handed to digital artists to recreate. Props were sourced by our production designer and more photogrammetry photo shoots were completed, translating these props into digital versions. We would then ‘set-dress’ the virtual location with the now-virtual props.
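At the heart of the photogrammetry step described above is triangulation: the same surface point, seen in two overlapping photos, defines two viewing rays, and the point lies where those rays (nearly) meet. Below is a minimal pure-Python sketch of midpoint triangulation for two rays; a real photogrammetry pipeline additionally solves for the camera poses, matches features across thousands of photos, and meshes millions of such points:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate_midpoint(p1, d1, p2, d2):
    """Closest-point ('midpoint') triangulation of two viewing rays.

    p1, p2: camera centres; d1, d2: ray directions through the matched pixel.
    Returns the midpoint of the shortest segment between the two rays.
    """
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b              # zero only if the rays are parallel
    t = (b * e - c * d) / denom        # parameter along ray 1
    s = (a * e - b * d) / denom        # parameter along ray 2
    q1 = tuple(p + t * x for p, x in zip(p1, d1))
    q2 = tuple(p + s * x for p, x in zip(p2, d2))
    return tuple((u + v) / 2 for u, v in zip(q1, q2))

# Two photos of the same surface point, taken from camera centres 2 m apart:
point = triangulate_midpoint((0, 0, 0), (1, 1, 5), (2, 0, 0), (-1, 1, 5))
```

Here both rays pass exactly through (1, 1, 5), so the recovered point is exact; with real, noisy pixel matches the midpoint splits the small gap between near-missing rays.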
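The Timeline idea mentioned earlier — a sequencer whose clips reference whole live 3D scenes rather than flattened video — can be illustrated with a toy model. This is a hypothetical Python sketch of the concept, not Unity's actual Timeline API:

```python
from dataclasses import dataclass

@dataclass
class SceneClip:
    """A timeline clip that references a whole 3D scene file, not rendered video."""
    scene_file: str   # e.g. "exterior_canyon.scene" (hypothetical asset name)
    start: float      # timeline start, in seconds
    duration: float

class Timeline:
    def __init__(self, clips):
        self.clips = sorted(clips, key=lambda c: c.start)

    def active_clips(self, t):
        """Which live 3D scenes are being evaluated at time t.

        Because clips are full scenes, overlapping clips can be blended or
        layered, and any camera angle can still be chosen inside them —
        unlike flattened video, nothing is baked."""
        return [c.scene_file for c in self.clips
                if c.start <= t < c.start + c.duration]

tl = Timeline([
    SceneClip("exterior_canyon.scene", 0.0, 10.0),
    SceneClip("adam_closeup.scene", 8.0, 5.0),   # overlaps for a transition
])
```

The design point is that editing decisions stay non-destructive: re-timing a clip re-evaluates the scene rather than re-rendering footage.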


