
SHOOT STORY | WELCOME TO MARWEN

FRAMESTORE’S MARWEN

Framestore delivered one of the film’s key sequences, with VFX supervisor Romain Arnoux overseeing close to 100 shots. Kevin Baillie, the film’s overall VFX supervisor, engaged Framestore to create the film’s first scene, delivering a dramatic plane crash and bringing Marwen’s inhabitants to life. “I was seduced, because I knew there was no pipeline that would do this. I had never seen a project like this before,” says Romain.

Framestore’s pipeline sent animation files straight to lighting, where artists extrapolated from the setups used on the motion capture stage to create credible exterior lighting that matched the live action. Shots then went to compositing, where the team performed de-aging and completed the integration of the live-action and CG components. The pipeline was also tweaked to allow back-and-forth between the tracking and compositing departments, ensuring high-quality tracking.

Framestore used advanced motion capture technology to turn the film’s stars into realistic-looking dolls. At the beginning of the film, Zemeckis wants the audience to be tricked – he doesn’t want them to think the dolls are actually being filmed, or that the story is taking place in a doll-like world. To create the perfect hybrid, Framestore’s artists projected 75% of the actors’ faces onto their respective dolls. As soon as the plane crashes, the audience is introduced to a fully doll-like world, where only Steve Carell’s mouth, eyes and part of the chin are preserved to achieve the desired plasticised look.

“Because we only had a texture from one camera point of view to project on the doll, we had to be perfectly aligned,” Arnoux noted. “There was always some slight variation between the doll’s face and the actor’s face. We used custom tools to calculate the disparities and realign the face, but it wasn’t easy – most of the time, the tracking team had to do it by eye.”
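The calculate-the-disparities-and-realign step Arnoux describes can be pictured with a small sketch. This is not Framestore’s actual tool – just a minimal illustration, assuming matching 2D facial landmarks have already been tracked on both the actor plate and the doll render, of how a least-squares similarity transform (Umeyama’s method) would snap a drifted face projection back into alignment:

```python
# A minimal sketch of the realignment idea, not Framestore's tool:
# given corresponding 2D landmarks on the actor plate and the doll
# render, estimate the similarity transform (scale, rotation,
# translation) that realigns the projected face texture.
import numpy as np

def estimate_similarity(src: np.ndarray, dst: np.ndarray):
    """Least-squares similarity transform mapping src -> dst.

    src, dst: (N, 2) arrays of corresponding landmark positions.
    Returns (s, R, t) such that dst ~= s * R @ src + t.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)           # cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(2)
    if np.linalg.det(U @ Vt) < 0:              # guard against reflections
        S[1, 1] = -1
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Hypothetical landmark positions (pixels): actor plate vs. doll render.
actor = np.array([[320, 240], [380, 242], [350, 300], [352, 360]], float)
doll  = actor * 1.02 + np.array([4.0, -3.0])   # slightly drifted projection
s, R, t = estimate_similarity(actor, doll)
realigned = (s * (R @ actor.T)).T + t          # where to re-project the texture
print("max residual (px):", np.abs(realigned - doll).max())
```

In practice, as Arnoux notes, the residual disparities were rarely this well behaved, which is why the tracking team still finished the job by eye.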

which would mean the actors acting the scenes while multiple cameras shot all the different angles. But what if we did motion capture, and simultaneously lit and photographed their faces? That’s what we tested, and it seemed to work.”

LIGHTING

They tested two ways of lighting: flat lighting, which would let VFX create whatever lighting direction they wanted, and more cinematic lighting, as though the characters were in situ. “The tests proved that lighting them as though they were in situ was the much better way to go; it was much less artificial. So, fast-forwarding to the days that we shot the motion capture, we physically shot their faces and created shots with real cameras within the motion capture space, while the motion capture equipment recorded the movements.

“The way we got around the lighting situation on their faces was to very heavily prep the motion capture. Before we even started shooting the movie, Bob and I would spend weekends with the visual department to create all our doll environments in the Unreal video game engine. They had these 3D environments with 3D characters and 3D representations of the camera, which was mathematically matched to an ARRI Alexa 65 in terms of sensor size and focal lengths. Bob and I would go into these virtual rooms and look at monitors while we moved this virtual camera around the space and created the shots – at least the general blockings and how we wanted to figure out each scene. What that did was give me enough information to say: if the scene was going to play out like this, then we would want to put our sun on this side of the set and put our fill on this side of the set.
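The “mathematical match” to the Alexa 65 comes down to deriving the virtual camera’s angle of view from the real sensor dimensions and lens focal length. A back-of-the-envelope sketch, using the published Alexa 65 open-gate sensor size (treat the figures as illustrative rather than a statement of the production’s exact settings):

```python
# A virtual camera behaves like a physical one when its field of view
# is derived from the real sensor size and lens focal length.
import math

SENSOR_W_MM, SENSOR_H_MM = 54.12, 25.58   # ARRI Alexa 65, open gate

def fov_degrees(sensor_mm: float, focal_mm: float) -> float:
    """Angle of view for a given sensor dimension and focal length."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

for focal in (24, 35, 50, 80):            # typical prime focal lengths
    h = fov_degrees(SENSOR_W_MM, focal)
    v = fov_degrees(SENSOR_H_MM, focal)
    print(f"{focal:>3} mm lens: {h:5.1f}° horizontal, {v:5.1f}° vertical")
```

With the same relationship baked into the Unreal camera, a lens swap in the virtual scout predicts exactly what the physical Alexa 65 would see on the day.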

“That gave us a way to plan for our lighting, so I lit the Unreal virtual sets, then used all of our lighting direction and lighting-quality information from those pre-visualisations to go to our motion capture stage and recreate them with physical lights. So instead of the traditional way of us lighting something and then VFX matching it, VFX lit it with my guidance and then it was us matching the lighting on set. It seems to have worked; we stayed true to everything that we planned, and all that information being so extensively prepped gave us extra parameters, like shadows from door frames and the like. So we were able to create those elements within the motion capture stage and have the actors interact with their set, so to speak, even if there wasn’t any set.”

GAME INDUSTRY DISCONNECT

For the first couple of days of working like this, Kim admits to a kind of disconnect between his world and the video game world.
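To picture the hand-off Kim describes – VFX lighting the virtual set first, the stage crew then matching it – here is a hypothetical sketch that converts a previs key light’s direction into the azimuth and elevation a gaffer could rig to. The axis conventions and the function itself are illustrative assumptions, not the production’s actual tooling:

```python
# A hypothetical previs-to-stage lighting hand-off: turn a virtual
# light's direction vector into a bearing for a physical lamp.
import math

def light_to_azimuth_elevation(direction):
    """Convert a unit 'light travels this way' vector (x, y, z) into
    the stage bearing of the lamp itself (pointing back at the set).

    Convention assumed here: x = stage right, y = up, z = downstage.
    """
    # The lamp sits opposite the direction the light travels.
    lx, ly, lz = (-direction[0], -direction[1], -direction[2])
    azimuth = math.degrees(math.atan2(lx, lz)) % 360   # 0 deg = downstage
    elevation = math.degrees(math.asin(max(-1.0, min(1.0, ly))))
    return azimuth, elevation

# e.g. a virtual 'sun' shining down at roughly 45 deg from stage left:
sun_dir = (0.5, -0.707, 0.5)               # roughly normalised
az, el = light_to_azimuth_elevation(sun_dir)
print(f"rig key light at azimuth {az:.0f}°, elevation {el:.0f}°")
```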

RIGHT The build of a shot from Framestore.

