
CAPTURE STORY | WELCOME TO MARWEN

It was so critical that we got the lighting right in that mocap stage

SUCCESS THAT COMES THROUGH FAILURE

Kevin and his team realised, through all their failed efforts to make the doll side of the movie work, that this new way of achieving it was the only way, and they had to make it work. “It took a big leap of faith for Kim, the DOP, to believe he could light this big grey void that is the motion capture (mocap) stage as if it’s the final movie. That takes some imagination.

“To help him with that we created a real-time version of Marwen in the Unreal video game engine. The app allowed him to tweak sun direction and intensity, along with other custom lights. He was actually able to work with Robert to understand what all the blocking and shooting orbits for every scene in the mocap world were going to be. He then spent about two weeks working in this real-time interactive Marwen, pre-lighting every single set-up we were going to be doing in the mocap world.

“By the time we got to the mocap stage we knew what the lighting was going to be. We had one monitor which showed us what Kim’s ARRI Alexa 65 was seeing on the mocap stage – which was actors in grey suits and blue screen – and then another monitor that showed us a real-time view of what that same camera was seeing in the Marwen universe.

“We could use that to compose and make sure that the lighting we had set was actually going to look good in the final shot. That ended up being so important, and I really didn’t know how important until we entered post-production. In post-production we found that if the angle of the light hitting the actors’ faces was even just a few degrees off in our digital renders compared to what it was on the real stage, it broke the entire illusion. It was so critical that we got the lighting right on that mocap stage because if we didn’t there was no room for error. It was kind of a ‘moon shot’ to get things right on that stage, and this virtual production process really helped us to achieve that.”

MEETING OF MINDS

Of course there are plenty of big movies that use mocap techniques and virtual production processes to achieve their shots, but this movie more than others had to blend traditional cinematography techniques with virtual ones. “On this movie we were so constrained in terms of time, budget and resources that we had to get inventive; we had no choice but to really heavily lean on each other. I certainly couldn’t have lit the film in post without Kim’s lighting designs, and Kim couldn’t have lit the movie on the mocap stage without our help.”

ABOVE Kevin and his team had to develop new motion capture techniques due to the amount of ‘doll’ time in the film; this included a tilt-shift effect.

Having a film crew on a mocap stage isn’t a regular event, but for this film it was the only way to shoot. “The unusual thing that happened on our motion capture stage was the intersection of every single normal live-action component with every single normal motion capture component. There have been films dating back to Avatar where James Cameron used what he called the Simulcam to see what the Avatar world looked like through a digital camera. But his actors were in helmet cameras, with dots on their faces, in flat lighting. You then have a movie like

