DEFINITION May 2019

FEATURE | VIRTUAL PRODUCTION

We’re talking about a pre-vis that is more of a ‘vis’. Shea puts this speculation in more real-world terms: “The holy grail is to be able to take the final effects assets and use them at the very beginning of the pipeline in pre-vis. That’s not only a workflow shift, it’s also a psychological shift for the industry as a whole. It’ll work for some filmmakers, but may not be preferable for others. What it essentially means is that, at the very beginning of the process, you build your characters, your sets and your props for all your CG work for final. It wouldn’t be a proxy.

“Right now, the way that filmmaking works is that it’s an evolution, where you start with a proxy and then look at it more evolved, more evolved and more evolved. That’s the way people have been working. But to have the ability to...”

[LEFT: VR is now used on set to ‘walk through’ a virtual set, so you know what works and what doesn’t before building the environment]

VFX MEETS CINEMATOGRAPHY

You could see cinematography’s relationship with VFX as a case of bringing the two together kicking and screaming. But Shea sees it as a great opportunity for both disciplines, with each needing the other in a reciprocal relationship. “How do these different parts of the film process work together?” she asks. “When everything was changing and motion capture was being introduced into the film industry, actors were concerned they were going to be replaced. But as it turned out, instead of replacing them, they now realise they can actually perform as a virtual character. They see what the value is, and so it has become more readily adopted.”

Among directors of photography, there has been a concern that more and more VFX would encroach on their territory – but Shea insists that’s exactly what she doesn’t want to happen. “We want to be able to benefit from the talents of a skilled and experienced DOP and camera operator,” she stresses, “so we can put the tools they’re used to working with in their hands. They can actually drive the camerawork, as opposed to visual effects trying to interpret what they’ve done on set. That’s the biggest change – we’re trying to go back to traditional film production almost, not looking to replace it.”

CAPTURE TO CGI

When Definition featured Welcome to Marwen, directed by Robert Zemeckis (February 2019), we also talked with DOP C. Kim Miles about using the Unreal Engine to light his actors. He admits to a kind of disconnect between his world and the video game engine world he was filming in. He describes his struggle, saying: “I had a hard time articulating, saying for instance that I needed a hard light over here and a really diffused fill light over here. It took a bit of translating to get on the same page. So, what they graciously did was to talk with our electrical department and get photometric data from most of our lighting package, which they then input to their system.

“They then created an iPad app for me, so I could sit in the game engine room with them and turn dials in the app to move the position of the sun and lower its intensity and quality, including the colour – it was supremely helpful in communicating how we wanted it to look. In effect, they matched their library of lighting tools to our physical sources,” he recalls.

Shea admits that the lighting encoding side is getting better. “It’s not 100% yet, but it’s getting closer and, right now, it’s still an excellent tool for a DOP,” she says.

LIVE VFX

If you speak to the people in the VFX world who know about these things, they say that, in ten years, VFX for high-end movies and TV will be a live event on set. Let that prediction sink in for a second...

