Dive into our latest issue, featuring behind the scenes looks at 28 Years Later, F1 and The Handmaid’s Tale, plus our Colour Special: a journey into the cutting edge of colour in filmmaking. There’s also kit reviews, industry news and lots more to explore – enjoy!
JUL/AUG 2025
£5.49/$11.99
Claudio Miranda, ASC straps cameras & audiences into a 180mph thrill ride for F1 IN THE FAST LANE
BEHIND THE iPHONE 15 LENS ON 28 YEARS LATER
INSIDE MISSION: IMPOSSIBLE’S AUDACIOUS AERIAL CINEMATOGRAPHY
MOBILE 3D SCANNING: THE NEXT VIRTUAL PRODUCTION GAME CHANGER?
COLOUR SPECIAL A JOURNEY INTO THE CUTTING EDGE OF COLOUR IN FILMMAKING
Use this QR code to view the issue online, find our socials, visit the website and more!
WELCOME
Back in 2002, surprise indie hit 28 Days Later rocked the box office with its post-apocalyptic tale of an England torn apart by a ‘rage’ virus. Fast forward to today, and the virus is still raging in 28 Years Later, which sees cinematographer Anthony Dod Mantle reunite with Danny Boyle for a suitably horrifying follow-up to the beloved zombie movie – largely shot on an iPhone. Read all about the process on page 12.

Elsewhere this issue, we get behind the wheel of F1, directed by Joseph Kosinski and shot by long-time collaborator Claudio Miranda, ASC. Together they capture the full-throttle thrills of Formula One racing – check it out on page 6. As The Handmaid’s Tale reaches its conclusion, we also catch up with the dystopian drama’s cinematographer, supervising sound editor and sound designer for a look at the visual and sonic language that shaped the show.

And with the latest Jurassic Park instalment riding high in theatres, we take a look back at the original 1993 film that launched one of the biggest franchises of all time. It brought CGI to the mainstream, got a whole generation dinosaur-obsessed (myself included) and put Hawaii on the map as a top filming destination. That’s in this month’s Take Two: have a read on page 27.

Our AI & The Craft series continues as well, looking at how film festivals are responding to the rise of AI, plus we explore how compact, handheld scanners promise a brave new world for pre-production. Meanwhile, our first ever TEST SPACE feature sees a leading DOP review three monitors in the field to give his verdict. Don’t buy your next monitor without giving it a read! We also chat with the cinematographer of It Was Just an Accident (this year’s Cannes Palme d’Or winner) and gather a panel of kit rental pros to discuss the turbulent state of the sector and where it’s headed. Enjoy the issue and see you next time!

EDITORIAL
Editorial director Nicola Foley nicolafoley@bright.uk.com
Senior staff writer Katie Kasperson
Features writer Oliver Webb
Chief sub editor Matthew Winney
Sub editors Zanna Buckland, Minhaj Zia
Contributors Josh Ighodaro, Amin Jafari, Adrian Pennington, Phil Rhodes, Daniele Siragusano

ADVERTISING
Sales director Sam Scott-Smith samscott-smith@bright.uk.com 01223 499457
Sales manager Emma Stevens emmastevens@bright.uk.com 01223 499462 | +447376665779

DESIGN
Design director Andy Jennings
Magazine design manager Lucy Woolcomb
Ad production Holly May

PUBLISHING
Managing directors Andy Brogden & Matt Pluck
Bright Publishing Ltd, Bright House, 82 High Street, Sawston, Cambridgeshire, CB22 3HJ, UK

Definition is published monthly by Bright Publishing Ltd, Bright House, 82 High Street, Sawston, Cambridge, CB22 3HJ. No part of this magazine can be used without prior written permission of Bright Publishing Ltd. Definition is a registered trademark of Bright Publishing Ltd. The advertisements published in Definition that have been written, designed or produced by employees of Bright Publishing Ltd remain the copyright of Bright Publishing Ltd and may not be reproduced without the written consent of the publisher. The content of this publication does not necessarily reflect the views of the publisher. Prices quoted in sterling, euros and US dollars are street prices, without tax, where available or converted using the exchange rate on the day the magazine went to press.
Editor in chief
@definitionmagazine
@definitionmags
PROUDLY PARTNERING WITH...
03
DEFINITIONMAGS
CONTENTS
COLOUR SPECIAL
31/ A deep dive into the world of cutting-edge colour in filmmaking – exploring quantum dot OLED, colourisation of vintage footage, the FilmLight Colour Awards and lots more

PRODUCTIONS
6/ F1
Claudio Miranda takes us inside the high-octane summer blockbuster
12/ 28 YEARS LATER
Anthony Dod Mantle reflects on filming the zombie sequel – on an iPhone
18/ MISSION: IMPOSSIBLE
The aerial team behind pulling off the franchise’s boldest stunts shares all
48/ THE HANDMAID’S TALE
As the dystopian drama wraps on its sixth season, the cinematographer, supervising sound editor and sound designer take us behind the scenes
64/ IT WAS JUST AN ACCIDENT
DOP Amin Jafari shares his memories from the shoot of this year’s Cannes Palme d’Or winner

TECH
22/ TRANSFORMING VP WITH MOBILE 3D CAPTURE TECHNOLOGY
We explore how new tools are driving the future of virtual production workflows from the ground up
DEFINITION RECOMMENDS: JAWS @ 50
Editor Nicola dives into the shark-infested waters of the 1975 summer blockbuster

In celebration of Jaws’ half-century, this hugely entertaining documentary dives into the film’s inside story – with cast, crew, scientists and Hollywood directors reflecting on its impact and the sheer logistical nightmare of creating it. Filmed entirely on open water – wildly ambitious then and unthinkable now – the shoot ballooned from 55 to 159 days. Boats were begged and borrowed, anchors dropped everywhere waiting on a clean horizon, and the mechanical shark, Bruce (named after Spielberg’s lawyer), malfunctioned daily. Actors were yanked around on ropes to mimic attacks, tempers frayed, marriages failed, the budget more than tripled – and Spielberg spiralled. But the result was a true pop culture phenomenon and one of the most influential films of all time. Sure, it would’ve been easier to shoot today with a water tank, volume wall and modern VFX – but where’s the fun in that?
56/ DOCUMENTARY STORYTELLING THROUGH SPATIAL VIDEO
Immersive storytelling brings D-Day: The Camera Soldier to life. We find out more

REGULARS
27/ TAKE TWO
A look back on the original Jurassic Park, which solidified Hawaii as a production destination and set the bar for CGI
54/ THE VIEW FROM…
We continue with France, known for its pioneering New Wave and poeticism
61/ IN SHORT
Frank Sun and Scott Aharoni on their coming-of-age short We Are Kings
62/ AI & THE CRAFT
We explore film festivals’ handling of AI amid struggles to adopt a standard
68/ ROUND TABLE
Experts assess the kit rental sector; its key pressures and future prospects

GEAR
78/ TEST SPACE
DOP Joshua Ighodaro puts three monitors through their paces in the first of our new kit tests series

INDUSTRY
84/ FIRESIDE CHAT
Severn Screen’s Ed Talfan talks Wales’ evolving production landscape, nurturing local talent and embracing sustainability
86/ INDUSTRY BRIEFINGS
BAFTA albert’s Green Light Season, the ScreenSkills Passport and more
91/ BETTER TOGETHER
Exploring the new partnership between Cinelab Film & Digital and fluent/image
© WARNER BROS
PRODUCTION F1
PITT STOP Brad Pitt’s training for months to master the racing cars reflects the film’s pursuit of authenticity in every shot
Claudio Miranda, ASC slams on the throttle with some high-octane camerawork in F1 “We couldn’t have cheated any speed tricks to make this movie”
WORDS OLIVER WEBB IMAGES APPLE TV+
F1 follows Sonny Hayes (Brad Pitt), one of Formula 1’s most promising drivers in the nineties, until a near-fatal accident shatters his career. Thirty years later, Sonny steps out of retirement to return to racing with Team Apex. He is partnered alongside the team’s hotshot rookie (Damson Idris) and soon realises that he can’t take the road to redemption on his own. Directed by Joseph Kosinski, the film was captured by longtime collaborator Claudio Miranda, ASC. Miranda has shot all of Kosinski’s feature films, beginning with Tron: Legacy in 2010. “I’m always around Joe and we often talk about what would be a great technology for the next movie,” he begins. “Before F1, we discussed how to make Top Gun: Maverick better and we’re always trying to think how we can make the film. What I love about Joe is that we’re both very similar. We both prefer natural and believe that real is best; everything is sort of secondary from there. We try to aim for reality. I think people somewhat sense it. A lot of people are talking about using AI, but it takes you away from the immersion of the story.”
AT THE RACES When it came to the look of F1 , Miranda and Kosinski wanted to capture the authentic elements of F1 races. “We were really at the races when shooting this,” he says. “Between qualifying times, pre-qualifying times or any little slot that they would give us during the real race. There would be five- to ten-, even 15-minute windows that we were getting. We’d have scenes prepped, then we’d go out there and the cars would run. I think it feels like a visceral movie.” Miranda relied on a selection of Master Prime lenses, Fuji zooms, Voigtländers, ZEISS Loxias and Sony G Series E-mounts to capture the film. “We didn’t want to go into a synthetic world,” explains Miranda. “The studio was originally looking at volumes to start shooting the film, but that was never our approach, even though it might have been other people’s perceived approach. There was concern about how to get actors really going at the speeds that they go. I felt that there’s just no way to get the actors going around 180-plus miles an hour. Joe and I were watching some films together and we both loved Grand Prix . That was a great reference of shots, and
there was a panning angle that we were attracted to. So, we wanted to attempt a more modern take on that as well.” Miranda knew the cameras had to be small for this project and worked with Sony to devise one-off cameras specifically for the film. “They were prototypes of what we called Carmens – these little sensors on a stick,” he adds. “The whole camera system had to fit inside an elongated F2 body that Mercedes made for us. That was a lot of work, and we also had Panavision building these panel mounts. So, there was Sony developing new gear, and then Greg ‘Noodles’ Johnson with RF Films, who made all the telemetry on the track work as well.”

GOTTA GO FAST
Filming at real F1 events meant the team had to adapt to several variables

POLE POSITION
Miranda faced similar logistical challenges while figuring out how to capture cockpit shots in Top Gun: Maverick. “For Top Gun, we had six cameras in the cockpit: four on the actor and then two that were kind of overs to the pilot wearing the same helmet. That was also our methodology when it came to F1 that we didn’t really change. We were able to do a good variety of moves. They had to be approved by Mercedes and also Graham Kelly, our car fabricator. He was in charge of building all the cars that Mercedes gave them, all the parts too and safety heading. Everything was planned three months ahead. I had CAD models from Mercedes, and all the mounts were stuck in certain places.”

Due to Miranda’s background as a gaffer, he finds it hard to let go when it comes to lighting. “I’m very much in charge of lighting,” he admits. “However, I do love input from gaffers. There was a gaffer I really loved – he was one of my best collaborators; his name was Erik Messerschmidt. He then got an offer to start shooting for Venture and I told him that he had to go and do it. I’m very much a technical gaffer and always involved with 3D, the design of shots and camera design. I like my own way of exploring sets. It’s for myself, but also if you have to sell producers on a certain shot, it just helps me show them what I want to do and it all makes sense. Everyone’s on board.”

SCANNING IN THE FAST LANE
From ground-based LiDAR to high-flying drones, Clear Angle Studios captured every curve and corner of F1’s iconic tracks for the film’s VFX teams

Clear Angle Studios was tasked with capturing 3D scans of several F1 circuits featured in the film. It used a combination of ground-based LiDAR scanning and aerial photogrammetry to ensure every track was scanned with precision. “Each morning, we’d lay out physical markers around the circuit, which allowed us to align the data from both the LiDAR and drone scans,” explains drone pilot Marco Lee. “By blending these datasets, we produced a cohesive 3D model for each track. These scans were then delivered to the VFX teams in order to accurately replicate the real-world tracks in a digital environment.”

Yas Marina Circuit stood out as the most challenging environment to capture, especially as the scan was conducted during the Grand Prix. “Due to event restrictions, we weren’t allowed to use the drones, so we needed to wait until the race was over and then work through the night with ground-based LiDAR scanners,” says Lee. “When we returned months later to complete the aerial photogrammetry, we faced strict flight height restrictions, which meant we had to carefully navigate around large lighting towers to get the coverage needed. It was a real test of skill for the team, but we stayed agile to produce a high-quality scan of an iconic track.”

FINISH LINE
Clear Angle Studios used ground LiDAR to capture 3D scans of F1 circuits

As someone who’s into cars and an F1 fan, being part of the project was an incredible experience for Lee. “Getting to step onto the actual tracks was surreal,” he shares. “Being there during the Grand Prix added another level; seeing the teams, their cars and the entire atmosphere up close was something I’ll never forget. The APXGP car looked incredible, and the shots they were capturing were on another level. The crew was great to work with: professional, welcoming and clearly passionate. It made the whole experience not just technically rewarding, but genuinely fun to be part of.”

CLIP THE APEX
Filming relied on prototype cameras custom-built to fit inside race cars without interfering with performance

Lighting choices were often out of production’s control due to filming across numerous actual live races. “As cinematographers, we always like late light, but that all goes out the window on a project like this,” says Miranda. “You just start with the races; some of it is high noon – which is not always the best – but I would choose that over some virtual reality room at perfect sunset all the time, as I prefer the reality over artificial.

“Abu Dhabi was an interesting one, moving from an hour before sunset into twilight and then full night. It was a challenge both in terms of lighting and schedule. I wasn’t putting bounce cards out or chasing highlights – it was more about just getting light on the actors and hoping the camera had enough range. There were moments of perfect timing, like when Brad would turn a corner and a sliver of sunlight would shine through, for example.”

NEED FOR SPEED
Brad Pitt and Damson Idris did all their own driving for the film. “Brad was going over 180 miles per hour,” reveals Miranda. “He was great. In normal moviemaking, we have things called biscuit rigs. It’s like a self-driving trailer. And as you can imagine, no matter what kind of V8 engine they put into it, it’s still going to be a quarter of the speed of what an actual car will do. I was watching other racing movies and I could see that the exteriors look fast, but when you get inside the cockpit, it all of a sudden feels slow. The thing about F1 cars is they have
IT WAS A CHALLENGE BOTH in terms of lighting and schedule”
a great view of the outside world. It’s a very driver-exposed sport, meaning we couldn’t have cheated any speed tricks to make this movie. Brad and Damson trained for months, from regular cars to F2 cars. The shoot lasted a couple of years, and the actors’ strike delayed us. Because of that, we missed part of the racing season and had to wait for the following year to go back and reshoot Abu Dhabi, Monza and all those scenes with Brad.” Miranda and his team often found themselves shooting from the sidelines and the pits, which meant they weren’t able to do a lot of Steadicam work. “Any
of the walking, or when we’re with the crowd, we’re shooting on DJI Ronin 4Ds,” he continues. “We used those mostly for the whole opening of Daytona. We needed a small camera that could be steady, fluid and nimble – one that wouldn’t be too big. The whole point of this movie was to stay small. I really liked the problem-solving element of the whole job.” Miranda concludes: “I love working with Sony too; they were up for it. It took almost six months from conception to actually having a camera in my hand. That’s amazing speed to build a camera and an extraordinary feat.”
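For readers curious about the nuts and bolts of the Clear Angle sidebar above: registering a drone photogrammetry cloud to a ground LiDAR scan via shared physical markers is classically done with a rigid best-fit such as the Kabsch algorithm. The sketch below is purely our illustration of that general technique (function name and the use of NumPy are our own assumptions, not details from the production):

```python
import numpy as np

def kabsch_align(src_markers, dst_markers):
    """Best-fit rigid transform (R, t) mapping src marker positions onto
    dst marker positions, via the Kabsch algorithm (SVD of the
    cross-covariance of the centred point sets)."""
    A = np.asarray(src_markers, dtype=float)
    B = np.asarray(dst_markers, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)    # centroids
    H = (A - ca).T @ (B - cb)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Once R and t are solved on the handful of surveyed markers, applying them to every point (`aligned = (R @ cloud.T).T + t`) re-expresses the entire drone dataset in the LiDAR coordinate frame – one standard way two such scans can be blended into a single model.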
PRODUCTION 28 YEARS LATER
As the cult zombie film returns with its much-anticipated third act, DOP Anthony Dod Mantle discusses shooting 28 Years Later with iPhone 15s “Coming back to that world some 20 years later, you feel ready for fire”
WORDS OLIVER WEBB IMAGES SONY PICTURES
Following the events of its predecessors 28 Days Later and 28 Weeks Later, 28 Years Later follows a group of survivors of the Rage virus who inhabit a small island. When one of the group leaves the island on a mission, he discovers the infested horrors of a mainland plagued by the outbreak. Having previously shot 28 Days Later, which also marked his first feature collaboration with director Danny Boyle, Anthony Dod Mantle, DFF, BSC, ASC served as the film’s DOP. Dod Mantle has shot the majority of Boyle’s films since, including Millions, 127 Hours, Trance and the 2008 film Slumdog Millionaire, for which he won an Academy Award for best cinematography. Their most recent collaboration was 2017’s T2 Trainspotting, so it was only natural that the two would reunite for 28 Years Later.
Boyle was originally inspired by the films of George Romero for 28 Days Later and, of course, this film is part of the same universe. “We didn’t discover the genre; we just sort of reignited it, and 28 Days Later was a lot more successful than anybody expected it to be,” begins Dod Mantle. “Coming back to that world some 20 years later, you feel ready for fire and it’s quite unnerving. With Danny, the expectation is enormous. People also have a certain expectation of the level of aura and the type of horror. The difference with this one is that it’s agrarian and agricultural. It’s a vast space that is very different from the urban landscapes of London. You’ve got a bombardment of green information, which is not my favourite colour, but I grew to love it in this film. One of the biggest challenges was trying
to find existing ancient landscapes and dropping people into it.” 28 Days Later was shot using Canon XL1 digital cameras, a technique that was pioneering for its time. Boyle and Dod Mantle’s initial conversations about the look of 28 Years Later revolved around shooting the film on the same camera, as they wanted to maintain the gritty realism depicted in the original, but opted instead for the iPhone to keep up with the times. “I thought that’s suitably unusual for Danny,” says Dod Mantle. “It’s not a completely original idea, but it’s an interesting idea for a $75 million film. Luckily, Danny and I can speak frankly with each other as we know each other well. I was thrown to the lions to try and figure out how to accomplish it. The next few weeks were about helping him realise his dream. I know he is also prone to
dropping you in a semi-deep hole or bog in the middle of the countryside suddenly on day three. I respect him for that and know that can happen, so now I’m standing in front of two or three people and I can’t quite achieve what he wants.” To help prepare for the unexpected demands of the film, Dod Mantle explored additional lensing and alternative technique compatibility with the smartphone. “I felt that Danny and some of the producers were awaiting some kind of epiphany I would come up with,” he smiles. “I started looking at production meetings, storyboards and the geography we had to travel, as well as landscapes we wanted to see and feel, and how he wanted to cover the scenes. Danny likes to work extremely fast and furious, which I love. For example, there would be situations written in the storyboard as a rat zoom, but I thought I’m not going to give him this; I’m going to almost hand out iPhones to actors and sew them into their costumes. We had many iPhones, actually. They are very lightweight and we were not putting any
I’M GOING TO ALMOST hand out iPhones to actors AND SEW THEM INTO THEIR COSTUMES”
stabilisation inside the camera. We had to use small stabilised systems of our own. We had multicam systems, where you had between eight and 20 iPhones placed in an array, which we could then trigger manually.” As well as relying on iPhone 15s, Dod Mantle captured the film with additional thermal cinematography and infrared. “For those shots, I used a very old Panasonic camera, which I found alongside my dear camera operator Stefan Ciupek. We found four or five dotted around Britain that could
MEMENTO MORI
The use of multicam iPhone rigs and infrared footage captures the Rage-infected with a gritty, immediate realism
achieve a better signal, better resolution and still get the same thing – which is essentially an internal heat camera for animals and people – for one part of the film. There’s also a lot of drone footage in the film, courtesy of DJI. There’s one I call the howler – a tiny sensor, not the same quality, but it flies really close and doesn’t hurt anybody.” The actors never knew which frame would be used – and neither did Boyle, a technique Dod Mantle had previously experimented with during the making of the 1998 Dogme 95 film Festen ( The Celebration ) – directed by Thomas Vinterberg. The Dogme 95 manifesto consisted of a number of rules, including that shooting must be on location; the camera must be handheld (with any movement or immobility attainable in the hand permitted); and optical work and filters are forbidden. Working within those constraints enabled Dod Mantle to explore innovative ways of working, and 28 Years Later was no different. Dod Mantle developed a lens set specifically for the film that consisted of prosumer anamorphic lenses. “I didn’t like any of the 1.55 squeeze click-on phone lenses; they distorted too much,” he adds. “I liked the 1.33 and thought it was interesting to have in my hand a little gadget and anamorphic capability through an app that produced a spectacular wide-screen image. That became even wider during our final tests, eventually evolving to 2.76:1.” One of the more logistically challenging sequences to capture was the Alpha causeway sequence. “It’s a weird limbo land and this starry night,” details Dod Mantle. “The two characters have to navigate their way through the causeway and they are chased by an Alpha. I knew we’d have to shoot it day for night, as we have working hours with kids. Also, the summer weather in Britain is extremely erratic. When Danny and
SHOOT TO THRILL Minimalist gear keeps the viewers close to the action – as well as the relentless horror unfolding
I were looking for existing causeways, we realised there were so many issues attached to locations. We contemplated shooting it dry and then adding in water with VFX. We ended up going for a massive interior warehouse, which we then filled with water. We built a set and there was no lighting rig, so we had to put the rigs and lights up ourselves.” “You’re meant to feel petrified and fearful, and there’s this broadly presumptuous idea that the skies are without pollution as there are no industries or cars,” he continues. “We contacted astronomers and found this astronomy centre up north, where we shot plates through their extraordinary equipment. We could then map together
these beautiful sky plates in the effects afterwards. Additionally, we had a certain amount of haze and traditional effects for the sequence, as well as Technocranes, a track and dolly.” Despite the technical challenges presented while making the film, Dod Mantle loved the process of fitting the pieces together and coming up with a new system from the previous two films. “I haven’t made a film like this before with these tools. I seem to go back to the first letters of the alphabet and start all over again every time – that’s what we did with this one,” he concludes.
I HAVEN’T MADE A FILM LIKE THIS BEFORE with these tools”
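For readers curious about the aspect-ratio arithmetic behind Dod Mantle’s 1.33x anamorphic setup, here is our own back-of-envelope sketch (the figures are our illustration, not numbers from the production):

```python
def desqueezed_ratio(capture_ratio: float, squeeze: float) -> float:
    """Aspect ratio after de-squeezing anamorphic footage: the optical
    squeeze multiplies the sensor's native aspect ratio."""
    return capture_ratio * squeeze

# A 1.33x squeeze on a native 16:9 (~1.78:1) frame de-squeezes to
# roughly 2.37:1; reaching the film's final 2.76:1 therefore implies
# a capture ratio of about 2.76 / 1.33 = ~2.08:1, i.e. a vertical
# crop of the 16:9 frame.
print(round(desqueezed_ratio(16 / 9, 1.33), 2))  # ~2.36
```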
MISSION: IMPOSSIBLE AERIAL
THE SKY’S THE LIMIT We sit down with Dani Rose to discuss his role in bringing the Mission: Impossible franchise’s most audacious aerial stunts to life WORDS NICOLA FOLEY
The Mission: Impossible films are famous for their daring stunts and innovative techniques – and behind many of the franchise’s most thrilling sequences is CineAero, a UK-based aerial filmmaking company led by owner Dani Rose. Over the last three films, Rose and his team have pushed boundaries to create bespoke solutions that capture Tom Cruise’s jaw-dropping stunts from all sorts of audacious angles.

Rose’s introduction to the franchise came during Fallout (MI6), for which he was brought on board as a drone pilot. One of his first major sequences was flying over the cliffs of Preikestolen in Norway, capturing the film’s final scene, and his first foray into custom work followed shortly after. “I was asked to modify and fly a drone that was used on-screen by Simon Pegg’s character. The production wanted a certain look for the aircraft, so we were given an off-the-shelf machine and retrofitted it with high-end flight electronics,” he recalls. “It had to be safe to fly in close proximity to the cast and crew – a challenge under pressure!”

In 2023’s Dead Reckoning, the scope of work levelled up with the development of a lightweight multicamera array system for Tom Cruise’s motorbike jump. “It featured a custom wireless run/stop system and integrated tracking modules, so we could locate the rigs after the bikes landed in dangerous terrain. Because of
the risk of rockfalls, only mountain rescue teams were authorised to retrieve the kits, so the tracking was crucial,” Rose shares.

Later in the film, he engineered a body-mounted dual-camera stabilisation system for the speed flying sequence, allowing two independently controlled camera systems – using both zooms and primes – to capture Cruise in mid-air. “It was built around professional skydiver and aerial sports expert Malachi Templeton,” says Rose. “We 3D scanned Malachi to design a carbon-fibre and aircraft-grade aluminium rig that maintained optimal balance and safety without compromising performance. We managed video, control and focus from a nearby helicopter – Malachi effectively became a flying camera platform, with two operators and two 1st ACs controlling the system remotely and a small ground support team. It was a world-first.”

In the recently released The Final Reckoning, the bar was raised again, with Rose describing the film’s biplane sequence as the biggest challenge of his career. “I was approached with the brief that Tom and director Christopher McQuarrie wanted to ‘put cameras anywhere on the plane’. These were forties-era Boeing Stearmans, not built with modern cinematography in mind.” Working alongside DOP Fraser Taggart, stunt coordinator Wade Eastwood, the stunt team, pilots and engineers,
Rose’s team undertook a complete reengineering of the airframes. “We reinforced wings, created removable exoskeletal camera frames, vibration isolation and power systems for telemetry and camera control. Every camera build was unique to the aircraft it was mounted on, and each Stearman – being hand-built – had slightly different geometry, which made consistency a challenge.

“Each airframe started with 34 sections to mount to, and we ended up with around 50 at the end of the shoot. Development was constant and always pushed the limits of aerial cinematography,” adds Rose.

As with all of CineAero’s work, safety was a key priority – with all modifications deemed airworthy and approved by flight engineers. “That included new looms for power, telemetry, camera comms and monitoring systems across all four planes. Depending on the shot, we mounted between one and six cameras per aircraft, with up to 20 cameras and 40
lenses managed in total by my very small team of camera assistants, who did an incredible job,” he elaborates. Rose credits rigging specialists Philip Kenyon and Ross Sheppard for adapting to the shoot’s evolving demands: “Everything had to be re-rigged in a matter of hours, often in extreme weather and remote locations,” he marvels.

IN CRUISE CONTROL
At this level, capturing a great shot came down to millimetre-accurate rigging and a dose of ingenuity

DEVELOPMENT PUSHED the limits of aerial cinematography”

Beyond the complex biplane work, Rose and his team also engineered a bespoke body-mounted system for a high-octane skydiving sequence featuring Cruise. “Tom wanted a rig that allowed for maximum flexibility in camera position on his body – front-facing, side-on, even underneath,” he says. “The system had to endure high-speed, high-G manoeuvres, hard parachute openings and remain stable throughout.” To make it happen, CineAero collaborated with experts from the Red Bull Air Force as well as Allan Hewitt’s skydiving team. The result was the Snorri rig – a lightweight, modular and aerodynamically stable camera platform. “We had several variants of this device, re-engineered as we learned how the stresses of this type of stunt affected the rig,” he reveals. “The final version integrated with the parachute system and involved clever work from the SFX team at Dust Films. It’s very cool!”

Of course, none of this work happens in isolation, and Rose credits the collaboration between multiple departments needed to bring the franchise’s impressive stunts to life. “Each major stunt is broken down and passed through stunts, aerials, camera, VFX, safety and engineering. You need experts in each field who can translate a creative vision into reality, often by building something completely new,” he explains. “What makes these films unique is the level of integration across those disciplines. Seeing that level of talent come together is a career highlight.”

Working with Cruise – who does all of his own stunts – adds an extra dimension. “He’s deeply involved in every detail, especially when it comes to the aerial work. He wants to understand everything – how the rig functions, what the flight profile will be, what the risks are, how the cameras and lenses are capturing the moment and how he can deliver his best performance,” shares Rose. “Even under extreme conditions for difficult stunts, he is always committed and thinking about the action, his position in the air, what the camera and lens are doing and how the viewer will react to his performance. It’s incredible to watch.”

Having an A-list actor performing his own stunts heightens all safety protocols, but for Rose, it was worth it to see the star in his element. “When Tom is performing a real stunt – strapped to a biplane in a steep dive, for example – you feel the weight of responsibility. But you also know you’re part of something truly special. It’s real cinema, done the hard way, and that makes it unforgettable,” he enthuses.

Looking back on his work across the Mission: Impossible films, Rose says there have been many memorable moments. “Venice during the night shoots in Dead Reckoning was special. Because of the Covid-19 lockdowns, the city was empty. We’d finish a night shoot, then sail back across the canals with no one around. It was surreal.” Another standout experience was filming in Svalbard for Final Reckoning: “We were operating near the top of the world – on the sea ice, with polar bears nearby, in front of massive glaciers. Very few people get to see that, let alone film there. Those experiences stay with you.

“Each film has raised the bar,” he concludes. “What began as a drone shot in Norway eventually became me creating and leading my own technical teams, developing new ways to shoot aerial sequences, and engineering completely new systems for skydiving, speedflying and biplanes.

“It’s a rare thing to be part of something where the creative trust runs that deep, and where the solutions have to be invented from the ground up. I’m incredibly proud to be part of it.”
19
DEFINITIONMAGS
ADVERTISEMENT FEATURE
Meet Dorothy
Clear Angle Studios tells us about their revolutionary head scanning system, Dorothy: the cutting-edge tool capturing digital humans with unmatched precision

Of all the challenges in VFX, creating digital humans good enough to fool real ones has long been the holy grail. At the same time, sheer demand for 3D content often tests the productivity limits of traditional modelling and texturing techniques. Meanwhile, when big-name talent is involved, the need for speed and precision has created demand for some of the most capable scanning technology ever used to capture a human likeness.

Dominic Ridley is a co-founder and director of Clear Angle Studios, founded in London in 2013. It maintains facilities worldwide for scanning objects, locations and – with ever-increasing accuracy – people. Clear Angle calls its most capable scanner Dorothy: an inward-looking sphere of cameras and lights designed to capture a human head, and much more than just shape and colour.

“To date, our focus has predominantly been film production, across the big studios and indies,” Ridley says, “but now we’re also branching out into games and advertising. We even worked on a couple of the Super Bowl commercials this year.”

“Dorothy services a range of different requirements,” Ridley goes on. “From 4D to facial performance capture, to training data – we don’t use machine learning or AI, but we do capture the data. If you’re trying to faithfully recreate someone on screen, we can capture multiple lighting scenarios of someone doing a performance. If you’re doing a stunt, we can get the stunt person to come in, then the lead talent, and a VFX house could do a face replacement without having to do the old school approach of rigging and animating a face.”

Clear Angle has 18 full-body systems available worldwide, but “this rig has been designed specifically around heads and faces,” Ridley confirms. “We cover down to just below the shoulders, and the reason for that is to maximise the camera and lighting to produce the best images and the best scan, without blowing anyone’s eyes out with light.” The crucial difference between things and people, Ridley reflects, is practical: “you’ve got more time with a prop. You don’t necessarily get as much time with talent, so you have to maximise that time so they can get back to what they do best – acting.”

The hardware required to make that happen quickly, as Ridley describes it, is significant, and can vary depending on the goal. “Dorothy has 1500 lights and 76 cameras that are used to create the 3D mesh. For 3D capture, we photograph around 16 frames per second. But if we’re doing 4D capture, we’ll use 6K cameras and capture 96 frames per second. It depends what you’re trying to capture.”

“EVERY PORE is the same, EVEN THE SUB-SURFACE BLOOD FLOW – it’s all shown exactly as it is”

The storm of data produced by those cameras is stored on a series of servers Ridley describes as “hefty – our server room is cooled to the max. Our electricity bill isn’t worth mentioning, but it’s a hell of a thing to be able to piece together all of this data and produce amazing results.”

How Ridley often describes those results is “pore-level detail that captures high-quality textures with subtle nuances, and an understanding of how light changes across surfaces. Having pore-level detail does change how light casts across the face. You can model, say, a lampshade, but getting fine detail on a human face is difficult. That’s the stuff that’s really, really tricky to do. When big-name talent turns up, you’re so familiar
EXACT LIKENESS: Clear Angle captures precise detail for the creation of realistic digital humans

with them as a human. Using the Dorothy scanner, we’re not recreating them – we’re capturing them as they are. Every pore is the same, even the sub-surface blood flow – it’s all shown exactly as it is.”

Clear Angle’s system can deliver data representing any selected lighting scenario. A lighting environment can be conveniently supplied as an HDR image, which Ridley likens to the image-based lighting sometimes applied to virtual production: “You can give us an HDRI, and we can put that into our system and project that onto the subject so the subject can perform under that lighting; we can also project multiple HDRIs. We can shoot a subject in our office with the lighting of the final shot.” Thanks to a multiplex lighting set-up, Dorothy is just as capable of producing flat-lit scans representative of no specific lighting environment, for use by VFX teams in a digitally rendered world, where virtual lights will illuminate the scanned object as required. Ridley describes it as “a blank canvas which can have lighting, animation and rigging added to it – but the foundation is exact.”

As the technology has developed, Ridley has observed the emergence of new, perhaps unexpected markets. “We’re seeing a lot of interest in the head capture market, despite the fact that AI technologies are out there. There’s an area where talent captures themselves, so they have ownership of their own data. This is happening across sports, politics and CEO-level individuals – perhaps so they can police their likeness online, or have a secondary revenue stream where they can license out their likeness without going out on a shoot or in a space.”

This thought seems timely, with the world waking up to new realities around people’s own image. For example, Denmark has recently been discussing new laws regarding the rights to control use of a likeness. As Ridley points out, there are many reasons to deploy the technology. “There’s an interest in creating very believable human faces, so you can have avatar conversations without having to travel. There is a lot of interesting chatter about that kind of thing, and we’re often asked if we can capture the data. It’s a very curious space, and we’ll start seeing many more digital avatars around. We’ve done research projects with Nvidia, as well as a couple of private medical institutions. It’s an exciting prospect.”

Meanwhile, the world of video games has seen sufficient technological leaps to make Clear Angle’s work increasingly relevant. “I’m not sure there are renderers yet that can properly display every detail,” Ridley muses, “but there’s still that creative need to see what boundaries you can push, and to showcase to the wider interested community what can be achieved if they have that level of detail – now that they know where to capture it.”

Clear Angle has built three Dorothy scanners, with two in production centres and one free to move around the world. “One is in Culver City in Los Angeles, one is at Pinewood Studios and the third is a free-moving unit. Our Dorothy systems can travel – we can pack them up and move them. We can even ship them in air freight, and when they’re completely out of our hands, they’re still robust. It’s a two-day set-up – the team is well versed, it’s all in-house and they have over a decade of experience.”

“Many of our team members,” Ridley concludes, “have been with us since the beginning. It’s been an interesting journey, and it’s not just Dorothy – there are our environment and prop scans too. Our goal is to capture data faithfully. A big part of the business is ensuring productions have it easy; making the shots or characters look amazing.”
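Ridley's HDRI workflow – projecting an environment map onto the subject via the sphere of lights – boils down to sampling an equirectangular image along each light's direction. A minimal sketch of that idea, in pure Python; this is an illustration of image-based lighting in general, not Clear Angle's software, and the nearest-neighbour lookup is an assumption:

```python
import math

def direction_to_equirect_uv(x, y, z):
    """Map a unit direction vector to (u, v) coordinates in an
    equirectangular image: u wraps around the horizon (longitude),
    v runs from the top pole (0) to the bottom pole (1)."""
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

def sample_hdri(hdri, x, y, z):
    """Nearest-neighbour lookup of an RGB triple from a row-major image
    (a list of rows of [r, g, b] floats) along direction (x, y, z)."""
    h, w = len(hdri), len(hdri[0])
    u, v = direction_to_equirect_uv(x, y, z)
    col = min(w - 1, int(u * w))
    row = min(h - 1, int(v * h))
    return hdri[row][col]

def light_rig_values(hdri, light_directions):
    """For each light on the sphere, pick the HDRI radiance arriving at
    the subject from that light's direction."""
    return [sample_hdri(hdri, *d) for d in light_directions]
```

Driving each physical light with the radiance sampled along its direction is what lets the rig reproduce "the lighting of the final shot" around the subject.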
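The capture figures Ridley quotes – 76 cameras at around 16 fps for 3D, or 96 fps on 6K cameras for 4D – explain why the servers are "hefty". A rough back-of-envelope estimate; the per-frame sizes here are assumptions for illustration, not Clear Angle's real numbers, and it assumes all cameras record simultaneously:

```python
# Rough throughput estimate for a multi-camera scanning rig.
# Per-frame sizes below are illustrative assumptions, not vendor figures.
CAMERAS = 76

def rig_data_rate_gb_per_s(fps, mb_per_frame, cameras=CAMERAS):
    """Aggregate data rate in GB/s for `cameras` sensors at `fps`,
    each frame weighing `mb_per_frame` megabytes."""
    return cameras * fps * mb_per_frame / 1024

# 3D capture: ~16 fps, assuming ~25 MB per raw still.
rate_3d = rig_data_rate_gb_per_s(16, 25)
# 4D capture: 96 fps on 6K heads, assuming ~12 MB per compressed frame.
rate_4d = rig_data_rate_gb_per_s(96, 12)

print(f"3D capture: ~{rate_3d:.1f} GB/s, 4D capture: ~{rate_4d:.1f} GB/s")
# → 3D capture: ~29.7 GB/s, 4D capture: ~85.5 GB/s

# A ten-second 4D performance at these assumed sizes:
capture_tb = rate_4d * 10 / 1024  # just under a terabyte
```

Even with generous compression, a short performance lands in the high hundreds of gigabytes – hence the maxed-out cooling in the server room.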
TECH SCANNING
ROOM TO ZOOM: Unlike bulkier scanners, the XGRIDS LixelKity K1 can be operated solo, making it ideal for run-and-gun filmmakers
“This technology will transform how live action productions can be conceived and planned”
New mobile 3D scanning techniques can help cinematographers drive new virtual production workflows from the ground up, discovers Adrian Pennington

Light Detection and Ranging (LiDAR) has been used in heavy industries like construction and surveying for years, but has been too cumbersome and expensive for most film productions. Now, compact, handheld, cheaper devices designed to capture real-world locations as interactive 3D environments present a bold evolution in the cinematographic toolkit. Aside from top-end models from Leica Geosystems designed for industry, lighter and less expensive scanners offering marker-free tracking include the Eagle from 3DMakerpro and the XGRIDS LixelKity K1 scanner. The latter is particularly interesting given its compatibility with Jetset, an iOS app from Lightcraft that fuses CGI and live action in real time.

Roberto Schaefer, ASC, AIC – who shot Monster’s Ball and was also the Bafta-nominated cinematographer for Finding Neverland – trialled the combination as a test for a short film under development for a director in LA. “I discovered Jetset a year ago, and it was Lightcraft founder Eliot Mack who suggested I try it out with the K1,” Schaefer says. “At the same time, I had a project to shoot in Italy, so I decided to tie the lot together and fly out to Rome.”

The user’s smartphone can be attached to the K1 for real-time monitoring during the scan via a companion LixelGO app. The phone is used as a control module and viewing screen. The K1 itself has four cameras: two panoramic vision modules at 48 megapixels and two depth cameras.

In Italy, Schaefer spent a day in Sutri, north of Rome, scanning buildings and streets previously identified by his director. “I shot a music video there with Belinda Carlisle many years ago, so I knew the town well. It has a beautiful piazza and churches, and it seemed like the right place. You have to walk in a zig-zag pattern for the scan to work, but you can see where you’ve been thanks to a green dotted line in the display,” he says. Schaefer also captured some scans of Rome, including of the Colosseum, before heading back to LA.

Having sent the raw data to Lightcraft over Wi-Fi, the scans were processed as files called Lixel CyberColor (LCC), ready for Schaefer to look at as soon as he landed. “XGRIDS has a [Mac/PC] viewing application called LCC Viewer that lets the user navigate through an XGRIDS scan with a traditional keyboard and mouse. You can freeze frame, move within it, point to the ceiling (or sky), go left and right, or create stills from it.”

With the scans exported to Jetset, a director or DOP can test shot composition with a precise simulation of a specific production camera and lens, as well as real-time tracking of the camera’s motion. “For example, they could load in the XGRIDS scan of Rome into Jetset, select a simulated camera and lens combination of an ALEXA 35 shooting at a 2:1 aspect ratio with a 32mm lens, and simply walk around their living room,” shares Mack. “Jetset will accurately track their motions inside the XGRIDS scan and show them what the view would be if they were standing in that exact portion of Rome (for example), holding that ALEXA camera and 32mm lens.”

According to Bella Wan of XGRIDS, “The K1 scans primarily capture raw LiDAR point clouds and aligned image data. A computational process called simultaneous localisation and mapping (SLAM) is applied, and then the LCC Studio software generates the 3D Gaussian splat models. These final models, with natural parallax, spatial depth and RGB colour, can then be exported for use in other industrial software like Jetset.”

According to Schaefer, the K1 is a more efficient and agile alternative to photogrammetry, which – while producing ‘extremely high resolutions’ – essentially stitches thousands of stills together in a process that can be more time-consuming than simply walking around with a handheld scanner.

“XGRIDS HAS A VIEWING APP CALLED LCC VIEWER that lets the user navigate THROUGH AN XGRIDS SCAN”
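The camera-and-lens simulation Mack describes rests on reproducing the real camera's geometry – chiefly the angle of view a given sensor and focal length produce. A minimal sketch of that maths; the sensor width is an approximate published figure and this is an illustration, not Lightcraft's code:

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view for a rectilinear lens: the angle
    subtended by the sensor width at the lens's focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# ALEXA 35 open gate is roughly 27.99mm wide (approximate figure);
# a 2:1 extraction crops height, not width, so width drives horizontal FOV.
SENSOR_W = 27.99

fov_32 = horizontal_fov_deg(SENSOR_W, 32.0)  # the 32mm Mack mentions: ~47°
fov_75 = horizontal_fov_deg(SENSOR_W, 75.0)  # a longer lens sees far less
```

With that angle and the tracked camera pose, a previs tool can show exactly the slice of the scanned location the real camera would frame.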
Schaefer showed the results to his director. “He didn’t know what to expect, but he was surprised at how much we could explore and in such detail. He understood instinctively that he could plan an entire movie, including blocking and camera placements, remotely. “You can measure things like doorways accurately to within two centimetres. That means you know in advance if a piece of furniture or a dolly can fit through this doorway, or how tall the ceiling is. It gives you a lot of options in your production office before you get to location.” Schaefer says he’d like to see location scouts equipped with such scanners. “A location scout will often come back with hundreds of pictures of different places, which can be confusing to look at. Also, they tend to shoot with super wide-angle lenses, so even a small bathroom closet feels like it’s the Taj Mahal. “When you’re done, you have a reference for every set and location on how it was dressed and lit. You can archive it, and if you ever need to do pickups, add close-ups or extend a scene in the same environment, you already have the entire set pre-built. You can then place it on a green-screen stage or in a volume. The scans are often high enough resolution – and even if not, the background walls in a volume are usually slightly out of focus anyway. It’s more than good enough for pickups.” He thinks it’s a tool the whole production can use. “I would bring in the production designer, the costume designer, all the heads of department and especially the grips, key grip and gaffer. You can point out the size of a wall to the production designer or set decorator, so they understand how to dress the space. You could even display the output inside a VR application and have someone explore it wearing goggles – though it can feel a little
strange. For example, if you’re in Grand Central Station or a massive concert hall, you would need enough physical space around you to walk the entire length of it in virtual reality because it’s so accurate – one metre in VR equals one metre in the real world.”

THE PLOT THICKENS: LiDAR scans processed into 3D splats can be used in Blender for scene layout
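The two-centimetre accuracy Schaefer cites is useful precisely because the scan lives in a metric coordinate frame: checking whether a dolly clears a doorway is just a distance between two picked points, minus the measurement tolerance. A minimal illustration – the points, widths and tolerance below are hypothetical, not LCC data:

```python
import math

def distance_m(p, q):
    """Euclidean distance between two 3D points, in metres."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical points picked on either jamb of a scanned doorway (metres).
left_jamb = (2.10, 0.00, 5.40)
right_jamb = (3.02, 0.00, 5.40)

door_width = distance_m(left_jamb, right_jamb)  # ~0.92 m
TOLERANCE_M = 0.02                              # the quoted ±2 cm accuracy

def fits(clearance_needed_m, opening_m, tolerance_m=TOLERANCE_M):
    """Conservative check: assume the opening is at the narrow end
    of the measurement tolerance."""
    return clearance_needed_m <= opening_m - tolerance_m

dolly_width = 0.80
print(fits(dolly_width, door_width))  # True: 0.80 m clears ~0.90 m
```

The same point-picking gives ceiling heights, wall lengths and furniture clearances from the production office, before anyone travels.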
Schaefer suggests the K1/Jetset workflow could be integrated with an Artemis Prime viewfinder, allowing him to use a virtual camera with different lenses to previs, block and test camera positions and lens choices for an entire show using the scans. Among other benefits, this could help avoid costly on-location errors with a full crew. “This will transform how live-action productions can be conceived and planned,” elaborates Mack. “We can use VP from the earliest stages, with the entire team referencing the same 1:1 accurate scan of locations. In addition, we’ll be building software to best enable this process.”
“THIS WILL transform how live productions ARE CONCEIVED”
TAKE TWO: JURASSIC PARK
DINO DRAMA: A combination of impressive animatronics and CGI brought Mesozoic-era creatures like the Triceratops to life
Known for churning out hit after hit, Steven Spielberg has become synonymous with the summer blockbuster. We take a look back at his 1993 dinosaur epic, which set the bar for visual effects
Best known for box office knockouts like Jaws, ET and Raiders of the Lost Ark, Steven Spielberg is, above all, committed to filmmaking as a craft. On Jaws, his crew created multiple mechanical sharks; it was also the first major motion picture to be shot out on the ocean. The beloved, titular ET was a combination of live actors, animatronics and prosthetics. Raiders, meanwhile, once again proved Spielberg’s preference for filming on location, taking his company from France to Tunisia and, finally, Hawaii. About ten years later, he returned to the islands to adapt Michael Crichton’s 1990 novel about dinosaur clones gone rogue.

Today, Jurassic Park is one of the most successful film franchises of all time, with Spielberg’s classic spawning a series of sequels – the latest being Jurassic World Rebirth. Blending Kauai’s topography with pioneering VFX, the iconic action-adventure film cemented Hawaii as a top filming spot and CGI as a viable storytelling tool.

Centring around a small team of scientists, Jurassic Park puts the ethics of genetic cloning under a microscope, while also being an action movie about thwarting should-be-extinct predators. To build anatomically accurate versions of these ancient creatures, palaeontologist Jack Horner advised, while Dennis Muren and Phil Tippett from Industrial Light & Magic (ILM) helped execute. While Spielberg was initially hesitant about relying on CGI, the final film includes 15 minutes of dino screen time: six minutes of CGI and nine minutes of animatronics.

Thanks to the team at ILM, audiences watch as a convincingly realistic, life-sized T. rex escapes its enclosure, flips a car with two kids inside it, chases down the scientists (this sequence took two months to perfect in post) and ultimately kills several park employees. However, Spielberg wanted to avoid the dinosaurs-as-monsters trope, choosing to include scenes with friendly dinos too – such as the Brachiosaurus herd peacefully grazing the fields or the sick Triceratops that the visitors encounter. For the latter, the crew brought an animatronic to Kauai – the only one filmed on location, while the rest lived on sound stages at Warner Bros and Universal Studios.

Near the final days of filming on Kauai, Hurricane Iniki caused the crew to scramble, breaking down and packing up the sets and taking cover on the nearby island of Oahu. The Hawaiian weather proved an untamed beast, but the state’s scenery is unrivalled – as the film shows. Thanks also to financial incentives for filmmakers, Hawaii has since played host to a range of high-profile productions, including several Jurassic Park sequels, Lost, The White Lotus, The Hunger Games: Catching Fire, Hawaii Five-0 and plenty more. Jurassic Park lives on through this continuing local collaboration, but its legacy also survives at ILM, where its influence on CGI is indisputable.

WORDS KATIE KASPERSON
IMAGES UNIVERSAL PICTURES/AMBLIN ENTERTAINMENT