Definition March 2025 - Web

DIGITAL DE-AGING

FLASHBACK: Bringing back a young Harrison Ford as Indiana Jones, rather than casting a younger actor, was crucial to preserving the story’s charm

Released late last year, Here proved a novel challenge; Zemeckis wanted to capture Hanks, Wright and their co-stars as if they were aging organically. Metaphysic handled the film’s VFX, using generative AI to digitally age and de-age the actors in real time (with a negligible two-frame delay), then tidying up the effects in post. The actors could even watch this live feed while filming to get a more accurate sense of what audiences would see, allowing them to refine their performances accordingly. Going from de-aging primarily in post to doing so in real time is a natural step towards more efficient filmmaking. As with virtual production, VFX is becoming involved at earlier and earlier stages, with far more being captured in camera – a useful tool for DOPs and directors, who appreciate the immediate feedback. While ILM is internationally renowned for its artistry and attention to detail, Metaphysic is pushing the boundaries of what’s possible with GenAI. The company ethically trained its model by feeding it reference footage from when Hanks and Wright were younger (after receiving their consent), to mimic not just their looks but also their unique body language, expressions and behaviours.

Addressing the critique that AI equals job displacement, Metaphysic’s chief content officer and president of production Ed Ulbrich argues that this criticism is often misguided. He points to the example of CGI, suggesting that “it’s just learning new tools. As an industry, this created many new jobs that never existed previously,” and he believes the same is true of AI. At Metaphysic, digital de-aging and digital aging often go hand in hand, and this requires an extra human touch. Makeup artists, for instance, aren’t being forced out: “We love working with great makeup artists,” Ulbrich says. “Instead of putting a whole new face on an actor, we can work with makeup teams to do a prosthetic makeup treatment. You make them older, or into an alien or creature, and we do a data acquisition of them while wearing that makeup. Then they don’t need to wear it during production,” Ulbrich explains. “It’s the same as swapping their younger face onto them.”

When mapping a digitally enhanced face onto a real actor, films run the risk of entering the uncanny valley – where a person appears almost human, but not quite. The eyes have commonly been deemed the giveaway, particularly in the early days of computerised VFX. As technology evolves and digital de-aging becomes more convincing, this phenomenon diminishes; with an AI-generated, photoreal face, Ulbrich argues, it never existed to begin with: “We’ve eliminated the uncanny valley.” While GenAI might seem like the next big thing, there are disadvantages too. “The main drawback of using this tech for de-aging is that you need a dataset of existing images,” Pollock explains. “If you don’t have access to that material, or it doesn’t exist, the AI-based approach might be impossible,” whereas with CGI, composites are created manually. In the CGI versus AI debate, there’s no right or wrong; it mainly comes down to a filmmaker’s preference and budget. We’ve seen great examples of digital de-aging achieved with vastly different techniques, and as technology evolves across the board, we’ll likely see more of it.
