Definition, May/June 2025

AI & THE CRAFT
POLICIES & PROTECTIONS

During the 2023 Hollywood strikes, AI was a major point of contention. Creators were fighting against copyright infringement – in other words, generative AI models being trained on their work without permission or payment – as well as replacement, with many filmmakers and actors fearing AI would render them redundant. They demanded protections be put in place to ensure that AI would be used ethically, when used at all – but the industry has been slow to move.

“There are so many facets of AI,” begins Claire Leibowicz, head of AI and media integrity at Partnership on AI (PAI), a global nonprofit ‘devoted to creating resources and recommendations for responsible AI’. Founded by a group of tech research heads, the organisation “recognises that the challenges and opportunities in AI transcend any one industry,” says Leibowicz, who’s been with the partnership since 2018. “We’ve been thinking about, ‘How do you collectively create technically and socially sound guidance for an emergent technology with broad implications for storytelling and entertainment?’”

TIP OF THE TONGUE: There has been an increase in AI use for filmmaking recently, including the AI refinement of Adrien Brody’s Hungarian accent in The Brutalist (left)

There are risks to AI as well as rewards, but its mere existence raises a bigger question: “What does it mean to tell a story honestly in the AI age?” asks Leibowicz. She cites a BBC documentary on Alcoholics Anonymous that used AI to obscure faces. She also mentions the ‘recent kerfuffle’ around The Brutalist, wherein AI was used to elevate Adrien Brody’s Oscar-winning performance by refining his Hungarian accent – a notoriously difficult one to get right. “Is that the same as hair and makeup,” she asks, “which already enhances performances? Or is it crossing a line?”

Where that line sits is still up for debate. Unfortunately, it’s not as simple as relying on a collective moral code; the industry needs explicit guidelines, with explicit consequences for behaving unethically. Until such a rulebook exists, creators might need to be their own advocates.

TAKING CONTROL

One way for filmmakers, and especially actors, to certify their work is via content authentication. “How might we be able to prove someone’s face is truly theirs?” asks Leibowicz. “There has been momentum on tagging content to be associated with people. There’s also something called the NO FAKES Act, which hasn’t passed yet, but is the idea of having greater protection over one’s likeness. It’s giving people more control, and grounds for pushing back if people violate their likeness in a way that is unsavoury.”

Companies like Human & Digital (HAND) offer authentication through the Digital Object Identifier system, which verifies when a person’s likeness is used legitimately and creates a unique talent ID. HAND not only identifies real people; it also does so with virtual counterparts (ie digital replicas) and any related fictional characters, encouraging proper compensation across intellectual properties.

While the US government has not made much headway on AI regulation (it’s currently focused on other agendas), just last year California passed the AI Transparency Act, which requires GenAI providers to include an AI detection tool as well as an AI-generated content disclosure. The law doesn’t come into effect until 1 January 2026, but in other areas – like the EU and China – “there has been a push to label AI-generated content too,” according to Leibowicz. With global discrepancies in AI policy, filmmakers should stay on top of the latest legislation. “The norms will be country-dependent,” says Leibowicz, “and it just adds a layer of complexity.”

This also extends to AI models; some are trained ethically, others aren’t. “Support companies that are mindful of these data rights issues,” she suggests. “Put your money where your mouth is.”

The final piece of advice Leibowicz offers is to ‘get involved in advocacy’. “Speaking up matters. There’s something called the Concept Art Association,” she states, an organisation for artists working in entertainment. “There’s a strong push from them to be vocal about not having AI trained on their work.

“Collective action can make a difference in this moment – bringing a voice to these things is most important,” Leibowicz continues. “Galvanise energy within your community.”

Learn more at partnershiponai.org



