
UNAUTHORISED DEEPFAKES HIGHLIGHT THE NEED FOR ROBUST LEGAL FRAMEWORKS AND ETHICAL GUIDELINES TO GOVERN SUCH TECH

privacy, with officials at the White House describing it as ‘alarming.’

“The recent incident involving Taylor Swift is a stark reminder of the dark side of deepfake technology,” continues Marr. “Unauthorised deepfakes – especially those that are malicious and infringe on personal rights – highlight the need for robust legal frameworks and ethical guidelines to govern the use of such technologies.

“This incident underscores the potential for deepfakes to cause harm and invade privacy, reinforcing the necessity for both societal and technological responses to protect individuals.”

What happened to Swift also served as a sharp reminder to those previously victimised by non-consensual deepfake porn, who don’t have the capacity to leverage the protection of an army of devoted fans. In an interview with The Guardian, Noelle Martin detailed how her image-based abuse continues to impact her life and mental health to this day, despite the original image being generated 11 years ago.

“Everyday women like me will not have millions of people to protect us and take down the content; we don’t have the benefit of big tech companies, where this is facilitated, responding to the abuse,” she told the paper. “Takedown and removal is a futile process. It’s an uphill battle, and you can never guarantee its complete removal once something’s out there.”

Martin goes on to emphasise how image-based abuse affects everything from employability and future earning capacity to relationships – and it’s something she is even required to mention in job interviews, in case a potential employer discovers the image through background checks.

Many have deemed the recent Taylor Swift images an example of a new way for men to control women. But, as Noelle Martin’s case soberly highlights, there’s not much that’s new about this edition of misogyny.

FIGHTING AI WITH AI

Seeing as responses to the problem often overlook everyday deepfake crime, many victims have turned to one another for ways of tackling the ever-evolving issue. #MyImageMyChoice is a cultural movement against intimate image abuse, from the creators of the documentary Another Body. The documentary follows US college student Taylor in her search for answers and justice after she discovers deepfake pornography of herself circulating online. She dives headfirst into the underground world of deepfakes and discovers a growing culture of

DARK DOPPELGANGER: Another Body thrusts viewers into the malicious use of deepfake technology, showing it’s not just high-profile celebrities who fall victim to these bad actors

