
Inside the disturbing rise of ‘deepfake’ porn

With just a few clicks, pornographic videos can now be made starring people who have never consented – so why isn’t anyone taking it seriously?

Noelle Martin was 17 when she discovered that her face had been edited onto naked photos of someone else. The Australia-based activist, now 26, found the photos by chance after doing a reverse Google image search on an innocuous selfie. Within seconds, her screen had been flooded by deepfake pornographic imagery – featuring her face – created by an unknown group of “nameless, faceless” sexual predators. “[Someone] had doctored or photoshopped my face onto the bodies of naked adult actresses engaged in sexual intercourse,” Martin recalled in a 2020 TED talk.

Revenge porn (the nonconsensual sharing of sexual images and videos) is a growing concern, especially among young women. But Martin’s experience shows that victims don’t even need to have produced sexual content in the first place for people to share it. Now, developments in AI technology have given rise to a disturbing new strain: nonconsensual deepfakes.

Deepfake porn involves superimposing a person’s face onto sexual images or videos to create realistic content that they have never participated in. Most of the apps and websites providing these pornographic deepfake services last only a few months before they are taken down (usually after mass reporting by activists). Like the heads of a hydra, however, they multiply and pop back up. These sites are often spread anonymously on forums like Reddit, with many masquerading as typical face-swap services for porn gifs, videos and images.

But in recent months, these sites have become more brazen. One of the most prevalent – which we will not be naming – now advertises its services freely on adult content websites, and even provides the pornographic images and videos that people’s faces can be edited onto. All users need to do is select a photo of the person they would like to see spliced into sexual scenes, and upload it. With just a few clicks, porn videos can be made starring people who have never consented to this content being produced. Predictably, this is a gendered issue: a 2019 study found that 90 to 95 per cent of deepfakes are nonconsensual, and about 90 per cent of those are of women.

One of the most high-profile cases of deepfake porn abuse is that of Indian investigative journalist Rana Ayyub. After reporting on the rape of an eight-year-old Kashmiri girl in 2018, Ayyub drew criticism for arguing that India protects child sex abusers. In an article for Huffington Post, she outlined how trolls first spread fake tweets about her “hating India”, before creating deepfake porn with her face on another person’s body. It was shared by a leader of the nationalist political party the BJP, and the harassment she received as a result of the video became so bad that the United Nations had to intervene. She concluded that deepfake technology is “a very, very dangerous tool and I don’t know where we’re heading with it.”

“Predictably, this is a gendered issue: a study carried out in 2019 reveals that 90 to 95 per cent of deepfakes are nonconsensual, and about 90 per cent of those are of women”

The potential for using deepfake technology to manipulate political figures and election campaigns has been well covered – but the damage it poses to women is barely discussed in the media, despite being a growing problem. In 2020, the legal charity Revenge Porn Helpline published a report called ‘Intimate image abuse: an evolving landscape’, which addressed the rise of deepfake technology and its “worrying potential” for image-based abuse. Senior helpline practitioner Kate Worthington tells Dazed that the charity has seen a rise in cases since the report was published, but that it is unfortunately limited in the help it can offer. This is mostly because the laws of England and Wales do not recognise deepfake revenge porn as an offence.

The same can be said for Ayyub’s native India, where little has been done to regulate deepfake technology since her case, despite the intervention of the UN. There are small pockets of hope, however: Scotland’s revenge porn legislation does cover deepfakes, and earlier this year lawmakers in the US state of Florida approved a bill that seeks to ban deepfake pornography and revenge porn.

Noelle Martin has also been campaigning to criminalise image-based abuse in New South Wales, Western Australia and across the Commonwealth. She tells Dazed that the technology for deepfake porn is becoming increasingly advanced, and that those who make it are gaining a “stronger resolve”. She isn’t surprised that the newest website to offer this service is taking out paid advertisements, but she does find it “despicable”.

“Does this kind of abuse happen because digital spaces are a reflection of reality, or because they are separate from it?”

Martin has also been vocal about a key issue that the fight against image-based abuse and deepfake technology must reckon with: the metaverse. She notes that virtual reality worlds, where images of people can be captured and digitised to create increasingly realistic avatars, are the perfect platform for nonconsensual digital sexual abuse. This is already happening: researcher Nina Jane Patel recently wrote a Medium post describing how her avatar was gang-raped by multiple men, and photos were taken of the event, within 60 seconds of her joining Meta’s (formerly Facebook’s) virtual reality platform.

Does this kind of abuse happen because digital spaces are a reflection of reality, or because they are separate from it? That this is so clearly a gendered issue is an extension of real-life misogyny, and its sexual nature reflects rape culture on a global scale. But in her post, Patel suggests that the fictional element of digital spaces is part of their appeal, noting that those who attacked her avatar were engaging in a fantasy that they may not have carried out in real life. While this may (or may not) be true of VR, deepfake porn is intended to mimic reality, serving both sexual gratification and revenge, and its consequences are felt in real life. As the metaverse expands and becomes more realistic to its users, there is nothing to say that the intentions behind deepfake porn won’t be carried over to this space.

One of the most startling aspects of this growing threat to people’s safety is how little victims of nonconsensual image-based abuse are protected by the law. That a handful of places have begun implementing legislation is promising, but – as the stories of Martin and Ayyub prove – much more thorough and targeted laws are needed to stop these elusive, twisted forms of nonconsensual sexual abuse.
