In the ad, a woman in a white lace dress makes lewd faces at the camera and then kneels. There’s something sinister about her: a trembling at the side of her temple, a peculiar stillness of her lips. But if you’ve seen the video in the wild, you might not know it’s a deepfake. It would just look like a video, like the opening shots of some cheesy low-budget internet porn.
In the top right corner, as the video loops, is a still of the actress Emma Watson, taken when she was a teenager, from a promotional shoot for the Harry Potter films. It’s her face pasted onto the porn actress’s, making a woman who has never appeared in pornography seem to star in it.
The ads, which directed users to an app that creates deepfake videos, were spotted by the reporter Kat Tenbarge in more than 230 iterations across Facebook, Instagram and Meta’s Messenger, according to an investigation by NBC News. Most of the advertisements featured Watson’s picture; others used Scarlett Johansson’s face. The same ads appeared in photo editing and gaming apps available in the Apple App Store. Lest the message be lost on viewers, the ads made clear that they were intended to help users create non-consensual porn of any woman. “Swap EVERY FACE in the video!” the ads read. “Replace the face with anyone. Have fun with AI Face Swap technology.”
Similar ads for deepfake services appear alongside videos on Pornhub. Although deepfake technology can theoretically be used for any type of content – from lighthearted satire to malicious political disinformation campaigns – it is predominantly used to create non-consensual porn. According to a 2019 report, 96% of deepfake material on the internet is pornographic.
This number could well increase. The ads on Meta and Apple platforms appeared as consumer demand for deepfake pornography explodes. The surge follows a controversy that rocked online video game communities in January, when a popular streamer, Brandon Ewing – who goes by “Atrioc” – displayed deepfake pornography of several popular female streamers during one of his online broadcasts. He later admitted to paying for the fake porn of the women, who were his colleagues and friends, after seeing an ad similar to those that appeared on the Meta and Apple platforms. The women whose images were appropriated for the pornography Ewing viewed reacted with anger and hurt; Ewing himself apologized. But the controversy seems only to have made the streamer’s overwhelmingly young and male following more aware of the availability of deepfake content – and ready to seek it out themselves.
Genevieve Oh, a researcher who studies live streaming, told NBC that web traffic to the top deepfake porn sites exploded after Ewing’s apology. This rapid increase in recent weeks follows a slower but still alarming growth of the deepfake revenge porn sector over recent years. In 2018, fewer than 2,000 videos had been uploaded to the most prominent deepfake streaming site; by 2022, that number had grown to 13,000, with 16 million monthly views. And as deepfake revenge porn grows in popularity, the barrier to entry remains low: the app that used Watson’s face in its ads costs just $8 a week.
The rapid increase in the number and availability of non-consensual deepfake porn videos raises alarming questions about privacy and consent in the digital future. How will the large number of women – and the smaller but significant number of men – targeted by this new AI-enabled revenge porn cope with the effects on their reputations and lives? How will viewers tell the difference between fact and AI-generated fiction as the technology improves? How can non-consensual material be removed when the internet moves so much faster than regulation?
But the example of these apps – and of the men who use them, like Ewing and his fans – also sheds light on something older and more uncomfortable about the nature of porn: that men often use it as an expression of their disdain for women, and experience the sexual portrayal of women as something that humiliates and hurts them. Indeed, this is a large part of the appeal of mainstream porn, at least according to many of the men who consume it: that it allows men to imagine themselves in control of women, inflicting pain and humiliation on them. Deepfake revenge porn, then, only fulfills through technology what mainstream porn has offered men in the imagination: the assurance that any woman can be diminished, degraded and humiliated by sexual violence. The non-consent is the point; the humiliation is the point; the cruelty is the point.
There’s no other way to really understand the allure of deepfake pornography: it’s not as if the internet lacks sexual content portraying real, consenting adults. What these apps concretely and explicitly offer their users is the opportunity to hurt women by forcing them into pornography against their will. After Ewing showed the deepfake pornography to his streaming audience in January, one of the women pictured released her own tearful video, describing the malice of the deepfake and the pain it had caused her. In response, a man sent her a photo of her own crying face displayed on his tablet. The screen was covered in semen.
Right now, the women and others targeted by deepfake revenge porn have few legal options. Most states have laws penalizing revenge porn, but only four – California, New York, Georgia and Virginia – ban non-consensual deepfakes. The companies hosting the apps are often based overseas, largely out of reach of law enforcement; the company whose app was promoted on Meta appears to be owned by a parent company based in China. Meanwhile, more and more men will start using the technology against more and more women. “I was on fucking Pornhub… and there was an ad [for the deepfake site],” Ewing said in his apology video, explaining how he discovered the AI revenge porn site. “There’s an ad for that on every damn video, so I know other people have to click it.”