Imagine discovering that your face has been digitally edited onto a porn performer's body in a video shared on the Internet, all without your consent, and that the result is so convincing you worry others might not spot the deception.
This actually happened to one woman, and it can happen to anyone.
Scrolling through her Twitter feed one evening, the woman stumbled across a disturbing video notification.
Thirty-year-old Kate Isaacs, who founded the #NotYourPorn campaign in 2019 and whose campaigning against non-consensual porn contributed to the adult entertainment website Pornhub taking down all videos uploaded by unverified users, said she went into panic mode when she saw it.
“This panic just washed over me,” Kate told the BBC. “Someone had taken my face, put it on to a porn video, and made it look like it was me.”
According to the news outlet, Kate had been deepfaked: someone had used artificial intelligence to digitally superimpose her face onto someone else’s body – in this case, a porn actress’s.
“My heart sank. I couldn’t think clearly,” she said. “I remember just feeling like this video was going to go everywhere – it was horrendous.”
The deepfake video of Kate was made using footage from TV interviews she had given while campaigning, and it appeared to show her having sex.
Underneath the video, people were leaving abusive comments, saying they were going to follow Kate home, rape her, film the attack and then publish the footage on the Internet.
“You start thinking about your family,” she said, holding back tears. “How would they feel if they saw this content?”
She added that the threat intensified when both her home and work addresses were published below the video, a practice known as doxing.
“I became completely paranoid – ‘Who knows my address? Is it someone I know that’s done this?’
“I was thinking, ‘I’m really in trouble here, this isn’t just some people on the internet mouthing off, there’s actually a real danger.'”
From her experience supporting others in similar situations, Kate knew exactly what a victim should do – but in that moment she froze and couldn’t bring herself to act on it.
“I didn’t follow any of my own advice,” she said. “Kate the campaigner was very strong and didn’t show any vulnerability – and then there was me, Kate, who was really scared.”
A colleague of hers reported the video, the vicious comments and the doxing to Twitter, and the platform took them all down.
The problem, though, is that once a deepfake has been published and shared online, it is difficult to remove it from circulation completely.
“I just wanted that video off the Internet but there was nothing I could do about it,” Kate said.
Although she had no idea who was behind the deepfake of her, Kate believed the person must have been annoyed by her activism. She had “taken away their porn.”
Kate’s face had been overlaid onto footage of a porn actor, and the video was convincing enough that she worried others might not detect the deception.
“It was a violation – my identity was used in a way I didn’t consent to.”
In the past, high-profile celebrities and politicians were the most common targets of deepfakes, according to the BBC.
The videos were not always pornographic – some were made for comedic value – but that has changed over the years.
According to the cybersecurity company Deeptrace, 96 per cent of all deepfakes are non-consensual porn.
And, like revenge porn, deepfake pornography is a form of what’s known as image-based sexual abuse – an umbrella term covering the taking, making and/or sharing of intimate images without consent.
There’s a marketplace for deepfakes in online forums.
People post requests for videos to be made of their wives, neighbours and co-workers and – as unfathomable as it might sound – even their mothers, daughters and cousins.