The Horrifying Reality of Deepfake Pornography
This is what it looks like to see yourself naked against your will, being spread all over the Internet.
I just googled my name and the name of the website, and there I was.
To the people who say, "Oh, it's just a bunch of pixels. It’s not you. What is your problem?" I say, go f*** yourselves.
All right. I think we can start.
On January 30th, 2023, Twitch streamer Atrioc gave a tearful apology for watching female streamers in sexually explicit videos. The thing is, the female streamers weren’t physically in those videos. Their likenesses were.
It seems a little ridiculous now.
What he was watching is known as deepfake pornography, where someone's face is put on the body of an adult film star using AI. It is hyper-realistic and usually done without the consent of the person whose face is being swapped in.
"I don’t… I just clicked the fing link at 2:00 AM, and I… and the bulls didn’t catch up to me. And… and I’m sorry."
These kinds of videos are only getting more popular. The number of uploads to one of the most prominent deepfake pornography websites has been steadily increasing since 2018, with last year seeing the highest number yet—over 13,000 uploads.
"You simply replace a person from eyebrow to chin, cheek to cheek, with another face."
Hany Farid is a professor at UC Berkeley who specializes in digital forensics.
"Manipulating and generating audio, images, and videos is not new. Hollywood studios have been doing it for decades. What is new here is that machine learning or AI algorithms are learning to do the hard work of what used to be in the hands of special effects masters."
"There will always be those who are unable to accept what can be."
Now, anyone with access to the Internet can make a realistic deepfake video.
"To synthesize a voice, you need 30 seconds. To synthesize a video, you need a handful of images—not thousands or tens of thousands of images."
"Not Morgan Freeman."
"And what you see is not real."
And suddenly, what used to be a threat vector for people who had a huge digital footprint has now become a threat vector for people who have a relatively small digital footprint.
That means that while celebrities are still the main target, everyone is at risk. A creator on one online chat platform offered to make a five-minute deepfake of a "personal girl," meaning anyone with fewer than two million Instagram followers, for $65.
The Victims and Their Struggles
"Hi, how’s it going?"
"There are other people who, literally, in the comments section of my videos and in my chat, were saying, 'Ha, I bet you just got caught making porn, and now you’re trying to backtrack, claiming it’s a deepfake.' Where’s chat? There you are."
Sweet Anita, a Twitch streamer with about two million followers, found out she’d been deepfaked after the Atrioc scandal.
"I’ve definitely tried to get these videos taken down, and I have looked into legal action. I’m not some wealthy millionaire who can just fritter away thousands every single month. And it is thousands a month that you’d have to pay to have someone look through the Internet endlessly and take down anything like this.
I’ll never know who’s consumed this content and who hasn’t. I could go to an event. I could do a meet and greet. I could literally hug a person who has consumed this content, and that’s really, really hard. And there’s no way to process that. No comfortable answer to it.
It’s just something that will have to linger over me now forever."
So what’s being done about this?
Well, Australia banned non-consensual deepfake porn in 2021. And in late 2022, the UK said they’d do the same.
Here in the US, while most states have laws that ban non-consensual pornography, only four have laws that specifically address deepfakes.
"So it’s good to see you."
Congresswoman Yvette Clarke is the author of the DEEP FAKES Accountability Act. She has introduced her bill to the House twice, once in 2019 and again in 2021, but it hasn’t gone anywhere.
"We couldn’t have imagined when we initially drafted this legislation that AI would take off the way it has."
The bill, in its most recent form, would require anyone who makes a deepfake to put a digital watermark on it so people know the content they’re looking at isn’t 100% real. It would make it a crime to conceal or crop out that watermark, punishable by a fine, up to five years in prison, or both.
She’s planning on reintroducing a new version later this year and is hopeful it will progress.
The Fight Against Deepfake Exploitation
The key to stopping this lies in going after the places where the content is being shared and in changing how technology like this is developed in the first place.
"I’m not saying stop innovating, but I’m saying safety first—by design, as it’s called. Because if you develop and deploy technology and put it out there, then try to backfill the guardrails, it never works."
It’s a concern shared across the industry. More than 1,000 tech leaders and researchers signed an open letter asking AI labs to pause training systems more powerful than GPT-4 for at least six months, citing "profound risks to society."
"The one thing we’ve learned over the last 20 years is that there is no longer an 'online world' and an 'offline world.'
What happens on the Internet doesn’t stay on the Internet."
"What’s the problem with it? I’d say consent.
Imagine that you were deepfaked into a video of you stomping a kitten to death. Then, loads of people saw that, and your boss fired you.
And your friends wanted nothing to do with you.
And your family stopped messaging you.
You’re completely alone, and there’s no way to explain yourself to the public because they’re not giving you a chance.
And that is what it’s like to immediately be put in a video doing things that are stigmatized—that people don’t accept.
My safety, my day-to-day life, my whole career have all been completely turned upside down.
And probably…
I think a superficial challenge will be getting people to understand.
And the deeper challenge will be getting people who do understand what’s wrong with it… to change."
Because sometimes, knowing it’s wrong is not enough.
"I’ll tell you what—I’ll create a video of your sister, or your mother, or your aunt, or one of your loved ones.
And you tell me how 'innocuous' this is."
Source: Stay Tuned. (2023, April 24). People don’t believe I was deepfaked [Video]. YouTube. https://www.youtube.com/watch?v=qXBOMuk7jkI