Fighting Deepfakes: How the Take It Down Act Protects You

AI deepfakes are on the rise, creating non-consensual content from simple photos. Learn how the Take It Down Act provides new legal tools to fight back. Listen to the full episode to learn more.

TL;DR

The Take It Down Act is poised to become law, forcing social media platforms to remove non-consensual deepfakes and revenge porn within 48 hours. #DigitalSafety #AI #VentureStep

INTRODUCTION

The rapid advancement of AI has unlocked incredible potential, but it has also created powerful new tools for harassment and exploitation. Malicious actors are now using AI to create "deepfakes," hyper-realistic but entirely fake images and videos, and deploying them to produce non-consensual pornography, build fraudulent sales funnels, and launch targeted attacks on individuals. This grotesque misuse of technology has left victims with little recourse as existing laws and platform policies have struggled to keep pace.

In this episode of Venture Step, host Dalton Anderson dives into the legislative response to this growing crisis: the Take It Down Act. After seeing disturbing tutorials on creating malicious AI models, Dalton explains why he previously hesitated to cover the topic but now feels it's essential to discuss the new protections this bill offers. This act, which has now passed both the House and Senate with overwhelming support, aims to provide a robust legal framework to combat digital abuse.

Dalton breaks down how the Take It Down Act expands upon existing rules for Non-Consensual Intimate Imagery (NCII), defines what constitutes a deepfake, and holds social media companies accountable. He explores the real-world cases of high school students that galvanized the movement for this legislation and discusses the practical challenges and future-proof solutions needed to protect people in an age of synthetic media.

KEY TAKEAWAYS

  • The Take It Down Act mandates that social media platforms remove reported non-consensual intimate imagery, including deepfakes, within 48 hours.
  • The legislation expands legal protections beyond traditional "revenge porn" to explicitly include AI-generated synthetic content.
  • A "reasonable person" standard is used to identify deepfakes, covering composite images like face swaps where a person is recognizably depicted.
  • The Federal Trade Commission (FTC) is empowered to investigate platforms like Meta and Snapchat for non-compliance with takedown requests.
  • A potential long-term solution involves mandating "synthetic keys" or digital watermarks in all AI-generated content to make it easily identifiable.

FULL CONVERSATION

The Disturbing Rise of AI-Generated Content for Profit

Dalton: Today we're going to be discussing a topic that I've been wanting to touch on for a while, but the issue was there wasn't anything out there in progress to prevent these kinds of grotesque acts. Now that the Take It Down Act has passed the Senate and the House, we're on track. I was sent two YouTube videos about how people are making, and telling others to make, these AI models. When I say AI models, I don't mean an AI model like ChatGPT; I'm talking about a deepfake version of somebody, or a completely synthetic virtual human, created to build OnlyFans sales funnels.

Dalton: And the breakdown of what they described was very bad. I didn't want to talk about it because there was nothing preventing people from doing these things; there wasn't an act in place to explicitly target deepfakes. But now there is. Basically, what they described was finding cute women on TikTok, taking their face, and stamping it onto an Instagram-model-type body. Then you create a profile, post dances of this body with this person's face, and build an OnlyFans sales funnel to get people to subscribe and make money. It's just such a gross thing to do, not cool, and just disgusting human behavior.

Why We Couldn't Ignore the Deepfake Problem Any Longer

Dalton: I never talked about the methodology of what this person described. The guy breaks down which apps to use, which websites to go to, how to identify a good model versus a bad model, all sorts of stuff. He really breaks it down.

With innovative technology, you'll always attract people who are either weirdos or who want to exploit others, using the technology for notoriety, for monetary gain, or to attack others.

Dalton: And that's evident with some of the attacks on Taylor Swift. Social media platforms had to block searches for her name for some time while they got the deepfake videos under control. Basically, people made deepfakes of Taylor Swift performing sexual acts, and it was just out of control. The only way social media companies were able to combat it was to stop letting people type in her name while they took down all the content.

How AI Voice Cloning Highlights the Deception Risk

Dalton: I just wanted to give you some background on why I couldn't talk about this topic previously. I'm overall very, very bullish on this act. I think it future-proofs quite a few things and covers revenge porn, deepfake porn, and anything of the sort. It puts us in a pretty good spot from a legal standpoint to scale with technology that is rapidly evolving. For example, take the altered deepfake voice that I created. I showed it to folks without telling them it was AI, and people couldn't tell. They couldn't tell it was AI until I told them.

Dalton: What if somebody took all my voice data and started saying all sorts of bad things on the internet? And then I lose my job, or I lose my children. Society judges folks on their actions and the things they've done. What if that stuff is fake and not me, but you can't tell that it's not me? It's really difficult to know if it's me or not. And that's the gist of what the Take It Down Act is about.

Defining Non-Consensual Intimate Imagery (NCII)

Dalton: The Take It Down Act is about revenge porn and sexual acts related to deepfakes. It builds on the NCII rule, which defines what non-consensual intimate imagery is. If you're in an intimate relationship with somebody and you decide to film or take photos of each other, you've consented to that.

But just because you gave consent for private photos or videos to be taken, recorded, and stored, that doesn't mean you consented to the publication of those videos.

Dalton: People in the past have been victims of the publication of private sexual acts, and most of the time the victims are women. Then there's this other emerging risk with the advancement of AI: deepfakes, where people can go on websites that allow you to "undress" somebody. You can take a photo that was not intimate, undress that person, and turn it into an intimate image, a nude, or a sexual act that the person is not aware of and did not consent to. The reference photos could be you at dinner, or just a photo of yourself walking around the mall.

Dalton: The NCII rule did not include deepfakes. So now there's the Take It Down Act, following these high-profile events, especially the cases of girls in high school where social media companies took a long time to take down digitally altered images. In one victim's case, it took Snapchat nine months to take down the sexual image.

The Alarming Statistics Behind Deepfake Popularity

Dalton: You may be thinking, "I've never heard about a deepfake. Who even does that kind of stuff?" Fair enough; you are not around weirdos. But these deepfake websites do get a lot of visits. The top 16 sites got 200 million visitors in six months. These numbers came out of an investigation for an ongoing legal case. So it's a little more mainstream than you would think.

There are more than 21,000 deepfake videos uploaded to pornographic sites, a 460% increase year over year.

Dalton: Year over year, a 460% increase in deepfake videos uploaded online.

What Exactly Is a Deepfake?

Dalton: I don't think I defined what a deepfake is. I would say any digitally altered asset could be considered a deepfake. When I made my AI voice, that would be considered a deepfake, because I did not say those things, nor was that me, but it sounded like me and acted like me. People had a hard time telling the difference because it was a deepfake. But I made that deepfake, so I was aware of it and I consented to it. If someone, without my knowledge, made a deepfake of me saying inappropriate things on the internet, that would also be a deepfake.

Dalton: The issue is that this is a new technology, and there aren't a lot of synthetic keys built into the audio, images, or videos. One company that does this very well is Google, which bakes a synthetic ID into the pixel structure of the image or video that was generated. So if you create an image with Google, Google knows when it scans the internet that it's an AI photo. Most sites don't create these keys at the moment, so you won't be able to tell where content came from. It's kind of like an unregistered gun. A deepfake would be a digitally altered asset made without the person's consent.
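
To make the "synthetic key" idea concrete, here is a minimal Python sketch (using the Pillow library) that hides a short identifier in the least significant bits of an image's pixels. This is a toy illustration only: Google's actual pixel-level watermarking is far more robust and is not public, and the key value and function names below are hypothetical.

```python
from PIL import Image

SYNTHETIC_KEY = 0b10110010  # hypothetical 8-bit generator ID; real schemes use far more robust encodings

def embed_key(src_path: str, dst_path: str, key: int = SYNTHETIC_KEY, bits: int = 8) -> None:
    """Hide one key bit in the least significant bit of each of the first `bits` red-channel pixels."""
    img = Image.open(src_path).convert("RGB")
    px = img.load()
    for i in range(bits):
        r, g, b = px[i, 0]
        px[i, 0] = ((r & ~1) | ((key >> i) & 1), g, b)  # overwrite the red LSB with a key bit
    img.save(dst_path, "PNG")  # lossless format, so the hidden bits survive saving

def read_key(path: str, bits: int = 8) -> int:
    """Recover the key by reading back the same LSBs a scanner would check."""
    px = Image.open(path).convert("RGB").load()
    return sum((px[i, 0][0] & 1) << i for i in range(bits))
```

A crawler could then flag any image where read_key returns a registered generator ID. In practice, this naive LSB approach breaks under re-compression or resizing, which is exactly why production watermarks are embedded more deeply into the image structure.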

Understanding the Take It Down Act's Core Components

Dalton: The Take It Down Act's approach targets the distributors of these models that allow you to "undress" people and make these deepfakes. There's also an obligation for social media companies like Meta and Snapchat to remove content after a good faith statement is made by the person filing the complaint. They have to take it down immediately, or within 48 hours. 48 hours is the deadline, and if they miss it, they'll be held liable.

Dalton: A similar act was attempted before but failed to pass the House. It was called the Defiance Act. The Defiance Act was more about allowing the victim restitution for deepfakes and revenge porn at a federal level, whereas the Take It Down Act focuses more on deepfakes and the NCII definitions of consent.

The High School Victims Who Inspired a Movement

Dalton: The Take It Down Act is led by Ted Cruz, largely because one of the key victims in this whole ordeal, Elliston Berry, was based in Texas. She was a freshman in high school, and one of her classmates unfortunately decided it would be funny to create a deepfake of her using an image she had posted on Instagram. The source image was of a dress she was wearing; I think she was going to dinner with her parents. They took that image, "undressed" her, and then distributed it throughout the school. And it's especially bad when it takes nine months to get the image taken down.

Dalton: Instead of being embarrassed and upset, she took a different approach: she's attacking the problem and speaking out. She is one of the key advocates of this bill. It wasn't her parents; it was her. Then there was another person, Francesca Mani, based out of New Jersey. The same thing happened to her. She's also been a key representative and advocate. She was on 60 Minutes with her mom.

Penalties for Violating the New Digital Abuse Laws

Dalton: What happens if you violate the NCII law that now covers both revenge porn and synthetic publication of pornographic images or videos? I think these rules are a little more relaxed than I would expect them to be. Penalties include fines, restitution, and imprisonment: up to two years when the victim is an adult, and up to three years when the victim is a minor. I don't know, I just don't feel like that's enough. Especially if it's an adult doing that to a minor; that's crazy. If a minor did it to a minor, I can understand it might have just been a badly placed joke or some kind of bullying. There should be some flexibility there.

The 48-Hour Mandate: Holding Platforms Accountable

Dalton: The biggest provision in the Take It Down Act is that within 48 hours, action needs to be taken and the content needs to come down. The only thing the victim has to provide is a good faith statement, something like, "Hey, this is me. I did not consent to this. This is a deepfake."
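
The mechanics here are simple enough to sketch in code. Below is a minimal, hypothetical Python record a platform might keep for one report; the field and helper names are assumptions for illustration, not anything the Act itself specifies.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=48)  # removal deadline under the Take It Down Act

@dataclass
class TakedownRequest:
    """Hypothetical record a platform might keep for one NCII report."""
    content_url: str
    good_faith_statement: str  # e.g. "This is me. I did not consent to this. This is a deepfake."
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @property
    def deadline(self) -> datetime:
        # The clock starts when the victim's good faith statement is received.
        return self.received_at + TAKEDOWN_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        """True once the 48-hour window has lapsed without removal."""
        return (now or datetime.now(timezone.utc)) > self.deadline
```

The key design point is that the deadline runs from receipt of the statement, not from any internal review step, which is what makes the 48-hour mandate enforceable.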

Dalton: If a platform doesn't take it down in time, then there will be fines, investigations, and sanctions. The body that oversees this is the FTC. If the FTC determines that you are acting in non-compliance, they will launch an investigation. When the FTC launches an investigation, most of the time they'll take you to trial and they'll win. So you don't want that if you're a social media company, because the FTC does not play around.

Why Social Media's "Crowd Report" System Fails Victims

Dalton: I think the issue with social media at the moment, regarding these extreme things, is that they do a crowd report: if many people report an image as bad, then they'll take it down. But if it's something extreme like fake nudes and not that many people are reporting it, then it just doesn't get taken down. I've helped women I know when people created fake accounts under their names, and they'd ask, "Instagram is refusing to take it down. Can you help me report this?"

What do you need, 10 people to tell you that your house is burning down? Not enough people told me, so I didn't know. We were at seven, but we really needed 10.

Dalton: Why would somebody just randomly go up to you and say, "Hey man, your house is burning down"? In a general sense, society in itself is wholesome. There are some bad actors, but a lot of people are quite nice. People aren't out to get you.

Addressing the "Face Swap" Gray Area

Dalton: This is a weird gray area, and the Take It Down Act addresses it. If I take someone's face and stitch it onto somebody else's body, then technically the result isn't any one person. This is where the Take It Down Act's definition comes into play: would a reasonable person be able to tell that this is not a real depiction of the person?

If a reasonable person is not able to recognize that it's a deepfake, and would relate the image to the real person it depicts, then it falls under the definition.

Dalton: If you can relate those two people, then it's a deepfake. I think that kind of clause future-proofs the law against advancements as this grotesque technology evolves.

The Practical Challenge of Enforcing the New Law

Dalton: The practicality of this is going to be a huge task. There is just going to be a monster undertaking in attacking these websites that generate all of this "undress me" content and deepfake pornography. As I mentioned earlier, uploads are up 460%, with over 21,000 videos uploaded to pornographic sites. It's going to be hard to identify what is a deepfake, especially on a normal porn site.

Dalton: There might be a market for apps that scrub the internet for deepfakes of you. Maybe whole consulting firms get created just to help manage your digital presence in this deepfake world. But I think those are just additional work around the problem rather than attacks on the actual problem, like taking down these big companies and putting laws together that make the generation of non-consensual deepfakes illegal.
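
One plausible building block for such a scanning service is perceptual hashing: fingerprint the photos you actually posted, then compare crawled images against those fingerprints. The sketch below implements a simple average hash with Pillow; it only catches near-duplicates of a known source image, so treat it as an assumption-laden starting point rather than a deepfake detector.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale grid and set one bit per cell above the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; small distances suggest the same source image."""
    return bin(a ^ b).count("1")

# Usage sketch: flag a crawled image that closely matches one of your posted photos.
# known = average_hash("my_instagram_photo.jpg")
# found = average_hash("suspicious_crawled_image.jpg")
# if hamming_distance(known, found) <= 10:  # threshold is a tunable assumption
#     print("Possible reuse of your photo; investigate as a potential deepfake source.")
```

A real service would need far more than this (face matching, crawling infrastructure, takedown filing), but source-image fingerprinting is a reasonable first signal that your photos are being reused.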

A Future-Proof Solution: Mandating Synthetic Keys

Dalton: You won't be able to completely prevent people from creating these deepfakes. But what I could see happening is something like what I mentioned earlier: each gun has a serial number on it, and if you're caught with a gun with no serial number, you'll get in a lot of trouble.

If you put a serial number on these AI images and videos that you create, a digital synthetic key like the one Google embeds within the pixels, then you will be able to tell what is AI generated and what is real.

Dalton: If you made a law that forced every AI company that provides imagery and video to assign a synthetic key, it would be easier to track down which companies are not generating them. This is quite complicated, because you would need an alliance with the private companies, and probably some kind of world alliance, to get a registration of all the main AI companies.

The Overwhelming Bipartisan Support for Change

Dalton: These AI models are just going to become more sophisticated and more capable over time. Every three months, there's some kind of historical breakthrough that just blows your mind. I'm really happy with what has been passed, and I'm 100% sure that this is going to get signed. I think there were only two people who voted against it; the vote was 409 to 2. It has strong bipartisan support, of course.

Dalton: I just wanted to end on that note: it's refreshing that these things are changing. The concerns that women and girls have raised have been an issue since I was a kid. I'm happy that these things are changing, and I'm happy about the impact these two girls have had in shaping and pressuring politicians to create a bill that is passable, has bipartisan support, and got through ASAP.

RESOURCES MENTIONED

  • Meta Platforms (Facebook, Instagram)
  • Snapchat
  • TikTok
  • OnlyFans
  • Google
  • YouTube
  • Fidelity
  • 60 Minutes

INDEX OF CONCEPTS

Take It Down Act, Deepfake, NCII, Non-Consensual Intimate Imagery, Revenge Porn, Taylor Swift, Google, Synthetic Key, Digital Watermark, FTC, Federal Trade Commission, Meta, Snapchat, Ted Cruz, Defiance Act, Alexandria Ocasio-Cortez, Elliston Berry, Francesca Mani, 60 Minutes, Fidelity, OnlyFans, TikTok, Instagram, YouTube