Senator Ted Cruz is getting serious about tackling deepfake AI-generated porn with his latest bill ... which is all about forcing tech companies to do the right thing, and swiftly too.
Transcript
00:00 Tell us about the situation that you had with this young lady and why you decided to introduce this legislation.
00:09 In Texas, this one teenage girl, Elliston Berry is her name, she was 14 years old. She lives in Aledo, Texas, which is in North Texas just outside Fort Worth. One Monday morning, she was in ninth grade, she was getting ready for school, and suddenly her phone blew up; she started getting phone calls and texts from her friends. It turned out another teenage boy at her school had taken a picture of her, a perfectly innocent picture on social media, had gone to an app, and had used it to make a deepfake of her that made it appear she was nude. This teenage boy did this to several girls in the class. He then created fake Snapchat accounts and sent it to almost all her classmates, and ultimately they found out who did it. He was transferred to another school, but he faced no legal penalties, no consequences, and for nine months Snapchat did nothing.
01:04 Although, I mean, personally what I think is this is necessary, I'm wondering how effective it's going to be given the new world order with what we're seeing with AI and whatnot.
01:15 Look, obviously we're not going to prevent all bad conduct, and there are still going to be horrible things that happen. But what this bill does, and it's a bipartisan bill, is make it a criminal offense to post such an image, whether it is an actual image that was taken consensually but that you didn't agree to have posted to the world, or a deepfake that was made without your knowledge entirely.

01:38 But then secondly, and this is a really important piece of the bill, it puts a legal obligation on big tech: when the victim or the victim's family says, hey, these pictures are of me, they're non-consensual, the tech company has 48 hours to take them down.

01:54 And look, I'll tell you, with Elliston, for nine months the pictures were still up on Snapchat. Her mom told me that last week, and I said, wait a minute, they're still there right now? She said, yeah. I told my staff right then, I said, all right, get on the phone with Snapchat right now. If need be, put me on the phone with the CEO. Let's get this taken down immediately. Within the hour they pulled it down.

02:15 Now, frankly, you shouldn't have to have a sitting member of Congress make a call on your behalf. The victim should have that right written into law. That's what this bill would do.