Adi Robertson
Decoder with Nilay Patel
The AI election deepfakes have arrived
So far, it feels like the consensus is: we're going to label this, and our job is mainly going to be trying to make sure we catch it. There are cases where, say, maybe you get it taken down if you haven't disclosed, if you're a company or you're buying a political ad.
But broadly, the idea seems to be we want to give people information and tell them that this is manipulated and then they can make their own call.
I feel like the incentives for something like the music industry, for things that are basically aesthetic deepfakes, are very different than they are for manipulated political imagery. A lot of the question with YouTube is: okay, you are basically parodying someone in a way that may or may not legally be considered parody.
And we can make a deal where all that person really wants is to get paid, right? Maybe they want something sufficiently controversial taken down, but if you give them some money, they'll be happy. That's just not really the issue at hand with politically manipulated images. The problem there is around reputation. It's around people who do, at least in theory, care about: did this person say this thing? Is this true? So I just don't know that you could cut a deal with Joe Biden that says every time you make something up about him, he gets a penny.
There are companies that are signing on to an initiative called C2PA, which is (we were talking about watermarks earlier) a content provenance system. It includes a watermark that carries metadata, and the goal is that you will be able to at least tell where something has come from and whether it's been manipulated. It's supposed to be this broad, industry-wide system where everybody has the same watermark, so it's very easy to take an image, pop it into a tool, and check whether it has the watermark. That's one of the leading ways the AI industry is trying to deal with truth and provenance at this point.
Watermarks are rolling out in places. OpenAI adopted them in mid-February; they're starting to appear on DALL-E images, and you can look at them in Photoshop. I think the problem is more that this thing has rolled out, but most people are really not going to care enough to check.
Yeah, a lot of the issue with C2PA right now is that you have to actually go pop an image into a tool to check the metadata, which is just an extra step that the vast majority of people are not going to take. And, yes, it's not applied to things like Sora yet, at least as far as OpenAI has told us. So in most cases, there is not a really prominent, in-your-face signal that a thing is AI.
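To make that extra step concrete: in JPEG files, C2PA manifests are carried in APP11 metadata segments as JUMBF boxes, so even just noticing that a manifest exists means walking the file's segment structure. The sketch below is a minimal illustration of that detection step, not a real verifier; actual validation means cryptographically checking the manifest with a proper C2PA implementation, and the byte-level details here are simplified assumptions.

```python
import struct

def has_c2pa_manifest(jpeg_bytes: bytes) -> bool:
    """Rough illustration only: scan a JPEG's marker segments for an
    APP11 (0xFFEB) segment containing a JUMBF box, the container that
    C2PA manifests are embedded in. Real verification requires parsing
    and cryptographically validating the manifest with a C2PA library."""
    i = 2  # skip the SOI marker (0xFFD8)
    n = len(jpeg_bytes)
    while i + 4 <= n and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker in (0xD9, 0xDA):  # EOI or start-of-scan: no more metadata
            break
        (length,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xEB and b"jumb" in payload:  # APP11 carrying a JUMBF box
            return True
        i += 2 + length  # jump to the next marker segment
    return False
```

The point is not this particular scan; it's that the provenance signal lives in metadata a viewer never sees, which is why "pop it into a tool" is a step most people skip.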
I mean, a screenshot tool, as far as I can tell, can remove the watermarks. And I think there are ways that you can end up just stripping these things out. It's very, very hard to create a perfect watermarking system.
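Mechanically, what a screenshot or naive re-save does is regenerate the pixels while dropping the metadata containers the provenance data lives in. This is a hedged sketch of that effect (stripping every APPn metadata segment from a JPEG); pixel-level steganographic watermarks are a separate question and would survive this particular operation.

```python
import struct

def strip_metadata_segments(jpeg: bytes) -> bytes:
    """Illustration of watermark fragility: drop every APPn segment
    (markers 0xFFE0-0xFFEF) from a JPEG, which is roughly what a
    screenshot-and-re-encode does implicitly. Metadata-based provenance
    (like a C2PA manifest in APP11) does not survive this."""
    out = bytearray(jpeg[:2])  # keep the SOI marker
    i, n = 2, len(jpeg)
    while i + 4 <= n and jpeg[i] == 0xFF:
        marker = jpeg[i + 1]
        if marker == 0xDA:  # start-of-scan: copy the image data untouched
            out += jpeg[i:]
            return bytes(out)
        (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        if not 0xE0 <= marker <= 0xEF:  # keep everything except APPn
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    out += jpeg[i:]  # trailing bytes (e.g., the EOI marker)
    return bytes(out)
```

Nothing here is adversarial: the metadata disappears as a side effect of ordinary re-encoding, which is exactly why metadata-only watermarking is a weak guarantee.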
So much of this relies on Adobe's argument being, well, eventually we want people to expect this. It's going to be like, you know, when you look in your browser and get a certificate warning saying there's no certificate for this webpage, maybe you won't trust the webpage.
I think the goal they're going for is that everyone will be trained to expect a certain level of authenticity, and I just don't think we're at that point. In some ways, these problems already existed: Photoshopped nudes have been used to harass people for a very long time.
And Photoshopped images of politicians and manipulated content about politicians are nothing new. What AI definitely does is scale up those problems a huge amount and make us confront them in a way that was maybe easier to ignore before, especially by adding a technology that the people creating it are trying to hype up, in a way that sounds terrifying and world-ending, for other reasons. The problem with a lot of this is that you can't apply the kinds of paradigms that social media has, because it really only takes one person with the capability to do a thing. It takes one bad actor to make something that can spread, in huge variations that are hard to recognize, across huge numbers of platforms. I think that raises slightly different problems than, say, there's this big account on social media that's spreading something; well, all right, Facebook can ban them.
There are a lot of different problems that AI-generated images pose, and there are cases where individual states have passed rules for individual problems. There are a few states that incorporate, say, non-consensual AI pornography into their general non-consensual ("revenge porn") rules. There are a few states with rules about how you have to disclose manipulated images for elections. And there are some attempts in Congress, or in, say, the FEC and other government regulatory agencies, to create a larger framework. But we are still in this large, chaotic period of people debating things.
The copyright issue actually came up with non-synthetic, non-consensual pornography because, say, if one of your partners took a nude picture of you, you don't own that picture. And that was already just a huge loophole. Legislators have spent about a decade trying to make laws that meaningfully address nonconsensual pornography that's not AI-generated.
And the frameworks they've come up with are getting ported over to AI-generated imagery: a lot of it is about, all right, this is harassment, this is obscenity, this is some other kind of speech restriction that is allowable. A lot of nonconsensual pornography is a kind of sexual harassment that we can find ways to wall off outside protected speech, and that we can target in a way that's not necessarily going to take down huge amounts of other speech, the way that, say, just banning all AI-generated images would.
There's California, New York is another, there's Texas. At the federal level, there have been attempts to work this in. It's not a criminal statute, but there is a federal civil right to sue over non-synthetic, non-consensual pornography.
And there have been attempts to work AI into that and say, all right, well, it's not a crime, but it's a thing that you can sue for under, I believe it is the reauthorization of the Violence Against Women Act. And then there have been attempts to, like you mentioned, just tie all of this into a big federal likeness law.
So likeness laws are a mostly state-level thing that says, all right, you can't take Taylor Swift and make her look like she's advertising your Instant Pot. There have been some attempts to make a federal version of that. But likeness laws are really tricky, because they're so much broader that they end up catching things like parody and satire and commentary.
And they're just, I think, much riskier than trying to create really targeted, specific use laws.
If you're talking about non-synthetic stuff, then there are all kinds of documentaries and news reports, things that people have a public interest in making, where you don't want to give someone the right to say, "you cannot depict me in a thing." And in that case, it's depicting something the person actually did. But AI-generated images raise a whole other question: where do you draw the line between an AI-generated image, a Photoshop of someone, and a drawing of someone? Should you not be able to depict any person in a situation they don't want to be depicted in, even if that depiction would broadly be protected by the First Amendment?
We've had, I think, about three major U.S. presidential election cycles where disinformation was a huge issue. There was 2016, where there was a lot of discussion in the aftermath about, all right, was there foreign meddling in the election? Were people being influenced by coordinated campaigns?
Like, where do we think that the societal benefit of preventing a particular usage that hurts someone should be able to override the interest we have in just being able to write about or create images of someone?
There was 2020, where deepfakes technically did exist, but generative AI tools were just not as sophisticated. They were not as easy to use, and they were not nearly as prevalent. So there was a huge conversation about what role social platforms play in preventing manipulated information generally. And there was, in a lot of ways, a huge crackdown; there was the entire issue of Stop the Steal.
The two bills are a little bit the thing I talked about. One of them, the DEFIANCE Act, is really specifically about: we want to look at non-consensual pornographic images, we define what that means, and we think this particular thing we can carve out. There are lots of questions about how far, in general, you want to go in banning synthetic images, but it's really targeting pornographic, sexually explicit pictures of real people. Then things like the No Fakes Act (and I believe there's also something called the No AI Fraud Act) are much broader: we just think you shouldn't be able to fake images of people. We're going to make some carve-outs there, but the fundamental idea is to create a giant federal likeness law.
And I think that's much riskier, because it starts from the position that you shouldn't be able to fake an image of someone without their permission, and then creates some carve-outs where you're allowed to do it.
And I think that raises so many more of these questions. Do we really want to create a federal ban on being able to create a fictionalized image of somebody?
Defamation law has already come up with text-based generative AI, where if something like ChatGPT tells a lie about you, are you allowed to say they're making things up about me? I can sue. And I think the benefit of defamation law is that there is a really huge framework for hammering out when exactly something is an acceptable lie and when it's not.
There are these large movements that are trying to just lie about who won the election. What do we do? There were questions about, all right, do we kick Trump off social networks? These were the locus of debate. And now it's 2024, and we have in some ways I think a little bit of a hangover from 2020 where platforms are really tired of policing this.
That, all right, well, would a reasonable person believe that this thing is actually true, or is this really obviously political commentary and hyperbole? I think that we're on at least more solid ground there than we are with just saying, all right, fine, you know what, just ban deepfakes. I do think that still defamation law is complicated.
And every time you open up defamation law, as Donald Trump once suggested doing, you end up with a situation where, in a lot of cases, it's very powerful people throwing the law against people who don't necessarily have the money to defend themselves. So in general, I'm cagey about trying to open up defamation law.
But it is a place where at least you have a framework that people have spent a very long time talking about.
When a new technology comes along, there are a large number of people who don't necessarily think about it in terms of the First Amendment or speech protections, who are able to say, oh, well, this thing is just categorically different, we've never had technology like this before, the First Amendment shouldn't apply. And I always hope we don't go there with the technology, because I think the problems that come from just blanket outlawing it tend to be really huge. I don't know. I think we're still waiting to see how disruptive AI tech actually is. We're still waiting to see whether it is meaningfully different from something like Photoshop, even though it seems intuitively like it absolutely should be. But we're still waiting to see that play out.
If we're talking about non-internet systems like robocalls, then we actually have laws that aren't really even related to most of the things we've talked about. There's a rule called the TCPA that's an anti-robocall law, basically, that says you cannot just bombard people with synthetic phone calls. And it was recently decided that, all right, should artificial voices there include voice cloning?
Yes, obviously. Right. So at this point, things like robocall laws apply to AI. And so if you're going to try to get a fake Joe Biden calling a bunch of people and telling them not to vote, that's something that can just be regulated under a very longstanding law.
That raises really all the same questions that image-based AI raises. In some ways, it's probably going to be harder to detect and regulate at the platform level, beyond the legal one, because so much tooling is optimized for detecting images. And so in some ways, it's maybe an even thornier problem.
And so they're dealing with, all right, how do we renegotiate this for the 2024 election? Then you have this whole other layer of generative AI imagery (whether you technically want to call it deepfakes is an open question). And then there are all the layers of how that gets disseminated, and whether it turbocharges a bunch of issues that already existed.
On the other hand, voice impersonation was a thing before this; there were really good impersonators of celebrity voices. So I think it might be a technically harder problem to fix, but the legal questions it raises are very similar.
There are a bunch of really hard technical issues. And a lot of those issues are going to be irrelevant to people, because so many people do not check even very obviously fake information, for a variety of reasons that have nothing to do with it being undetectable as a fake. I think that trying to actually make yourself care about whether something is true is, in a lot of ways, a bigger, more important step than making sure that nothing false is capable of being produced. That's the place where huge numbers of people have fallen down, and where huge numbers of people have fallen for things. And while all of the other issues we've been talking about are incredibly important, this is just a big individual psychological thing that people can do on their own, and it does not come naturally to a lot of us.
Yeah, I think what the oversight board tends to do, the way it's maybe comparable to the Supreme Court, is sophisticated outside thinking about what a consistent moderation framework looks like. But the Supreme Court in real life does not adjudicate every single complaint that you have; you have a whole bunch of other courts. Facebook doesn't really have those other courts.
Facebook has a gigantic army of moderators who don't always necessarily even see its policies. So yeah, it's this very macro level: we're going to do the big thinking. But even at the time, there was the question of whether this was really just Facebook, now Meta, outsourcing, kicking the hard questions out of its court and putting them on other people.
Yeah. And part of this is also political. There was a huge, largely (again, in the U.S.) right-wing backlash to this; it was the kind of thing that would get a state attorney general mad at you and get a congressional committee to investigate you, as ended up happening with pre-Musk Twitter. So yeah, there came to be a real political price for doing this as well.
Since then, some platforms have let Donald Trump back on. They've said, all right, but we cannot possibly moderate every single lie on this. We're going to just wash our hands of whether you're saying the election was stolen or not.
The companies are in slightly different spots, but they actually have come together. Very recently, they signed an accord that says, look, we're going to take this seriously. They've announced policies of varying levels of strictness, but they tend toward: if you're a major AI company, you're going to try to prevent people from creating information that maybe looks bad for public figures.
Maybe you ban producing images of recognizable figures altogether or you try to. And you have something in your terms of service that says if you're using this for political causes or if you're creating deceptive content, then we can kick you off.
We don't know necessarily how good the enforcement is going to be, but the companies seem pretty open so far to the idea of self-regulation, in part because I think this isn't just a civic-minded political thing. Dealing with unflattering stuff about real people is just a minefield they don't want. That said, there are also open-source tools.
Stability AI is pretty close to open source. It's pretty easy to go in and make a thing that builds on it that maybe strips away the safeguards you get in its public version. So it's just not quite equivalent to the way that, say, social platforms are able to completely control what's on their platforms.
Does stopping mean that you're just trying to limit the spread to where this doesn't become a huge viral thing that a bunch of people see, but it still may be technically possible to create this? Or do you want to say, all right, we have a zero tolerance policy. If anything is created with any tool anywhere, even if someone keeps it to themselves, that is unconscionable.
The most promising argument I've heard for these is the idea that you can (and this is an argument Adobe has made to me) train people to expect a watermark. Now, if what you're saying is that we want to make it impossible to make these images without a watermark, I think that raises the same problems we just talked about: anyone who can make a tweaked version of an open-source tool can just say, don't put a watermark in. But I think you could potentially get into a situation where you require a watermark, and if something doesn't have one, there are ways that its design, its spread, or people's trust in it are severely hobbled. That's maybe the best argument for it I've heard.
It does seem like the thing about a lot of generative AI tools is that there are just vast, vast numbers of ways to get them to do something, and people are going to find those. Software bugs and zero-day exploits have been a problem on computers for a very long time, and this feels like it falls into that category.