Hany Farid
It's good to be here.
Yeah.
Correct.
You can think about what I was talking about in several levels.
Here's the big level, which is trust.
It's not about images.
It's not about videos.
It's not about deep fakes.
It's not about anything.
It's not about...
fake news.
It's about trust.
How do you trust anything anymore?
Who do you trust?
Where do you trust?
And I would contend that if you can't trust the information we get, we can't do anything.
We can't have open and fair elections.
We can't tackle climate change.
We can't tackle a global pandemic.
We can't have stable societies and economies.
So I think there's this sort of bigger story here that we have slowly but surely started to erode our ability to trust each other, trust organizations, trust the media, trust anybody.
And that feels to me like a death blow to a society.
There's two reasons for it.
One is what I just enumerated.
We can't respond if we don't agree on basic facts.
But the other one is how do you regain trust once you've lost it?
You don't.
That's the problem because you need to believe in the media.
You need to believe in the experts.
You need to trust your government.
And if you don't have that, how do you regain it?
And now you have the mother of all chicken and egg problems.
So what's so fascinating to me as a technologist about the Internet was that it was designed to democratize access to knowledge and information.
And it did, but it didn't distinguish between good information and bad information.
And arguably, the bad information overrides the good information by a big margin.
And I think we are repeating exactly the same mistakes with this AI generation because what are AI models?
They're trained on the Internet.
The Internet is a cesspool.
Trash in, trash out, right?
And so I worry that...
So many people's lives sort of start the day and end the day on these devices.
And we are being fed garbage, junk food, to keep us clicking and liking, to raise profits for a handful of companies in the world.
And I think we're burning the place to the ground, honestly.
Now, one level under that is the deep fakes and the fake images and the fake videos that are being weaponized against individuals and organizations.
But it's a subset of a much larger issue that I think we all sort of know is there.
And what I find so fascinating about this conversation is I can't tell you how many people I have this conversation with where I say, we should get off of social media.
And they're like, yeah, yeah.
And they applaud.
And then they're like tweeting about that.
Yeah, exactly.
And checking it first thing out of bed in the morning.
Yeah.
Well, I will tell you there are days where I think, yeah, this is over.
This is a good experiment, but this is the beginning of the end.
There are other days where I'm more hopeful.
But I do think that if we are passive, if we just say it'll work itself out,
This is going to end badly.
We are on a path of no return.
We have to make a conscious choice to change.
And that's going to require not one or two or three, but a lot of big moves and a lot of small moves too.
We've got to get the tech companies in order.
We've got to get governments to be more responsive.
We've got to get better education in the schools about critical thinking and what the Internet is and what it is not.
And there's a lot of moves to be made.
And at a time, I don't want to be political about this, without a lot of leadership coming out of the U.S., I don't see that happening in the next three years.
And I don't know that we have more than three years before this gets really, really weird.
It's weird.
And here's what I can tell you.
I'm a technologist, and I've been in this space for a long time.
We used to measure advances in technology in 12 to 18 months.
We now measure it in 12 to 18 days.
I mean, a couple weeks go by, we're like, whoa, where did that come from?
It is fast.
And here's the thing you have to understand about AI.
It's starting to program itself.
These AI models are starting to be so good.
Yes.
To teach themselves.
Yeah, yeah.
And so then, you know, they start to sort of get out of our control.
And when you start putting these systems in critical decision-making places, that, by the way, we don't understand these systems.
We don't thoroughly understand them.
They have emergent behaviors, and people are like, well, where did that come from?
And that is disconcerting.
Yeah, so I would like to see regulation.
I would like for the U.S. government and the European governments and the Australian government, all the governments, to say, look, guys, let's not repeat the mistake of the last 20 years.
We tried the hands-off, move fast and break things, and it didn't work, so let's get better.
But it's not happening.
So the Europeans have moved pretty well.
They have the DSA and the DMA, which are trying to put guardrails.
The Brits have some good guardrails.
But, okay, they can help a relatively small percentage of the world.
But what do we do in 85, 90 percent of the world?
And if the U.S. is not here, we are nowhere.
And we are nowhere.
Right.
And the other thing with regulation is that it moves spectacularly slowly with lobbying efforts from now trillion-dollar companies watering everything down in their favor.
So I'm not optimistic.
I think if you are looking for external relief, it has to come from litigation, not regulation.
That is, you have to start holding companies responsible for the harms that they do.
Why are the products that we have in our pockets relatively safe?
Because we told these companies that if you create a product that either you knew or should have known would be harmful, I'm going to sue you back to the dark ages.
And we got product safety.
Yeah.
It was good.
It happened with cigarettes.
It happened with a lot of products.
And so physical products, the food we eat, are relatively safe, right?
The airlines, everything is relatively safe.
But somehow in the digital world, we thought, nah, it's the internet.
Yeah.
But it's not the internet.
There is no more online world.
Yeah.
And this is the thing is we have to start holding these companies responsible for the harms that they do.
And when that happens, they internalize that.
And they say, okay, we can move fast and break things and get sued back to the dark ages, or we can slow the hell down, build these things with safety by design, not as a third or fourth afterthought, and then not get sued back to the dark ages.
You have to create the right incentives.
And when we don't have the right incentives, the companies are going to do what the companies do.
And you're also not the customer.
Whether you're on there or not is irrelevant.
You're the product.
Right?
The customers are the advertisers.
Yeah.
Yeah.
And you're absolutely right.
And, you know, for young people, I'm actually fairly sympathetic.
Right?
If every single one of their friends is on these apps.
Yeah.
Yeah.
What are you going to do?
Right.
I mean, so I am sympathetic to that.
And I do think that...
although I said this in my talk, get off of social media.
And I mean that.
You really should, by the way.
The evidence is overwhelming.
It is bad for you.
It's bad for your mental health.
It's bad for your physical health.
And I swear to God, it drops your IQ by 20 points.
You should get off of it.
But I'm also realistic that berating people for doing something that they know is bad for them is like telling people to stop eating potato chips.
I just ate a bag of potato chips, by the way.
Delicious.
Fantastic.
So I think we have to give them better options.
I think what we have to do is say, look, this stuff sucks and it's bad for you, but here's something that's good, right?
And it doesn't suck and it's good for you.
So I think I would like to see the venture capital community start to try to invest in companies that are just better, right?
This is a group of people, by the way, that I think get off without a lot of criticism.
All of these tech companies are VC-backed, right?
And they are the ones that are driving.
They are absolutely making the bets.
And they are betting on a model that works, right?
Move fast and break things.
So I do think we have to give better options.
And this is where governments can step in.
Start creating incentives for companies to do better.
This is something we can do.
We don't need regulation for that, and it's frankly not that controversial.
But I think we have to give people a better option, and we haven't done that yet.
Can I give you one analogy to this?
So when you go to the grocery store, they should make the junk food hard to find.
They should make the healthy food easy to find.
They should tax the stuff that's bad for you and make it more expensive.
So we're not telling you not to do it, but we're going to de-incentivize it.
So we can do that with our digital worlds.
We just have to be motivated to do it.
Sin taxes.
Yeah.
Right?
Yeah.
Tobacco.
Alcohol.
I got to tell you, one of the things that makes me crazy because I teach at UC Berkeley and I interact with a lot of young people who I generally adore, by the way.
I think they – even if their heads aren't in the right place, their hearts are in the right place.
I can't tell you how many conversations I have that start with, "I was watching on TikTok about Gaza, Ukraine, climate change."
And I always say, don't finish that sentence.
Because there's nothing at the end of that sentence that starts with, I saw on TikTok, that I am interested in.
It is a primary news source.
And it is horrifying.
It is horrifying.
And so what do you tell parents?
I don't know.
I mean, thank God I'm not a parent, honestly, because I don't know how people do it.
I mean, honestly, it is unbelievably hard.
Here's the only thing I can suggest.
And this is not – it's a little bit unfair to put this on the parents.
I think we have to put this on the schools: we have to teach critical thinking.
We have to teach people that you are being manipulated on social media.
You are being delivered those videos with a very specific algorithm to keep you clicking for as long as possible to deliver ads.
That's what is happening.
And that's, by the way, not that different than the tobacco industry manipulating nicotine levels to keep people addicted.
It's the same thing.
Or Las Vegas, the way they design slot machines to keep you putting money in, to separate you from your money.
So I think we have to teach critical thinking.
I think we have to teach people the difference between a story in the Associated Press or Reuters or Agence France-Presse and some random video on TikTok.
Those are not the same thing.
Okay.
Right?
So I think a lot of this is about education and about reminding people that if you want to use TikTok for entertainment, I'm fine with that.
But it's not a place to get news and information.
Like, just knock it off.
That's not what it was designed for.
All right, a couple of things.
One is stop getting your news from social media because that's literally— Step number one.
Step number one.
Step number two is whether you like mainstream media outlets or not, they do a pretty good job of getting it right.
And I know this because I talk to them every single day, that when images come out of war zones or natural disasters or whatever it is, they are vetted pretty carefully.
There's a methodology.
And there's also a consequence for getting it wrong.
Right?
So if you want reliable information, get the hell off of social media and go to places that have editorial and journalistic standards.
Okay.
Number three is you cannot do this well.
I do this for a living, and I'm pretty good at it, and I've been doing it for 30 years, and it's hard, and it's getting harder.
And here's the real danger: I could right now name five things that I can teach you to look at in an image, but here's the problem: in six months, it probably won't be true anymore, and now you have this false sense of confidence.
I call it the arrogance-ignorance problem.
You've got to understand that this is incredibly hard.
You are not well suited to do this.
It's sort of like saying, how do you teach somebody to give themselves surgery?
You don't.
You go to a doctor who went to medical school.
That's what you do.
You leave this to professionals.
You're not going to be an investigative journalist.
I hate to tell you this.
Do what you do well and let other people do what they do well.
And trust that they're going to do their best to get you information.
That's it.
I've never heard "the cow out of the barn," by the way.
I'm from Texas.
Okay, that's where it came from.
That's why.
That's why.
I've got to, like, bring my Texas.
Cat out of the bag.
No?
Okay.
All right.
Yeah.
This is hard.
So I'll give you a couple of examples.
We live in a very polarized time, and I get requests and emails from all kinds of news outlets to analyze things that are harmful to Donald Trump or things that were harmful to Kamala Harris.
And when I say that something that is harmful to Donald Trump is real, I get a phenomenal amount of hate mail from the right.
Hmm.
When I say something that is helpful to Donald Trump and I say that it is real, I get hate mail from the left.
People don't want to hear things that they don't agree with.
And this is very, very bad.
And I don't know how to fix that.
This is a bit of a cop-out, but I think what I do is necessary, but it's not sufficient.
It's necessary to know what's what, but it's sort of like medicine.
It doesn't always go down right.
And I can tell you this is not a partisan issue.
I get just as much hate mail, maybe a little bit more from the right than the left, but I get a lot of hate mail from people who don't like what I have to say because it doesn't conform to their worldview.
Yeah.
I can't tell you how many emails I get from people who say, hey, you should look at this and tell me what you think because you're the world's leading expert on this.
And I'll respond, hey, we've looked at this.
Here's the fact check.
And they wrote back being like, you're a moron.
I'm like, okay, but you're the one who wrote to me telling me I'm the world's leading expert.
So what are we doing here?
Yeah, yeah.
Right?
I'm only the world's leading expert when I agree with you?
When I agree with you.
It's not the way the world works.
And here's the thing is you can sort of blame people for being knuckleheads, but part of it is also that we are living in the mother of all echo chambers because of social media.
We absolutely do.
We absolutely do.
And I think people also interact with each other online in a way that they wouldn't do like this.
I can't tell you the number of people who have threatened to kill me on email or voicemail for that matter.
Sometimes they're handwritten letters.
Those are particularly weird.
Yeah.
Because they just went through so much effort to get to me.
You put that in the mail?
I know, exactly.
Yeah, I've gotten that.
But I do worry that we have demonized the people we disagree with so much so that we can't even listen anymore.
And now we come back to trust, right?
So I don't know how to fix that.
I don't know how to fix it, but I think we have to fix it.
Good.
I love rapid fire.
I don't know, but I know when I get it.
Oh, my God.
Like a physical thing?
I don't have an answer for this.
This is terrible.
Ah.
I co-founded a company to try to restore trust, and I'm hoping that we will at least be somewhat successful.
Ooh, another TED Talk on it.
Okay, favorite obsessions are anything with two wheels.
I ride a Harley Davidson motorcycle.
I'm a road cyclist.
I'm a mountain biker.
I'm not good enough at any of those things to give a TED Talk on.
But if I had to give another one, this is what it would be on, because I give this advice to students all the time: you've got to find a way to unplug.
You got to.
There is nothing like being out in nature.
Here we are in this beautiful Vancouver.
There is nothing like it.
I mean, 20 minutes with a hike in the woods is better than anything else.
Like, you got to unplug.
You got to put the device down.
You got to get away from it.
And I know it's really hard, but once you start doing it, it's like the mother of all therapies.
My family is originally from Egypt.
And I was born in Germany and we immigrated when I was very young.
But as young kids, we would go to Egypt every year.
And probably my most cherished memories were in Cairo with my grandparents.
And it was such a different time.
They would just give us a few pennies and we'd go down to the local baker and get the fresh bread.
And the smell of that and the streets of Cairo and going back home and eating incredible food.
I have such fond memories of that.
And the smells, that smell of that fresh bread.
Oh, yeah.
A couple of things.
On my bike all day long, century ride, put in 100 miles with friends and end up at a phenomenal restaurant for a meal.
By the way, if you've ever ridden 100 miles, there is nothing that tastes nearly as good as whatever you're eating at the end of those 100 miles.
That's a good day for me.
Yeah.
Yeah.
I'm worried about everything.
I really do.
I know.
I feel bad about that.
Here's what gives me a little bit of hope, I would say, is young people give me hope.
They are smart.
They are engaged.
They want to change the world.
I don't think they know how to, but I like their energy.
There's an intention.
Yeah, I like their intention.
That's a good word.
I like their spirit, and I wish them well because, you know, what I tell them is, we screwed up this world for you in one way.
You'll screw it up for another one, but you need to fix our problems before you do that.
But, you know, I love being on a university campus because young people are inspiring.
They really are, despite the TikTok thing.
Thank you.
I'm still trying to think what I brought into my life in 2025.
I know.
Some of these are stumpers.
I think I drink a lot more.
And then... A lot of bourbon.
A lot of bourbon.
You are a senior military officer, and you've just received a chilling message on social media.
Four of your soldiers have been taken, and if demands are not met in the next 10 minutes, they will be executed.
All you have to go on is this grainy photo, and you don't have the time to figure out if four of your soldiers are in fact missing.
What's your first move?
If I may be so bold, your first move is to contact somebody like me and my team.
I am by training an applied mathematician and computer scientist, and I know that seems like a very strange first call at a moment like this.
But I've spent the last 30 years developing technologies to analyze and authenticate digital images and digital videos.
Along the way, we've worked with journalists, we've worked with courts, we've worked with governments on a range of cases, from a damning photo of a cheating spouse, gut-wrenching images of child abuse, photographic evidence in a capital murder case, and of course, things that we just can't talk about.
It used to be a case would come across my desk once a month.
And then it was once a week.
It's almost every day.
And the reason for this escalation is a combination of things.
One, generative AI.
We now have the ability to create images that are almost indistinguishable from reality.
Social media dominates the world and is largely unregulated and actively promotes and amplifies lies and conspiracies over the truth.
And collectively, this means that it is becoming harder and harder to believe anything that we read, see or hear online.
I contend that we are in a global war for truth, with profound consequences for individuals,
for institutions, for societies and for democracies.
And I'd like to spend a little time talking today about what my team and I are doing to try to return some of that trust to our online world and, in turn, our offline world.
For 200 years, it seemed reasonable to trust photographs.
But even in the mid-1800s, it turns out the Victorians had a sense of humor.
They manipulated images.
Or you could alter history.
If you fell out of favor with Stalin, for example, you may be airbrushed out of the history books.
But then, at the turn of the millennium, with the rise of digital cameras and photo editing software, it became easier and easier to manipulate reality.
And now, with generative AI, anybody can create any image of anything, anywhere at the touch of a button.
from four soldiers tied up in a basement to a giraffe trying on a turtleneck sweater.
It's not fun and games, of course, because generative AI is being used to supercharge past threats and create entirely new ones.
The creation of nudes of real women and children used to humiliate or extort them.
Fake videos of doctors promoting bogus cures for serious illnesses.
A Fortune 500 company losing tens of millions of dollars because an AI impersonator of their CEO infiltrated a video call.
Those threats are real, they are here, and we are all vulnerable.
It's useful to understand how generative AI works.
Starting with billions of images, each with a descriptive caption, each image is degraded until nothing but visual noise is left, a random array of pixels.
And then the AI model learns how to reverse that process by essentially turning that noise back into the original image.
And when this process is done not once, not twice, but billions of times on a diverse set of images, the machine has learned how to convert noise into an image that is semantically consistent with anything you type.
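The degrade-then-reverse idea can be sketched numerically. Everything below is a toy: the tiny 8x8 "image," the noise schedule, and the correlation check are illustrative stand-ins, not any real model's training loop — the point is only that repeated noising destroys the original signal, which is exactly what the model then learns to undo:

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(image, steps=10, noise_scale=0.3):
    """Forward process: repeatedly mix the image with Gaussian noise
    until almost nothing but noise remains."""
    x = image.astype(float)
    trajectory = [x.copy()]
    for _ in range(steps):
        x = np.sqrt(1 - noise_scale) * x + np.sqrt(noise_scale) * rng.normal(size=x.shape)
        trajectory.append(x.copy())
    return trajectory

# A toy 8x8 "image": a bright square on a dark background.
image = np.zeros((8, 8))
image[2:6, 2:6] = 1.0

traj = degrade(image)

# After enough steps the signal is mostly gone: the correlation between
# the final state and the original image is near zero.
corr_start = np.corrcoef(traj[0].ravel(), image.ravel())[0, 1]
corr_end = np.corrcoef(traj[-1].ravel(), image.ravel())[0, 1]
print(f"correlation with original: start={corr_start:.2f}, end={corr_end:.2f}")
```

A real diffusion model trains a network to predict and remove the noise added at each step, so that running the chain in reverse from pure noise produces a new image.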
And it's incredible.
But it is decidedly not how a natural photograph is taken, which is the result of converting light that strikes an electronic sensor into a digital representation.
And so one of the first things we like to look at is whether the residual noise in an image looks more like a natural image or an AI-generated image.
Those star-like patterns are a telltale sign of generative AI.
Now, for mathematicians and the physicists in the audience, that is the magnitude of the Fourier transform of the noise residual.
For everybody else, that detail doesn't matter, but you definitely should have taken more math in college.
Professors can't help themselves.
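For the curious, the residual-spectrum check can be sketched as follows. This is a toy version: a box blur stands in for a proper denoiser, and a planted sinusoid stands in for generator artifacts; only the principle carries over — periodic patterns in the noise residual show up as sharp off-center peaks in the Fourier magnitude, where a natural image's residual is closer to flat:

```python
import numpy as np

def noise_residual(image, k=3):
    """Estimate the noise residual as image minus a box-blurred copy
    (a crude stand-in for a proper denoiser)."""
    pad = k // 2
    padded = np.pad(image, pad, mode="reflect")
    blurred = np.zeros_like(image, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    blurred /= k * k
    return image - blurred

def fourier_magnitude(residual):
    """Magnitude of the 2D Fourier transform, zero frequency centered."""
    return np.abs(np.fft.fftshift(np.fft.fft2(residual)))

rng = np.random.default_rng(1)

# "Natural" residual: white noise, whose spectrum is roughly flat.
natural = noise_residual(rng.normal(size=(64, 64)))

# "Synthetic" residual: white noise plus a periodic pattern, a crude
# model of the star-like artifacts some generators leave behind.
y, x = np.mgrid[0:64, 0:64]
synthetic = noise_residual(rng.normal(size=(64, 64)) + 2 * np.sin(2 * np.pi * x / 8))

def peakiness(mag):
    """Ratio of the strongest non-DC frequency to the median magnitude."""
    m = mag.copy()
    m[32, 32] = 0  # ignore the DC component
    return m.max() / np.median(m)

print("natural peakiness:", round(peakiness(fourier_magnitude(natural)), 1))
print("synthetic peakiness:", round(peakiness(fourier_magnitude(synthetic)), 1))
```

The synthetic residual's spectrum is far peakier than the natural one's, which is the kind of signal a forensic classifier can latch onto.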
But no forensic technique is perfect.
And so you don't stop after one thing, you keep going.
So let's go on to our next one, the vanishing points.
If you image parallel lines in the physical world, they will converge to a single point, what's called a vanishing point.
A good intuition for that is the railroad tracks.
Railroad tracks are obviously parallel.
They narrow as they recede away from me and intersect at a single vanishing point.
This is a phenomenon that artists have known for centuries.
But here's the great thing.
AI doesn't know this.
Because AI is fundamentally, as I just described, a statistical process.
It doesn't understand the physical world, the geometry and the physics.
So if we can find physical and geometric anomalies, we can find evidence of manipulation or generation.
Evidence number two.
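The railroad-track intuition reduces to a short computation. The coordinates below are made-up image points, not from any real case; the geometry is standard: in homogeneous coordinates, the cross product of two points gives the line through them, and the cross product of two lines gives their intersection — the vanishing point:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two 2D points (cross product)."""
    return np.cross([*p, 1.0], [*q, 1.0])

def intersection(l1, l2):
    """Intersection of two homogeneous lines, back to 2D coordinates."""
    p = np.cross(l1, l2)
    return p[:2] / p[2]

# Two image projections of rails that are parallel in the scene:
# they should meet at a single vanishing point.
left_rail = line_through((100, 400), (180, 100))
right_rail = line_through((300, 400), (220, 100))

vp = intersection(left_rail, right_rail)
print("vanishing point:", vp)  # (200, 25) for these points

# A third line from the same parallel family must pass (nearly) through
# the same point; a large residual here is geometric evidence of tampering.
third = line_through((200, 400), (200, 100))
residual = abs(np.dot(third, [*vp, 1.0])) / np.hypot(third[0], third[1])
print("distance of third line from vanishing point:", residual)
```

In a real analysis you would fit many such lines and test whether their pairwise intersections cluster at a single point, within measurement error.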
All right, what else can we learn?
Surprisingly, shadows have a lot in common with vanishing points.
And again, this is a physical phenomena that you expect in natural images, and because AI fundamentally doesn't model the physics and the geometry of the world, it tends to violate these physics.
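The shadow constraint can be checked with the same line-intersection machinery. For a single light source, the 2D line connecting a point on an object to the corresponding point of its cast shadow must pass through the projection of the light in the image, so all such lines should meet at one common point. The point pairs below are hypothetical, constructed by hand to share a light position:

```python
import numpy as np

# Hypothetical object/shadow point pairs, constructed so that every
# object-to-shadow line passes through a shared light point at (50, -100).
pairs = [((120, 50), (190, 200)),
         ((300, 150), (350, 200)),
         ((80, 20), (95, 80))]

def light_residual(pairs, light):
    """Max distance from the candidate light point to each object-shadow line."""
    worst = 0.0
    for obj, sh in pairs:
        a, b, c = np.cross([*obj, 1.0], [*sh, 1.0])  # homogeneous line
        d = abs(a * light[0] + b * light[1] + c) / np.hypot(a, b)
        worst = max(worst, d)
    return worst

print("consistent light residual:", light_residual(pairs, (50, -100)))

# An inconsistent pair -- e.g. a shadow pasted in from another image --
# produces a line that misses the shared light point by a wide margin.
bad = pairs + [((200, 100), (200, 300))]  # vertical line through x = 200
print("inconsistent residual:", light_residual(bad, (50, -100)))
```

If no single light position makes all the residuals small, the shadows are physically inconsistent, which is evidence of manipulation.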
We now have a very good indication that this image is not authentic.
The most important thing I want you to take away from this is that while it may not be easy, it is possible to distinguish what is real from what is fake.
This image is a bit of a metaphor for how a lot of us feel.
We feel like hostages.
We don't know what to trust anymore.
We don't know what is real, what is fake.
But we don't have to be hostages.
We don't have to succumb to the worst human instincts that pollute our online communities.
We have agency, and we can effect change.
Now, I can't turn you all into digital forensics experts in 10 minutes.
But I can leave you with a few thoughts.
One, take comfort in knowing that the tools that I've described and that my team and I are developing are being made available to journalists, to institutions, to the courts, to help them tell what's real and fake, which in turn helps you.
Two, there is an international standard for so-called content credentials that can authenticate content at the point of creation.
As these credentials start to roll out, they will help you, the consumer, figure out what is real and what is fake online.
And while they won't solve all of our problems, they will absolutely be part of a larger solution.
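The core idea behind content credentials — binding a cryptographic signature to the pixels at the point of creation — can be sketched in a few lines. This is deliberately simplified and is not the actual C2PA standard's API: a real system uses public-key certificates in secure camera hardware rather than the shared secret used here, and signs a full provenance manifest, not just a hash:

```python
import hashlib
import hmac

# Hypothetical device key; a real content-credential system would keep a
# private key in the camera's secure hardware and verify with public-key
# certificates, not a shared secret.
DEVICE_KEY = b"secret-key-burned-into-camera"

def sign_at_capture(image_bytes: bytes) -> bytes:
    """At the point of creation, bind a signature to the pixel data."""
    return hmac.new(DEVICE_KEY, hashlib.sha256(image_bytes).digest(),
                    hashlib.sha256).digest()

def verify(image_bytes: bytes, credential: bytes) -> bool:
    """Later, check that the content is exactly what was signed."""
    expected = hmac.new(DEVICE_KEY, hashlib.sha256(image_bytes).digest(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, credential)

original = b"\x00\x01\x02 raw pixel data"
credential = sign_at_capture(original)

print(verify(original, credential))               # untouched -> True
print(verify(original + b"tampered", credential)) # edited -> False
```

Any edit to the bytes after capture invalidates the credential, which is what lets a consumer distinguish an authentic capture from modified or generated content.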
Three, please understand that social media is not a place to get news and information.
It is a place that Silicon Valley created to steal your time, your attention by delivering you the equivalent of junk food.
And like any bad habit, you should quit.
And if you can't quit, at least do not let this be your primary source of information, because it is simply too riddled with lies and conspiracies and now AI slop to be even close to being reliable.
Understand that when you share false or misleading information, intentionally or not, you're all part of the problem.
Don't be part of the problem.
There are serious, smart, hardworking journalists and fact-checkers out there who work every day, because I talk to them every day, to sort out the lies from the truths.
Take a breath before you share information, and don't deceive your friends and your families and your colleagues and further pollute the online information ecosystem.
We're at a fork in the road.
One path: we can keep doing what we've been doing for 20 years, allowing technology to rip us apart as a society, sowing distrust, hate, intolerance.
Or we can change paths.
We can find a new way to leverage the power of technology to work for us and with us and not against us.
That choice is entirely ours.
Signal-to-noise ratio is getting close to one.
Stay off of Twitter, stay off of X, and stay off of everything else for that matter.
I would say we're getting close to 50%.
Yes, but it's becoming increasingly more difficult.
By the way, this is a secondary problem, which is now people are creating fake things, then going to fake sites to authenticate fake things, and it's all getting very weird.