Corynne McSherry
So I think it's a very dangerous path that unfortunately we're already well along.
I think in moments of crisis, and I think we're in a moment of crisis right now, we look to simple solutions for very complex problems, and we are often sorry.
And I think that is where we are right now.
The internet grew up the way it did, for mostly good I would argue, because the platforms and the intermediaries mostly stayed neutral.
If we have a world in which Facebook, Twitter, Google, Instagram put themselves in the position of a court and decide what speech should be up, what speech shouldn't, we're going to walk down a dangerous path because those decisions, those tactics,
will inevitably be used against speech that we would support, for one thing.
And eventually they will be used by governments.
Private censorship does not stay private.
It becomes public censorship almost inevitably.
And the third reason is really practical.
They're already doing it, and they're doing it badly.
All kinds of lawful speech is being taken down every day.
Google and Facebook can't save us from the Nazis.
OK, so the problems here are legion.
And I'm going to start with the ones that I just touched on briefly before.
The reality is that we can all target people that we hate right now.
But if we think that the rules that Twitter and Facebook and all those guys are going to come up with aren't going to be used against speech that we support, we are foolish.
Community standards complaints are used against valuable speech all the time.
I know because I hear about it every day in my job.
Then the related problem to that is when you get your lawful speech taken down, you don't have any options.
You don't know how to get your stuff put back up.
So, in effect, we have courts, but we don't have a right of appeal.
These platforms have the right to host any speech they want.
They actually have the First Amendment right to host any speech they want.
But I think as users, we want them to use that right wisely.
That's not happening right now.
You know why I know they can't?
Because they're trying and they're failing over and over.
They cannot tell the difference between hate speech and reporting on hate speech.
And so accounts get taken down and suspended when they're doing perfectly lawful things.
I do just want to respond to this real quick.
My view is if white supremacists and Klansmen and Nazis are organizing, I way prefer they were doing it out in public where I can see them, and I can challenge them, and I can respond to them.
And law enforcement will say the exact same thing.
People who fight terrorism say it's much better for the radicals to be radicalizing out in public, where you can see them.
They're going to organize anyway, okay?
So would you rather do it in secret or in the open?
So we can continue the siloed conversations that we're having right now, which are a big part of why we ended up where we are.
Yes, I would like to be siloed from Nazis.
That sounds very nice and it's a good talking point, but in reality, I think that's very, very dangerous for our society.
We need people to be talking to each other.
When they only talk to people who agree with them, they never change their minds.
It doesn't happen nearly enough.
Do you know why we have gay marriage equality now?
Because people talk to each other.
It's not the only reason, but it helped.
But I want to answer Jad's question, because I think what you're asking is for an example of why I'm worried about how the moderation happens.
So the way that it works now and the way that it's likely to continue to work is that the social media companies employ a combination of humans and mostly algorithms to try to figure out what's bad speech and what's good speech.
So they'll end up taking down this statement, "all white people are racist," as an example of hate speech, but they won't take down, if you could show the previous one, this from a congressman who said not a single radicalized Islamic suspect should be granted any measure of quarter, et cetera, et cetera.
They can't tell the difference.
And there's a hat tip to ProPublica.
I hope you guys are all ProPublica supporters and fans because they're great.
They did a detailed study to look at Facebook's policies.
And they found out that, among other things, they're training their moderators to, in some instances, protect white men over black children.
That's where we are right now.
Is that what we want to endorse?
Is that what we want to encourage?
Where we are right now is that thousands of accounts are being suspended every day.
Let's just say a relatively small percentage of those are for perfectly lawful speech.
That's a lot of lawful speech.
That's a lot that we have authorized Twitter, Facebook, and everyone else to take down, and encouraged them to.
And keep in mind, I want to say one more thing that I said before, but I want to emphasize it: once we start down this path, if you think that this is going to stay within the decision makers at Silicon Valley, you are dreaming. I mean, that's bad enough. I'm not actually sure why we all want Silicon Valley to make decisions about what speech is okay for all the rest of us. But even that aside, it's not going to stop there. Governments are going to come in. When they see that Google, Facebook, Twitter can easily take down accounts, they're going to say, okay.
Oh, if I was queen of the world.
But I think even I would have trouble, in all instances, being perfect about what was lawful speech and what wasn't.
But that actually isn't my main concern.
It's that even I could then potentially be required by a government to use that algorithm for other purposes, and that would be really dangerous.
But here's the one thing that I would say, and this is where I think we agree, is that if I was queen of the world, and I was running any of these companies, one of the things I would absolutely do is put in much better processes for people to appeal, for people to challenge when things are taken down wrong.
This isn't just a speech issue, it's a due process issue. Because let's face it, of course these aren't official government forums, we all understand that, but nonetheless, this is how we talk to each other.
These are our public spaces, and in those public spaces it's really important, when your account gets suspended, when you get taken offline, to be able to get back up if what you're doing is perfectly legal. And right now, the reality is, and I know this because I hear from people all the time, it's very confusing.
You don't know who to appeal to.
You don't know why you're taken down half the time, and you don't know what to do.
So, I mean, I think that's a real pressure point, because I think a lot of these companies, and I think genuinely so, feel uncomfortable making money from hate.
But unfortunately, we still have a problem.
And I'm going to give you an example from an article I just read yesterday.
It's a long piece about Google and how it runs advertising and search and so on, from Talking Points Memo.
And Talking Points Memo mentioned one of the problems that they have because these processes are so opaque: they survive because of Google advertising.
And they're a legitimate site trying to do good for the world.
They survive because of Google advertising.
They keep getting penalized for hate speech because they're reporting on hate speech, specifically the Dylann Roof situation.
So it's not easy to sort of disentangle...
See, now he's just trying to piss me off.
So what we're talking about is now a step further.
It's social media companies and intermediaries, by the way, all the different people that you interact with, they take it upon themselves to out you, to pierce your anonymity.
That is profoundly, profoundly dangerous.
Anonymity, anonymous speech, is probably the most important form of political speech that we have.
The ability to speak, especially online,
without fear of retaliation, means that you have the ability to speak your truth.
If we out people, if we accept that social media companies should be judge and jury over that, should just expose people to the world without any choice, without any recourse, because once you're outed...
That weapon was also used to persecute minorities all over the... Everything was always used to persecute minorities at some point.
It's still used to persecute minorities.
The one thing we have always understood in this country, and this is before the First Amendment, is the importance of anonymous speech.
Okay, but I don't actually think that was what he was saying at all.
Someone should say that with a microphone.
I think he was just saying silos are bad.
No, I think that... No, I think he's saying if we don't talk to each other, nobody's mind ever changes.