Logbuch:Netzpolitik

LNP504 The politics of intellectual shame

Thu, 17 Oct 2024

Description

For this episode we made an exception and decided to record an interview in English. We are talking to Meredith Whittaker, president of the Signal Foundation, the non-profit that develops the Signal app, the world's most popular multi-platform private encrypted messenger. We talk to Meredith about her way into the digital realm, how she shook up Google by organizing walkouts and more or less erasing all memories of its "don't be evil" motto, how Signal came to be and what its principles are, how she views Europe and the regulatory policies of the EU, and much, much more.

Transcription

0.503 - 9.208 Meredith Whittaker

Good morning, Linus. Good morning, Tim. Linus, what watch? Ten watch. Such much? You'll get along beautifully in America.

30.464 - 52.166 Interviewer

Logbuch Netzpolitik number 503 from October 16, 2024. And you will have noticed it: we are speaking a different language. We are speaking a different language today because we have a guest, and so this is a conversation we want to conduct in English.

57.532 - 57.512 Interviewer

503.

57.552 - 97.276 Tim

Oh. Anyway, we say hello to Meredith. Meredith Whittaker. Hello, welcome to our podcast. I'm so happy to be here. Thank you for having me. Meredith, you are, in case somebody doesn't know you, the president of the Signal Foundation.

97.716 - 98.657 Meredith Whittaker

I am. I am.

99.057 - 102.619 Tim

And Signal is a messenger application?

103.259 - 115.086 Meredith Whittaker

It is the world's most widely used truly private messaging application. I always have to put the full sentence in there. A very special messaging application.

115.106 - 115.827 Tim

What makes it special?

117.894 - 135.779 Meredith Whittaker

We build against the grain, to put it in kind of flowery language. We go out of our way to build for privacy and to get as close to collecting no data at all as possible, which, as you and, I assume, many in your audience know, is a lot more work.

136.826 - 138.006 Tim

Then simply encrypting.

139.027 - 153.01 Meredith Whittaker

Well, encrypting is part of that, but then you need to make sure the encryption works. And then encryption of message content isn't going to solve all of your problems because you also have metadata. You also have libraries you're pulling in. You have core services you're working with.

153.03 - 175.597 Meredith Whittaker

So there's a lot of ingenuity required to actually create a system that is rejecting the normative assumptions of the tech industry today, which is that we all want to collect a ton of data. And we all want to monetize that data. We want to sell you ads with that data. We want to sell that to different customers as a data broker. We want to train AI models with that data.

176.978 - 184.422 Meredith Whittaker

Most of the infrastructure in the tech ecosystem now assumes that as a given. And then we have to rewrite things too.

184.902 - 194.146 Tim

And you negate all these assumptions and try to build a truly private messaging platform in a world that has a completely different business model.

194.537 - 209.101 Meredith Whittaker

Yeah. Well, I'm going to just flip that to be a little bit rhetorical and say those assumptions negate the human right to private, intimate communication. And we are trying to rebuild a tech ecosystem that actually honors those.

210.322 - 211.442 Tim

That's an even higher goal.

211.642 - 211.762 Meredith Whittaker

Yeah.

212.921 - 230.5 Interviewer

Before we dig more into the Signal project, which in itself I think is quite interesting, we'd like to understand how did you get there, where you are right now? So what was your introduction into this digital world? When did it start?

231.935 - 252.665 Meredith Whittaker

Um, I don't have one of those stories where I'm like, an Atari, I was four, I'm a hacker. I didn't care about tech qua tech. I wasn't that. You remember back in school, there were two kinds of nerds. There were the math nerds and the book nerds.

253.506 - 254.206 Tim

You were the book nerd?

254.286 - 266.334 Meredith Whittaker

I was a book nerd. But in the story we tell now, because nerddom has been overlapped with monetary success and sort of a career in math and science, I think we forget about the book nerds. But hello, I'm here to remind you.

267.094 - 268.215 Linus

We exist, we're here.

268.915 - 288.705 Meredith Whittaker

And that's who I was. And so I studied literature and rhetoric at Berkeley, which I just thought were, I mean, they're still beguiling. Like being able to read and write is pretty fundamental for anything. And then I was poor. So I took a job at Google because they were the first ones to offer me a job. And then I got very fascinated with what on earth was going on.

289.205 - 294.488 Tim

When was that? Like when were you hired by Google as somebody that studied literature and rhetoric?

295.086 - 314.171 Meredith Whittaker

July 10th, 2006. So I graduated Berkeley and I needed rent money, and I put my resume on monster.com, which was a precursor to LinkedIn. And they reached out, and then I talked to my friend Michelangelo because at the time... You still needed an invite to join Gmail, as I recall.

314.231 - 336.387 Meredith Whittaker

And I was like, hey, Michelangelo, can you get me an invite so I can make a Gmail address so I can appeal to this recruiter? And now my spam-filled Gmail address dates from that moment. So I was hired as something called a customer operations associate. And I didn't know what that was. No one knows what that is. It's just a bunch of words. But it sounded... I was like, that's a business job.

336.528 - 360.126 Meredith Whittaker

It sounds like a business title. I had no idea. It was basically a customer support person who wrote user documentation, some technical documentation, and answered inquiries about Google's free products. And I was doing that for Writely, which was an acquisition that Google made and then rebuilt to become Google Docs.

361.907 - 382.959 Meredith Whittaker

And this is at a time when the entire business of Google was just search. So search was all on the main campus. It was where the money came from. And then there was this dinky little building where everything else, like Gmail, Blogger, Writely, whatever, Reader, all of them sat, and they were kind of experimental projects. And that was called Apps.

384.12 - 400.853 Meredith Whittaker

Did you ever have exposure to actual customers, or did you just write documentation? Well, they would email in. I only did this for a second, because I kind of figured out I wanted to do other things, and that's part of the story. There's a part of the story, actually. So I did answer, like,

401.233 - 428.037 Meredith Whittaker

inquiries sometimes, and there were auto-responses, like hotkeys you'd use, and this really janky ticketing system. And this is back when we didn't have laptops; there was no such thing as a smartphone. Our desktops were chained to the desk, and when you went home, you went home. And if you needed to do something now, you had to check out this clunky Lenovo or something, I don't remember exactly. But I remember I was like, okay, I get rewarded in my job based on how many tickets I close.

428.479 - 443.771 Meredith Whittaker

But I didn't know the engineers who have to close these tickets, because I'm reporting bugs, right? And I need them to fix the bugs I'm reporting. And if they don't fix the bugs, I'm not getting rewarded. And I was like, this is really silly. So I went over to the Apps building, because I was sitting in another building, and I met the engineering team. I was like, hey.

444.286 - 462.577 Meredith Whittaker

I'm that person who keeps sending you these emails that are annoying you. And I was able to convince one of them to go out to Costco, which is this big warehouse store, kind of a Walmart-y store in the US, and put a giant couch on his personal credit card and bring it into the apps building so I could sit with them.

463.117 - 483.808 Meredith Whittaker

Because if I sat with them, then I could vibe with them, and then they would be much more likely to fix my bugs. And so all my bugs started getting fixed. But then my manager got really upset because it was like, yeah, obviously I had breached the hierarchy. And I was like, oh, kind of. And so this meeting appeared on my calendar that was basically about me being insubordinate.

483.848 - 498.074 Meredith Whittaker

But I'm also like, I didn't come from that class. I was like, what the fuck did I do wrong? I don't know. And I was like, oh, God, I did something. And then the day before, I think, that meeting was going to happen, this email hit. Kind of, we had this all-

498.892 - 519.36 Meredith Whittaker

org, like some group, I don't know, it was like the group, the consumer operations team. And one of the engineering directors in Apps must have had, like, a beer or something during one of the many, many, many drinking parties that happened during the day at Google in that era, and sent some email that was like, Meredith's couch is a model for collaboration.

520.781 - 527.35 Guest

I was going to say, that's why you hire somebody from Berkeley to write your tickets because they come up with ideas like that.

527.63 - 545.306 Meredith Whittaker

And then that meeting just like disappeared. And I was like, well, shit, I need another job because I just burned someone who's going to figure out how to get back at me. And that's where my street sense comes in. So I... took a bike around campus for a couple of months and like asked every VP during their office hours, which is the thing they used to have, like, Hey, I want another job.

545.626 - 567.533 Meredith Whittaker

And I would just drop in and be like, hi, you don't know me, I'd like a job. And then I got a job doing basically standards work, like trying to push document interop standards. And so I started in standards, and then standards parlays into measurement pretty quickly, because measurement is ultimately the methodological standard. And I co-founded MLab.

567.573 - 589.216 Meredith Whittaker

And then that became the nucleus around which the research group I founded existed. And that was when the fresh air of these political and social complexities started hitting my technical work. And I was like, oh, standardization is power. Oh, creating data means you own the narrative. Oh, shit, none of this is neutral. All of this is contingent.

589.616 - 599.4 Meredith Whittaker

I started seeing the balance sheet. I started seeing the capital that was involved in running infrastructure. And I remember, around the time I met you, Linus, like maybe over a decade ago.

599.701 - 602.742 Tim

Yeah, it must have been about a decade or longer. Yeah, probably a bit longer even.

602.782 - 629.575 Meredith Whittaker

Yeah, aging is weird. But I remember I used to have these sort of smuggled balance sheets I took out of Google, like printed out, which showed how much the uplink bandwidth and the infrastructure and power costs were for MLab. And I would be showing this to all these civil society funders, being like, you can't be funding 250K. It costs $40 million a year in bandwidth just to do MLab.

629.675 - 640.401 Meredith Whittaker

Like, you guys don't understand the economics of this, because we were still in the era of civic tech, and "all we need is a good idea," right? And I think I was really lucky to get sensitized to the political economy and

641.038 - 658.139 Meredith Whittaker

And the fact that we're talking about infrastructure, capital, network effects, economies of scale and not some kind of brilliant idea that just ephemerally transformed our world and that our side just needs to wait to have one of those to get our turn.

658.752 - 668.214 Interviewer

That sounds like a quick upgrade from a customer support person to me. So you mentioned MLab, which stands for Measurement Lab.

668.534 - 669.055 Meredith Whittaker

It does, yeah.

669.095 - 674.916 Interviewer

Can you explain what this is all about and how it came alive and why you were involved?

676.776 - 687.119 Meredith Whittaker

It was a project. It was Derek Slater, Vint Cerf, Stephen Stewart, Sasha Meinrath at the Open Technology Institute,

689.384 - 690.605 Interviewer

The elders of the internet.

690.845 - 697.748 Meredith Whittaker

Yeah, some old internet guys and me. Let's say the elders of the internet.

697.768 - 700.369 Interviewer

You're one of them now.

701.729 - 723.898 Meredith Whittaker

And the lady with the couch. And the conceit there, which felt really simple to me at the time, is everyone was buzzing about net neutrality. And I was a Kool-Aid drinker. I still think the value underlying that kind of mythology holds, let's say: yes, we should not have one gatekeeper deciding which news source, right?

723.918 - 745.625 Meredith Whittaker

This is, you know, this is old-school common sense. It goes back to Western Union, which, in the US, would refuse to carry telegrams from political candidates the company didn't support, right? So there's a real, like, bedrock precedent here. And I was like, yeah, of course we need net neutrality. But, you know, neutrality itself is a really wafty sort of

746.412 - 766.052 Meredith Whittaker

It's a loose concept that needs to be anchored in some type of benchmark against which we can assess. You need to quantify it. And then you're flung into the abyss of philosophy the second you try to do that. But we started, and it was... I would say it was not necessarily built to succeed.

766.092 - 778.236 Meredith Whittaker

It was built as a sort of hypothesis project where we stood up three servers that hosted open source measurement clients. And the thing that we were doing was putting...

779.73 - 805.528 Meredith Whittaker

Servers that were all configured the same, that gave each client a dedicated slice of resources, and then way over-provisioned the uplink between the server and the switch, so that we could, for all intents and purposes, guarantee that any artifacts detected through the measurement methodology, like some TCP RTT or something,

806.288 - 811.351 Meredith Whittaker

were not interfered with by our infrastructure and weren't suffering for bandwidth, right?
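
The provisioning idea described above, per-client resource slices and an over-provisioned uplink so that observed artifacts can be attributed to the network path rather than the test server, can be illustrated with a toy latency probe. Everything here is a hypothetical sketch, not MLab's actual code: `tcp_connect_rtt` and the throwaway local listener are invented for illustration, and real measurement suites measure far more than TCP connect time.

```python
import socket
import statistics
import threading
import time

def tcp_connect_rtt(host, port, samples=5):
    """Median TCP connect time in seconds to host:port -- a crude RTT proxy."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        # The context manager closes the probe connection after the handshake.
        with socket.create_connection((host, port), timeout=5):
            times.append(time.perf_counter() - start)
    return statistics.median(times)

def start_throwaway_server():
    """Local listener standing in for a measurement server."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    srv.listen(16)

    def accept_loop():
        while True:
            try:
                conn, _ = srv.accept()
                conn.close()
            except OSError:  # listener was closed
                return

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv

if __name__ == "__main__":
    srv = start_throwaway_server()
    rtt = tcp_connect_rtt("127.0.0.1", srv.getsockname()[1])
    print(f"median connect time: {rtt * 1000:.3f} ms")
    srv.close()
```

Taking the median of several samples is the part that echoes the transcript: a single outlier caused by the measurement infrastructure itself should not dominate the reported number.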

811.871 - 816.714 Interviewer

So for practical reasons, you had unlimited bandwidth and zero latency to everywhere.

816.98 - 819.801 Meredith Whittaker

Yeah, and that's very expensive.

821.422 - 822.282 Interviewer

But cool.

823.182 - 840.329 Meredith Whittaker

We immediately DDoSed the servers. We were like, oh, shoot, because Vint Cerf put a blog post out and he's a big name. And then it was this job of years and years of getting servers into different interconnection points, because we wanted to measure across consumers. We're not just measuring to a server.

840.349 - 841.269 Linus

The topology of the network.

841.429 - 867.829 Meredith Whittaker

Exactly, and we needed to cross interconnection points to do that, because that's where you begin to see interesting business relationships and kind of feuds. And it taught me how difficult and contingent and ultimately subjective creating data is. And the process, the political process, of then defending that data as a reliable proxy for reality.

867.909 - 888.734 Meredith Whittaker

Because I would be at the Federal Communications Commission, and I had, like, Comcast across from me being like, no, we want to measure multi-threaded TCP, not a single-threaded burst, because you get, let's just say, a higher number that way; it's more forgiving. And we would be defending the methodology of our tests and the openness principles.

888.794 - 899.066 Meredith Whittaker

So it was open source code. The database architecture was open. I don't remember all the server architecture, everything. Old school.

900.287 - 905.689 Interviewer

Old school ideological commitment. You don't sound like a book nerd now. You've become a real network nerd.

905.709 - 909.75 Meredith Whittaker

Let's just say book nerds read and network stuff is written down.

909.87 - 915.032 Interviewer

It's like a real... I read the fucking manual.

915.172 - 920.755 Meredith Whittaker

I'm like, no one told me I was allowed to not read it and still have an opinion.

920.795 - 920.935 Linus

But...

951.594 - 953.037 Guest

It's scary, isn't it?

953.157 - 957.905 Meredith Whittaker

It is. I mean, it's a catalyst into today for my work.

960.226 - 979.792 Tim

You did that for, I think, a couple of years, the MLab stuff. And you also, at this point in time, I think, began the ties to the, let's say, broader internet freedom community. I remember OONI Probe at the time, I think, was another... They still exist, right? They do.

979.932 - 993.8 Meredith Whittaker

And we actually talked to them at Signal. They'll detect Signal blocking. And I believe... So one of the things MLab did was provide backend infrastructure for projects like OONI Probe. And I don't actually know what that relationship is today.

995.14 - 1014.004 Meredith Whittaker

But, you know, so academics and hackers and developers could write a test or, you know, kind of a measurement methodology, deploy a client to consumers, like you test from your laptop or whatever. And then we would be the ones paying for the bandwidth and infrastructure that would allow that to scale.

1014.645 - 1018.449 Tim

OONI is the Open Observatory of Network Interference.

1018.909 - 1020.351 Meredith Whittaker

Yes. If I remember the acronym.

1020.391 - 1045.329 Tim

Hi, Arturo. Disclosing a whole social network here voluntarily. But eventually something happened that is known as the Google Walkouts. Was that following your MLab activities, or during them, or did you rise even higher in the Google hierarchies before that happened?

1045.829 - 1080.279 Meredith Whittaker

Well, I mean, there are many. So I was not working on MLab by that point. Because there's a part of this story that I didn't cover, which is around 2014, 2015, we started seeing really interesting artifacts in the data, which showed essentially that at particular interconnection points between particular telcos, we were seeing drastic drops in performance.

1081.24 - 1097.986 Meredith Whittaker

And what we were able to do was look at every intersection of telco one and telco two, we see this drop. Intersections of telco three and telco two don't see this drop. And so what we're able to do is say there's actually a business feud there.

1098.306 - 1119.786 Meredith Whittaker

going on, and the interconnection point is the locus of that feud. And using traceroute data, we were able to say, look, it's not just that they're sharing a path in some other region of the network that's slowing it down. They're actively throttling. They're actively throttling. And we put together a report on that, and this is where I was sort of

1120.427 - 1139.573 Meredith Whittaker

I think, kind of sideloading myself into some academic research work. I was like, okay, how do we do this? Our methodology document, sort of, you know, everything was open. So we pulled that together, and it showed, I don't know if you remember, there was a sort of Comcast-Cogent-Netflix dispute. Netflix was shoving all its traffic through Cogent.
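
The pairwise comparison described above, the same access/transit intersection degrading while other intersections stay healthy, can be sketched in a few lines. The ISP names and throughput numbers below are invented for illustration; the real analysis worked over far richer measurement and traceroute data.

```python
from statistics import median

# Hypothetical sample: throughput in Mbps measured across each
# access-ISP / transit-ISP interconnection pair (names and numbers invented).
throughput_by_pair = {
    ("telco1", "telco2"): [2.1, 1.8, 2.4],    # suspiciously slow intersection
    ("telco3", "telco2"): [19.5, 21.0, 20.2],
    ("telco1", "telco4"): [18.9, 20.8, 19.7],
}

def flag_feuds(data, rel_threshold=0.5):
    """Flag pairs whose median throughput falls below rel_threshold times
    the median across all measurements -- degradation localized to one
    interconnect, not a generally slow network."""
    overall = median(m for samples in data.values() for m in samples)
    return [pair for pair, samples in data.items()
            if median(samples) < rel_threshold * overall]

print(flag_feuds(throughput_by_pair))  # -> [('telco1', 'telco2')]
```

The point the comparison makes is exactly the one in the transcript: if only telco1-telco2 drops while telco3-telco2 and telco1-telco4 look fine, the problem sits at that interconnection point rather than in either network generally.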

1139.894 - 1161.127 Meredith Whittaker

What we had done was expose that, and in exposing it, expose the principle that if you wanted to ensure net neutrality, you have to take the interconnection points and the interconnection agreements into account. And that led Obama to add interconnection to the reclassification under Title II, which kind of moved toward net neutrality and was then nullified.

1161.167 - 1172.956 Meredith Whittaker

But, you know, that was kind of the swan song, let's say, for my MLAB time, because, of course, that was a huge deal, right? Like, that's where the business model rubber hits the road. Yeah.

1173.596 - 1198.123 Meredith Whittaker

And, you know, I'll just shorthand it: that bought me a lot of capital at Google. And with that, I was already interested in a lot of things. MLab was sort of humming along, and it had grown from a hypothesis project to a global thing that was working and doing stuff. And I had started getting agitated by AI and a lot of these privacy and security concerns, being part of the community that you were part of, kind of thinking around, like,

1198.772 - 1227.463 Meredith Whittaker

tech alternatives, and getting less comfortable with the business model. And from there, I always had like eight different projects going on, but I went on to co-found the AI Now Institute, which was really trying to bring the conversation on what we called machine learning back then (AI was, like, a flashier term) down to the ground a little bit, and stop talking about superintelligence and start talking about, you know, political economy.

1227.895 - 1235.722 Meredith Whittaker

What are these technologies? How are they being deployed? How do you oversee them? Who uses them? On whom? And what are the social and political dynamics of that?

1236.483 - 1243.108 Tim

But the AI Now Institute is not hosted at Google, is it? I think it's at... which university is it again?

1243.689 - 1257.107 Meredith Whittaker

It was at NYU, and it moved out of NYU for reasons of that being easier. Okay. And cheaper, because those universities take like 40%. Okay. Word to the wise: your money.

1260.13 - 1263.234 Interviewer

What, do you think? Like, of whatever donations you get. Okay.

1263.434 - 1263.874 Meredith Whittaker

Yeah. Okay.

1263.914 - 1265.777 Interviewer

It's an expensive brand. It is.

1270.962 - 1298.589 Meredith Whittaker

For a reason, right? I mean, an NYU institute, that's something to have, yeah. I think, co-founded... I mean, the work was really good. And this was, again, kind of gaining the capital at Google, cementing a reputation, being able to get to a level where I had a budget. And then part of what I was always trying to do is: how much can I pull out of Google and get into the hands of people doing work that I think is cool? Like, how do we

1299.363 - 1306.108 Meredith Whittaker

like, carve tributaries in the massive river of this huge, rich company and, like, get it out.

1306.388 - 1309.45 Interviewer

So you started that institute at the end of 2017.

1309.851 - 1333.52 Meredith Whittaker

Well, beginnings are sometimes hard to date. It was born out of a request from the Obama administration for us to host one of its AI summits. I don't remember exactly the contours there, but it was in 2016. And so the idea then was, let's do this and let's do it big.

1333.78 - 1348.628 Meredith Whittaker

Like, let's make this the most polished, flashiest, hard-to-ignore kind of spectacle, using all the tools we can get, from hiring an events agency to doing good press, all of that. But let's also make this the one that is the most

1349.511 - 1373.728 Meredith Whittaker

engaged with these political topics, actually forcing the debate in that direction and making it face these questions that are much more grounded and, hopefully, much more healthy for our position on AI. So it was very successful, and from there we got offers of funding and a lot of encouragement, and the work just kept going.

1374.051 - 1398.19 Interviewer

Well, we in Europe are used to being a bit behind. But if I recall 2016, there was no discussion about AI in Europe at all. And so it's quite interesting to see that Obama actually, you know, decided that it was finally time to do something about this topic nobody had ever really talked about.

1398.491 - 1401.493 Tim

Germans are currently making up their mind whether that is a topic or not.

1401.593 - 1428.691 Interviewer

We're going to talk a lot about Europe today as well. But my question is, can you describe what kind of discussions were going on ten years ago in American society, such that this came up as a topic for the near future, which it actually became, a big one? Where did this discussion take place?

1430.034 - 1455.838 Meredith Whittaker

Yeah. To answer this question, I'm going to be drawing on a lot of the research and the historical work, kind of the work I've done since then. Because when this dawned in my life, like when it started being a thing, I had basically that same question. Like, what is this stuff? Why is it here? Why is it at Google? You would see a shift toward a new paradigm or a new trend

1456.323 - 1470.412 Meredith Whittaker

by the incentives that were structured into the OKRs, the quarterly goals, that kind of, you know, there'd be all these training modules that would pop up and it's like, make your software engineer into an AI developer, you know, and you'd be like, there's an incentive here.

1470.432 - 1477.254 Tim

Seems like they weren't too successful though, right? I mean, that's not like the biggest AI wave at Google.

1477.594 - 1485.495 Meredith Whittaker

Well, they were; that was the DeepMind era. Okay, okay, good. Sorry. I think they were ahead. It was them and Meta for a long time.

1485.775 - 1488.756 Tim

And the history... Didn't make it to the business though at the time, right?

1488.956 - 1514.755 Meredith Whittaker

No, but they're chaotic. It's a kind of court in decline. Actually, the business model part has never been their strong suit beyond search, to be real. Like, Cloud is the best technology presented confusingly, with 18 versions, all deprecated, right? But the AI stuff was...

1515.851 - 1538.715 Meredith Whittaker

If you look at the sort of recent history, which is something I've spent a lot of time on, because I think it gives us a really different picture than the Elon Musk narrative or the kind of popular narrative: there's a very important paper that was published in 2012 that introduced the AlexNet algorithm. And this was Geoff Hinton and his students.

1538.735 - 1543.236 Interviewer

Who just got the Nobel Prize in physics, interestingly enough.

1543.905 - 1545.286 Tim

Yeah, because there is no AI.

1545.406 - 1548.187 Interviewer

Yeah, because there's no computer Nobel Prize.

1548.227 - 1552.048 Meredith Whittaker

Well, if you claim that your technology is everything, then you can get a prize in anything.

1556.71 - 1558.931 Interviewer

Which is a proven point now, yeah.

1559.431 - 1570.355 Meredith Whittaker

And it was Ilya Sutskever, and then Alex, and I'm sorry, Alex, I am not grabbing your last name from the ether right now. But nonetheless, this was a paper that kind of

1570.835 - 1593.592 Meredith Whittaker

pulled together key ingredients that became the foundation of the AI boom now. So this is deep learning algorithms, which is the paradigm we're still in. It doesn't matter, you know, there's architectural sort of rejiggering, but nonetheless, it's deep learning. Huge amounts of data, so the, what I've called the derivatives of this surveillance business model. He found all the cats on YouTube.

1594.373 - 1619.805 Meredith Whittaker

I mean, yeah, that was Jeff Dean, I think. And then, you know, powerful compute, right? And they showed that using gaming chips and a lot of data, you could beat the benchmark, so score much better against standard evaluations than past models, and thus sort of catalyze industry interest in AI. And why were they interested?

1619.825 - 1636.517 Meredith Whittaker

I think this is a key point, because these optimum-seeking algorithms are really good at curating news feeds. They're really good at figuring out algorithms, right? And so I don't think it's an accident that

1637.438 - 1663.946 Meredith Whittaker

Geoff was immediately hired at Google, that Yann LeCun, who wrote the deep learning algorithms that became the seed of this current moment in AI, wrote them in the late 80s, was immediately hired at Meta. And it was the platform companies, with a real investment in, you know, squeezing more ad dollars out of the data, better ad serving, all of that, that,

1664.642 - 1677.758 Meredith Whittaker

you know, were first to AI. And this is, you know, Google with DeepMind, you see Meta and Google being kind of the leaders in this, you know, as measured by different evaluation standards, like the measurement question here is actually really

1678.157 - 1699.345 Meredith Whittaker

interesting, kind of troubling, until this generative moment, where I think that the ChatGPT Microsoft products kind of shifted people's perception of AI and what it can do and just rearranged the leaderboard. But the paradigm is still the same. And the paradigm is still that AI is applying old algorithms on top of the sort of

1699.835 - 1723.809 Meredith Whittaker

massive platform monopoly business model, availing itself of huge amounts of data, which is produced via this surveillance business model, and really powerful compute that was designed, built up, I would say, consolidated in the hands of these platform companies via the imperatives of the surveillance business model, right?

1723.969 - 1749.277 Tim

Would you say that, so clearly, I mean, you say, so here are Google and Meta that have these massive amounts of data, like larger amounts of data than probably ever existed or were in the hands of anybody or any organization. Would you say that they had amassed all this data and eventually learned like, okay, we probably can't handle this anymore.

1749.377 - 1769.661 Tim

So we're interested in this new paradigm to even monetize this data any further? Or was it rather like, oh, we have all this data, we're monetizing it well. Here's another way to monetize it on top of that. Because clearly, I mean, these deep learning algorithms are not of much use if you don't have large data sets.

1770.313 - 1773.115 Meredith Whittaker

And large compute, right? For both training and inference.

1773.616 - 1777.378 Tim

But is it the chicken or the egg? What's the chicken, what's the egg there?

1777.839 - 1801.576 Meredith Whittaker

I don't... I mean, I think... This is almost a perpetual motion machine, right? Like every quarter you have to report progress. You have to report growth. The logic is metastasis, right? And so you're trying to squeeze more out of what you have and you're trying to get more of what you have so you can squeeze more, right? There's also these sort of laws of scale. Remember big data?

1801.596 - 1819.984 Meredith Whittaker

We used to call it that. And so I don't actually know the answer to your question, but I think it... Like, which came first? But, well, I mean, the business model came first, right? Like, you had to have the ingredients to know what they did together.

1821.185 - 1844.772 Meredith Whittaker

And I think it was, you know, deep learning and AI has sort of languished in the backwater, with some interesting experiments through the 2000s, because its history is always promising too much and disappointing, since the mid-50s when it was invented. And I think the goal was really to sort of supercharge their existing business model.

1846.483 - 1864.816 Interviewer

I think deep learning was just the technology that perfectly served their current beliefs in that they have to work on the data, that they have to build up algorithms to somehow predict your personal future and be there with an ad before you even know it. And we've seen this everywhere.

1865.516 - 1893.127 Interviewer

As an ad everywhere. And we've also seen it in political influence, as we've seen in the Brexit decision, and also in elections. We heard there's going to be one in the near future in the US as well, that might be influenced as well. Let me check my calendar. I mean, I think we can, like, peel back also just this concept of, like, what is an advertisement, right?

1893.627 - 1912.282 Meredith Whittaker

It's an influence. It's trying to get, you know, is it trying to get you to buy something? Is it trying to get you to like something? Is it trying to get you to believe something? Is it trying to get you to vote a certain way? Right. And I think that, you know. The term advertisement is usefully deconstructed when we start to think about the connection between all of those.

1912.722 - 1936.801 Tim

The term advertising is one of the best tricks the devil ever pulled. Because, I mean, it's behavior manipulation. It's a stated goal, right? But I remember when the first debates came up about, for example, the Cambridge Analytica scandal. They used Facebook data to manipulate voters. And it's like, no, there's a red line over here. You can't manipulate behavior.

1936.821 - 1957.795 Tim

I mean, Facebook and Google were basically built to manipulate users' behavior and ideally, you know, capture their attention to change their behavior. Probably buying this shirt or the other or that shoe or whatever. And suddenly, you know, vote for somebody else. It's like, well, that's off limits. How could you build something like that? That's unheard of.

1959.496 - 1980.921 Tim

And, you know, viewing advertising as like, oh, this is just an offer. This is, you know, oh, we're just making our product. No, no, I have a limited amount of attention per day. You're capturing it and you're doing it with one simple reason and that is changing my behavior in the sense that you aim for. And I guess eventually it sounds like...

1983.502 - 2006.685 Tim

Eventually you questioned your Google career, right? Because, I mean, clearly you had an impressive career there in just a few years. But you began to question not only Google and the company culture, but apparently also a little bit the way you want to continue

2008.365 - 2035.455 Meredith Whittaker

with your, you know, making use of the influence and the knowledge you have. Yeah, I think, is that the walkout? I mean, it's all interlinked, and kind of periodizing your own consciousness is hard. But I think, you know, I'm pretty earnest, and I also don't come from that world, I don't come from that class. So there are often places where I just didn't...

2036.698 - 2054.954 Meredith Whittaker

I would take things sincerely or be really committed and then only realize two-thirds of the way through, whatever it was, like, oh, no one else really cares about this. They're just networking or whatever it was. So I think there was an element there where when I was doing MLab, I was like, I really want to win net neutrality. And then we won net neutrality.

2054.994 - 2072.911 Meredith Whittaker

But just at that point, I was realizing this is not actually the battle anymore. Google has a bigger network than Comcast. That's not the gatekeeper shit, right? But it was, you know, that was a sincere thing. And then I was like, okay, well, I can move money to all these, you know, cool privacy hacker projects. That, you know, that was sincere.

2072.931 - 2095.093 Meredith Whittaker

And then I got into AI, and I was like, okay, well... and I think this is something that did shift for me. I think I used to have a lot more faith in the power of ideas to influence real change, right? And I still think, you know, I spend a lot of time kind of thinking through discourses. How do we shape them? Like, how do we...

2095.595 - 2117.685 Meredith Whittaker

How do we kindly walk people into understanding things that, you know, they may have an interest in not understanding or they may have been, you know, misinformed about or what have you. But I began, you know, I began around the time I was looking at AI and sort of making all these cases that everyone loved, right? Like I was out there giving talks that were terrifying.

2117.805 - 2143.609 Meredith Whittaker

completely against the Google party line, and I was getting, like, applause, I was getting promoted. I was like, this is a perfect job. I envied you not only once. I'm the house troll. But then, you know, I was getting more influence. So I was becoming known outside and inside.

2143.629 - 2167.585 Meredith Whittaker

I was, like, the person you'd call into your team when it was like, oh, we want to implement this, and is there an ethical way to do it? And you would say no. I would be like, my dear friends, let us sit down. And then, I don't know, that was sort of my life. And we

2168.045 - 2185.998 Meredith Whittaker

kind of took the AI Now Institute and really did a lot to reshape the debate. Like I was very focused on that discursive intervention and how do we begin to talk about AI in a more realistic way? And that was working outside of Google, but it wasn't really influencing core decisions at Google. And that was kind of the

2186.298 - 2210.002 Meredith Whittaker

the thing I kept hitting up against more and more strongly, until I got a Signal message in late 2017 from Moxie... or no, no, I mean, I probably did get one from Moxie at that time, but this one was not that one. It was from a friend of mine at Google who said, yo, there's a really disturbing project that is hidden, that I'm very close to.

2210.902 - 2226.636 Meredith Whittaker

And you should know about it because you're the AI person who cares about this, and you have standing at the company around it. And this was the secretive contract that Google signed with the Department of Defense to build AI drone targeting and surveillance systems for the drone war.

2227.677 - 2245.554 Meredith Whittaker

And of course, like I was politicized post 9-11, post Snowden, like this was, you know, the drone war and the signature strike and all of that were really core in kind of my, you know, things that I ideologically rejected and, you know, felt like we needed to disarm, not supercharge.

2246.38 - 2269.153 Meredith Whittaker

And I had, like, a righteous anger. I was just like, fuck this. Because you're there running around, you know, trying to shape the discourse on AI. Yeah. And making them look good, right? Because they get to be like, look, we platform such heterodox voices, we're surely benevolent, right? And then I'm like, okay, and then you're inking this deal

2269.733 - 2277.883 Meredith Whittaker

with the DoD behind the scenes, for technology that, one, we know doesn't work for the purpose, right? Like, you know, it's not going to better identify

2278.416 - 2298.145 Meredith Whittaker

a worthy target of death, or whatever the fuck. You know, we know this is bullshit. But also, you know, this is a multinational company, like more than half the employees are outside the U.S. There is an issue with yoking yourself to one nation's government, not that many tech companies care that much about that.

2299.406 - 2320.07 Meredith Whittaker

And then there was just, you know, the structural danger, which is deeply acute, of a massive surveillance company with more data than the world has ever seen, more compromise than you can imagine, yoking their fortunes and a key dependency to the U.S. military, right?

2320.49 - 2327.753 Meredith Whittaker

And, you know, we know from Snowden that that's already, like, you know, seen as, you know, as long as it's a corporation gathering it, it's not, you know.

2328.153 - 2331.574 Interviewer

Was Google still running under the motto of don't be evil at the time?

2332.062 - 2334.79 Meredith Whittaker

They were, they were. And that was, we marshaled that actually.

2334.81 - 2337.197 Interviewer

It was kind of the end of it, wasn't it?

2338.492 - 2348.897 Meredith Whittaker

They quietly – like the lawyers removed that from the Google manifesto. It was slowly fading. Yeah, it was like – they were like, just don't open that closet. Which motto?

2348.957 - 2359.182 Co-Host

I don't recall any. It's like – Do you know something about this? Try not to be bad is the new motto.

2359.222 - 2364.464 Meredith Whittaker

Don't be as bad as – Yeah, like lay off. Lay off the evil. Yeah.

2366.212 - 2390.372 Meredith Whittaker

Yeah. So, I mean, there were a lot of old-school people there at that time who really did drink the Kool-Aid. And so, you know, I just put my energy into organizing against that. And that was when I turned toward labor organizing, and thinking through traditional methods and approaches to combating that type of corporate power, or, you know, kind of industrial power.

2391.089 - 2404.36 Meredith Whittaker

And that was the on-ramp to the walkout. So the walkout was like a big... that was like a rupture, like a manifestation. It got a lot of press, and it was the biggest labor action in tech, according to the papers.

2412.55 - 2441.463 Meredith Whittaker

I think it was November 11th, 2018. And everyone walked out for 20 minutes at 11:11 a.m. in their local time. So we called it Rolling Thunder. And it started in the Singapore office as I was going to bed in New York. And I was seeing the photos. And this was chaos. I hadn't slept in days. There's so many meetings. There's so many tears. It's hard to organize something like that.

2442.443 - 2463.975 Meredith Whittaker

And I remember going to bed and seeing the images from Singapore with like, you know, a few hundred people, Singapore, like hit my Instagram. And I was like, oh shit, this is going to be big. And then I woke up New York time at like 5am to go to our location in New York and like prep it, make sure the cops let us be there, whatever.

2466.196 - 2484.793 Meredith Whittaker

And then I just remember seeing, like... there's this little park near the New York office, and then it just, like, grew outside the park, and then no one could get into the park. And then I was looking at my phone, and there's live helicopter feeds, and we don't have bullhorns. Because I'm one of the speakers, and there's, like, speakers standing on chairs to address the crowd.

2485.533 - 2506.655 Meredith Whittaker

And then this, this guy, like some, you know, there's this sort of type, I don't know if in Germany you have this type, but they're like, kind of like, the leftist at every protest, and some guy had found out about it and came in, and I just remember this man I'd never seen handing me a bullhorn from below, and I picked it up.

2507.355 - 2508.756 Interviewer

A bullhorn like a megaphone.

2508.857 - 2531.265 Meredith Whittaker

Like a megaphone, yeah, yeah, yeah. And then we walked over to a Mexican restaurant and sat at the table and had a press operation that we were running. And what was cool was everyone organizing that was kind of a professional. Most of them were femme. And most of them had jobs at Google that involved organizing the company.

2531.485 - 2544.788 Meredith Whittaker

So organizing their comms, being an administrator for 13 different directors. This particular type of hyper-competence at coordinating activity across a number of people who you may not have direct power over.

2546.189 - 2549.492 Interviewer

And, for that matter, much more than developers.

2549.612 - 2560.965 Meredith Whittaker

I mean, let's just say, yeah, yes. Um, you know, you can, like somebody who can write an email and get someone to respond to that email by doing a thing is kind of a witch. Yeah.

2562.406 - 2585.754 Meredith Whittaker

And it was all these preternaturally competent femmes just turning that energy toward organizing across the company. Which was building on this sort of base of meetings and locals and all of this sort of work that we'd done to kind of form a discursive environment inside Google, where there were, like, weekly meetings to discuss the news, what Google could do, what campaigns we could do.

2585.794 - 2586.714 Meredith Whittaker

So we had this sort of

2587.574 - 2606.222 Interviewer

energy and solidarity already at that point. And then, what percentage of the Google employees would you say have at one point taken part in this? I don't know, we were really careful not to keep lists. Yeah, I mean, you can dream up a number now. You know, I...

2609.028 - 2623.374 Meredith Whittaker

20,000 was the number estimated from the sky photos and the local reports. We had local leads at every office, who we sent zip files of the kit: the flyers to hand out, the talking points, how to treat media, all of that.

2623.934 - 2641.547 Meredith Whittaker

They all had that, and then they organized their local. And then they had sort of reporting back in from press, reporting back into the kind of central organizers around numbers. And then we were issuing the press releases. But then, you know, there's employees of Google, and then there's contractors of Google, who more than double the employee count.

2641.648 - 2663.199 Meredith Whittaker

So, you know, that number is, and then there's people who couldn't participate, but were really supportive because they have, you know, they need their health insurance or they'll die or they are on a visa. And so it wasn't, I think there was a huge amount of support. I know that we were able to get Google to drop their military contract because there was enough support.

2663.239 - 2682.124 Meredith Whittaker

And there was, you know, like a spreadsheet of people who were quitting as conscientious objectors. Every week at the all-hands town meeting, there was a table we would set up with banners, and we would have questions. It was a very rigorous campaign. And it kind of laid the...

2683.078 - 2693.906 Meredith Whittaker

And I think it built some muscle that people are definitely using now, even if they don't call it organizing. It's like, how do we marshal the resources from this?

2694.467 - 2697.729 Interviewer

So did Google then still love you at that point?

2698.049 - 2703.834 Meredith Whittaker

Well, I think Google does still love me, but it doesn't love itself enough to admit it. Right.

2710.646 - 2716.391 Interviewer

They have issues, I think.

2716.431 - 2719.594 Tim

I do think that. You're like, come on, come on. Which side do you want to be on?

2719.614 - 2722.156 Meredith Whittaker

Which party do you want to go to? Let's be real.

2724.038 - 2753.207 Interviewer

I mean, it leads to an interesting point. Because I would say, looking from the other side of the ocean, I think to us this whole tech scene, the startups, the new stuff, the internet, everything that has developed in the last 20 years or so, always had this liberal touch to it. It felt as if it was mostly about an

2753.787 - 2776.45 Interviewer

open, world-loving agenda that's good for everybody. And Google kind of tuned in with their motto, and some other companies did as well, some not so much. But there was always this feeling that liberal thinking is at the core of everything that is driving the internet forward.

2778.372 - 2821.197 Interviewer

And I think we stopped thinking that now, because it looks totally different right now. Currently we have more the feeling that it's turned into a total right-wing thing, apocalypse, somehow. And I haven't really seen this coming. Can you explain what happened to this tech scene? What happened? Well, you know, I think about post-9/11 and a lot of the fights over surveillance and tech.

2823.327 - 2830.468 Meredith Whittaker

And there's a talk that... what is it, Frank Rieger and Rupert?

2830.868 - 2833.25 Interviewer

Sorry.

2833.45 - 2836.493 Meredith Whittaker

I don't. Yeah. That's my American. You'll do beautifully.

2836.513 - 2837.713 Interviewer

It's complicated for us.

2838.234 - 2847 Meredith Whittaker

Yeah. But I apologize for not getting that right. Yeah. They did this talk, you know, We Lost the War. I think it was 1994. No, 2004. Yes. I did the same thing. And...

2855.386 - 2883.899 Meredith Whittaker

There weren't... not everyone, I think, was equating the growth and monetary success of the U.S.-based tech industry with, sort of, you know, values of social progress. I mean, I think we can... yeah, yes, it was liberal, and then we can get back to a critique of liberalism and what have you. But I think there were people who were looking at the infrastructure, who were looking at its capabilities, who were looking at the gap between the promises and the realities

2884.873 - 2912.819 Meredith Whittaker

of what this tech did and calling that out. And I feel like when I entered into this kind of privacy security hacker development scene, There was a lot of that skepticism there around Google. That educated me a lot. There was a lot of skepticism around surveillance. I immediately recognized, yes, we need privacy because it doesn't matter if these people are good or benevolent.

2913.54 - 2929.597 Meredith Whittaker

What we're doing is setting up an infrastructure that could be turned over at any moment to another regime. Logically, all of that made sense. But I don't feel that until the Snowden revelations, any of that was anywhere near in the nervous system of a kind of tech consciousness.

2930.917 - 2945.444 Meredith Whittaker

And a lot of the work I have spent, you know, sifting through the crypto wars of the 1990s, and through what happened with tech regulation to set up these surveillance giants and to permit this monopoly platform business model,

2946.585 - 2966.826 Meredith Whittaker

has kind of looked at that gap between the rhetoric of, like, liberal, rights-preserving, open, free tech, and what was actually being built, right? And one of the things... if you look at, there's a scholar named Katarina Reiter, who I would really suggest. For the show notes, I can send some of these links.

2967.647 - 2983.072 Meredith Whittaker

But she did her dissertation looking at, you know, some of the negotiations in the crypto wars. And what you begin to see, and this is a thesis I sort of build on top of in some of my work, is that, yeah, we won liberalized encryption, right? By 1999, in the US, it was

2985.974 - 3006.936 Meredith Whittaker

finally legal to build, share, and implement strong crypto systems without approval from the government, without some threshold that made them useless, right? But the agreement there was basically, yeah, you can have encryption, but we're going to permit mass surveillance by companies.

3007.516 - 3019.806 Meredith Whittaker

And so you don't actually... You can just get the... We're going to permit... We're going to endorse the advertising business model. We're going to endorse... And I can actually... I can start this point over, actually.

3019.846 - 3029.978 Tim

Would you say those two decisions were... strategically interlinked or were they just coinciding?

3030.734 - 3047.505 Meredith Whittaker

I, you know, I can't say that there was a conspiracy. What I can say is that Katarina's work shows that, you know, there was clear, like, Microsoft saying, like, liberalize encryption. Don't worry. We're not going to encrypt all the data. We need it, right? You know, just come to us, you know, quietly, and we'll give it to you.

3048.465 - 3071.276 Meredith Whittaker

So instead of key escrow, instead of fighting over a backdoor, instead of doing this in sort of the public domain where we're kind of losing the fight on technical and other grounds... allow companies free rein to surveil, because that allows us to implement this ad-supported business model. And then the data agreements can happen behind the scenes.
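The bargain described here, strong crypto permitted while the companies can still read the data, can be caricatured in a few lines. This is the editor's sketch with a toy XOR cipher and made-up values, purely illustrative and not real cryptography; it only shows who can read what, depending on where the key lives.

```python
# Toy illustration of the two models. XOR is NOT real cryptography;
# it's only a prop to show where the keys live and who can read what.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, key))

msg = b"meet at noon"
key = secrets.token_bytes(len(msg))
ciphertext = xor(msg, key)

# Escrow-style model: the platform keeps a copy of the key, so
# "encrypted" traffic changes nothing about who can read the message.
platform_key_copy = key
assert xor(ciphertext, platform_key_copy) == msg

# End-to-end model: only the endpoints hold the key. With any other
# key, the platform recovers only noise (here: a guaranteed-wrong key).
wrong_key = bytes(b ^ 1 for b in key)
assert xor(ciphertext, wrong_key) != msg
```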

3071.636 - 3091.136 Meredith Whittaker

Now, I've completely compressed a very complex history into basically a meme. But I think the point there is that there was always that gap between the rhetoric and what was actually going on. And, I don't know, there's like a kind of internet people, right?

3091.216 - 3106.066 Meredith Whittaker

This type maybe misunderstood exactly, you know, who would have power over this technology, right? Like, encryption is liberalized, but it's not going to be applied to protect personal communication. It's applied to protect transactions, right?

3106.187 - 3121.645 Meredith Whittaker

It's not, you know... the people who get to choose whether or not it's used aren't us, in terms of this sort of mass infrastructure, in terms of the tech ecosystem that's being built, right, through regulatory decisions made by the Clinton administration.

3127.717 - 3160.458 Tim

So then there were a couple hundred thousand nerds worldwide that used PGP encryption, with additional software and plugins, to send an email with nonsense information to avoid government or private-sector surveillance. But it was probably never really a significant number of individuals... it never got to the point of having mass encryption out there.

3160.739 - 3181.759 Meredith Whittaker

Yeah. Well, I tried. It's just that my friend group didn't overlap exactly with those couple of thousand nerds. So what am I going to do, right? And I think this is the network effect. This is why it's actually very difficult to do that. And this is why if one of those actors that controls these infrastructures doesn't make the choice for us... it's really difficult to make that choice.

3181.779 - 3197.869 Tim

And none of these actors voluntarily got the idea, right? I think it's part of the founding myth of Signal that Moxie at the time wanted to implement the, it's now called the Signal Protocol. I still remember it as being Axolotl or whatever it was called.

3197.889 - 3203.332 Meredith Whittaker

It used to be Axolotl. And then it was, when Signal was launched, the RedPhone/TextSecure protocol.

3204.833 - 3226.207 Tim

Integrated into one app for iOS in 2013, it was changed, right? But initially, I believe, Moxie wanted to implement that crypto for Twitter direct messages, if I'm not mistaken. I think he was at Twitter before. Yeah. And so his idea to roll out... I mean, definitely he devoted

3227.248 - 3253.239 Tim

a few years of his life to implementing and rolling out mass end-to-end encryption, right? But I believe he wanted to do it at Twitter at the time. And maybe I'm wrong here, then our listeners will correct me two minutes into the show. But clearly none of the large... or... and I don't actually know that story for sure, so don't correct me, it's my mistake, I've...

3255.18 - 3265.304 Interviewer

I heard some other things, but I think one of the problems was that Twitter didn't see itself as a messaging platform, which was probably a mistake from their point of view.

3270.226 - 3285.145 Meredith Whittaker

these ideas, right? I remember a project where we wanted to use them as a key store for other services and sort of, you know, you'd always have these really exciting conversations with the security guys and then it wouldn't go anywhere because then, you know, the other guys would get involved.

3287.668 - 3309.48 Tim

The point I was trying to make is, none of the big platforms ever rolled out mass-scale end-to-end encryption until eventually WhatsApp did. And it took the Edward Snowden revelations and an ongoing scandal of, I think, roughly one year, until Facebook at the time said, okay, we'll implement this.

3309.72 - 3313.442 Meredith Whittaker

So Signal existed before the WhatsApp integration.

3314.503 - 3334.316 Meredith Whittaker

And the WhatsApp integration was driven by Brian and Jan, who are the co-founders. And my understanding there is that they were rushing to get that done before the Facebook integration, to make sure that they weren't selling something that would violate their principles. And I know Moxie was working on that. I remember that period of time.

3334.816 - 3356.57 Meredith Whittaker

And then, you know, Facebook bought it, and I don't know the terms of the deals there exactly, but it remains using Signal's protocol there. And it certainly was a post-Snowden moment, right? You saw Android

3357.61 - 3380.647 Meredith Whittaker

And iOS implementing full disk encryption, you saw Google encrypting, adding HTTPS for its networks. And I think a lot of this was just like, we need to distance ourselves from bad government spying by adding encryption that proves that we're not actually sort of part of the problem, that they have just sort of...

3381.929 - 3407.11 Meredith Whittaker

The bad government has attacked us as taxpaying corporations, taken this data that we really want to protect, but God, how could we have known? And so now we're encrypting things and that's ultimately good, right? But it's... It was a way of not looking at the full story of like, why is that there to begin with? And, you know, what else is not being encrypted? What other data is being given over?

3407.39 - 3418.663 Meredith Whittaker

And why do you have the choice to do that to begin with instead of a more socially beneficial democratic process of determining how we're comfortable with technology entering our lives?

3418.979 - 3442.795 Tim

I would probably put an even stronger perspective on it. It seems like the governments didn't really attack these corporations; the governments made their business model look bad. So now they needed to change something to convince people: no, no, no, your data is safe here. We'll encrypt something, and moving on, moving on.

3443.195 - 3462.108 Meredith Whittaker

For the gentle listener, I was kind of joking because that rhetoric around being attacked and being like, you know, like, oh my gosh, was like very much the mood at that time. I was at Google when Snowden dropped and I remember just things popped off and I actually had to get, I got on a plane.

3462.128 - 3477.278 Meredith Whittaker

I don't know, you all probably have a memory similar to me of like when the Guardian stories with the Verizon, like the Glenn Greenwald stories. It was night in New York. It was probably morning the next day you guys saw it. I remember sitting on my couch and being like, holy fuck.

3478.038 - 3490.186 Meredith Whittaker

And realizing just how big that was because it was the kind of thing we'd been talking about, speculating about in the rooms that you and I were in, Linus. And then it was like, oh, receipts, shit. And there was a lot of unclarity.

3490.226 - 3507.818 Meredith Whittaker

There was, you know, that PRISM slide, where it was like, is Google just giving them full access? People were rioting inside, you know, security engineers were threatening to quit. And then that morning I got on a plane to the Tor dev meeting. And so, yeah, in Berlin, actually.

3510.199 - 3534.671 Tim

I remember that time. And we at CCC, we really had to bite our tongues and write, like, a media communications strategy that said: nobody says "I told you so." We need to, I mean, we need to at least act surprised as well, right? And not say, well, we told you all along. Because, you know... so we really...

3535.458 - 3538.041 Meredith Whittaker

I mean, I give you all a pass because you did tell us all along.

3540.424 - 3554.8 Tim

But of course, it gave a lot of, you know, international media attention to our cause. And we, you know, we needed to play that public attention wisely. I think we maybe did.

3555.69 - 3578.672 Interviewer

I find it quite interesting how encryption as a topic has changed over time. It's just more or less 10 years ago that Facebook actually changed to HTTPS on their website by default. And so there was a time, not so long ago, you know, where most of the data was flowing around on the internet mostly unencrypted.

3578.992 - 3594.976 Interviewer

And although there were these already-mentioned crypto wars, you know, about general encryption, it was also always for nerds and for specific applications. Then it also got this nice paint with this whole cryptocurrency

3595.816 - 3626.133 Interviewer

craze going on, which made it somehow popular and almost took the word away, which we're still wrestling with. Yeah. And it was also the rise of encrypted messaging that was really giving it new fuel. So Signal was in the middle of all of this, as we already heard.

3628.214 - 3636.558 Interviewer

What's your understanding of what Signal is and what it's not and how the organization deals with it?

3637.999 - 3669.151 Meredith Whittaker

Well, I mean, I love Signal, and I'm really... yeah, it's the only cool tech company, in my view. And I think sort of boiling down what Signal is in one word is a little... I'll start somewhere and we'll end another place, because it's actually a number of things, and kind of represents even more. Signal started back, you know, in the late 2000s, right?

3669.171 - 3694.84 Meredith Whittaker

And we can, you know, we can date it to whenever, right? Signal as the integrated app was 2013, but, you know, Red Phone, Text Secure predated that. And this is, you know, there's no iPhone. Jabber, like, is the competition, right? It's like web client-based chat. There's no, you know, people aren't carrying smartphones. There isn't, WhatsApp doesn't exist. iMessage doesn't exist.


3695.336 - 3698.365 Meredith Whittaker

You have a very, very different marketplace.


3698.445 - 3702.517 Interviewer

Well, there were still ICQ and AIM, I think, at the time.


3703.157 - 3727.684 Meredith Whittaker

Yeah, yeah. I mean, I remember, like, sending text messages sometimes, but they were expensive, on my BlackBerry, maybe. But we're talking about a drastically different tech ecosystem. And this is particularly important in the context of messaging and communications apps, because, of course, you need a network effect for those to work.


3727.764 - 3744.661 Meredith Whittaker

No one buys the first telephone, right? Because you can't use one telephone. Your friend has to have a telephone. And all your friends have to have an app if you're going to use it, right? Particularly, it takes two to encrypt. You know, group encryption, it takes everyone in the group. And if everyone isn't using...


3745.342 - 3765.341 Meredith Whittaker

your app for communication, it's very difficult in a saturated marketplace, where you have a WhatsApp, where you have iMessage, where you have these normative models that people go to just to have their regular communication, to sort of introduce a new app for communication, secure messaging or insecure messaging, right?


3765.381 - 3774.305 Meredith Whittaker

Because people don't switch outside of the network unless their friends switch outside the network and there's a collective action problem there and an inertia problem there and all of that, right?

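The "no one buys the first telephone" point can be made quantitative with a standard back-of-the-envelope model: count the possible one-to-one conversations among the users. This Metcalfe-style sketch is our illustration, not a formula Whittaker cites.

```python
# Toy illustration of the network effect described above: the number of
# possible one-to-one encrypted conversations grows roughly with the
# square of the user count.
def pairwise_links(n: int) -> int:
    """Possible one-to-one conversations among n users."""
    return n * (n - 1) // 2

for users in (1, 2, 10, 1_000):
    print(f"{users:>5} users -> {pairwise_links(users):>7} possible chats")
# 1 user yields 0 chats: "no one buys the first telephone."
```

The quadratic growth is also why an incumbent with an installed base is so hard to displace: a challenger with a tenth of the users offers roughly a hundredth of the possible conversations.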

3774.985 - 3791.052 Meredith Whittaker

So when I think about, there's many things that are very precious about Signal, but the fact that Moxie carried it on his back for that decade and was actually able to keep it going and surviving without selling out, without selling data,


3791.601 - 3816.008 Meredith Whittaker

And actually creating something that is now able to scale to hundreds of millions of people means that Signal actually has a position in this ecosystem that makes it useful to people. That means that it's actually providing encrypted communication to people all over the globe, because their friends are using it, right?


3816.048 - 3837.374 Meredith Whittaker

And my contention here, and I'm willing to discuss this, is that I don't think we can recreate Signal, right? You could shift to it because it has that user base, right? You can introduce a new app, but how do you get people to use it without an OEM, without an existing installed user base, without some way of kind of making it useful to people?


3837.974 - 3882.719 Meredith Whittaker

Because again, it's, you know, one telephone, or, you know, a couple thousand hackers who all use PGP, but they can't talk to their dad on PGP, right? They can't talk to anyone outside of themselves on PGP. So, you know, Signal has kept this form that is very heterodox in tech. It's a nonprofit, and yeah, absolutely, the diametric opposite is the norm.


3882.98 - 3907.581 Meredith Whittaker

That's really, really important, irrespective of the flaws of the nonprofit model more generally. So it's achieved something pretty rare; it's the only thing like it in the ecosystem. And I think it also serves as a model for how we could think about building tech differently. Like, how do we disarm technology,


3907.941 - 3918.585 Meredith Whittaker

deconstruct the massive centralized power of a handful of platform companies that basically control most of the infrastructure and information ecosystem in our world, and that sit in our jurisdiction, in the US?


3919.445 - 3930.989 Meredith Whittaker

And how do we build other models that may be interoperable, that are more open, that are more rights preserving, and that aren't subject to the pressures of and incentives of the surveillance business model?


3931.629 - 3963.257 Tim

Now, I find it interesting: you mentioned the network effect that for a long time worked against you, right? You said, okay, it took Moxie a decade. Now, that network effect, you know, the EU regulatory bodies have an idea on how to weaken it by enabling messenger interoperability, thus pretty much trying to force the large messenger operators, be it WhatsApp or whatever else people use,


3964.678 - 3994.732 Tim

to offer an interoperability interface, so any new messenger would have it easier than Signal did to reach the critical amount of users to actually have the network effect. Now, I know Signal's position strongly opposes this idea, or you decided not to participate in this interoperability or make use of it.


3997.213 - 4025.182 Tim

There are two forces or two goals of Signal, I guess, that contradict each other here, and that is its security and building open communication systems. Maybe you can explain a little bit why, after having understood how hard it is to build Signal against a market like that, you would still oppose messenger interoperability.


4025.943 - 4060.045 Meredith Whittaker

Yeah. No, I like this question, because I want to clarify our position, which has a little bit of nuance. I don't oppose interoperability in principle. If the interoperability mandate of the DMA were... Interoperability needs to be an option, and it has to happen at this rigorous security bar.


4060.866 - 4068.671 Meredith Whittaker

You have to make sure that you're implementing metadata security, sealed sender. Basically, you're adopting the Signal...


4069.071 - 4092.628 Meredith Whittaker

privacy and security bar as the conditions for interoperability, that would be really cool, right? And I think that's the issue here. And I want to put an asterisk here for a moment, saying there are a lot of other complexities around policy. Like, you know, who deals with a law enforcement request, right?


4093.409 - 4112.066 Meredith Whittaker

Even if you have no data to give them, you can't just ignore that. Or, you know, if a user has a complaint, who do they write in? And if you're interoperating with a platform for communication that also has a social media arm, there's a totally different regulatory environment for, like, you know, a Telegram or WhatsApp with channels than for a Signal.


4112.646 - 4135.017 Meredith Whittaker

How does that sort of, you know, work in terms of... So I want to say this isn't simple, and there's a whole can of worms. There's a massive can of worms, as the EU often opens. But, you know, the conditions of interoperability are actually really political here, right? So, in order to interoperate with WhatsApp:


4135.577 - 4160.256 Meredith Whittaker

Am I going to be giving Signal user data to Meta? Well, that would violate the entire premise of what I'm spending my life's energy doing, and of Signal, right? Is Meta going to decide to cut off the account of one of our users? Who gets to decide that? Are they collecting other data because they aren't implementing some of our libraries, or whatever?


4160.716 - 4179.525 Meredith Whittaker

And so I think that's where the rubber meets the road. We have a duty of care to the people who rely on Signal. That is: we're absolutely not going to compromise you that way. Because, if we're going to be very real about it, there's a woman sitting in jail right now in the U.S.


4180.405 - 4199.175 Meredith Whittaker

because Meta turned over her Facebook messages between her and her daughter that documented them obtaining reproductive care in the state of Nebraska after the Dobbs decision. Like, that's the stakes of this conversation, even when we're talking about the technical details of interoperability, right?


4199.571 - 4203.454 Tim

The Dobbs decision is the one reverting Roe v. Wade, right?


4204.114 - 4204.274 Meredith Whittaker

Yeah.


4204.695 - 4213.962 Tim

Okay, so just for some of the listeners that are not that much into U.S. policy: basically outlawing abortion. Allowing states to outlaw abortion.


4214.262 - 4218.445 Meredith Whittaker

Access to life-saving health care that more than half the population may need.


4222.209 - 4241.203 Tim

So your answer in short is interoperability is fine, but you do not see any path currently being debated that would result in upholding the security promises that Signal has worked long for to be able to make to its user base.


4242.533 - 4263.48 Meredith Whittaker

We will continue to advocate for a path that raises that bar, that meets or exceeds Signal's bar. And if that succeeds, I'm like, yeah, I want to talk about that. Those are the conditions under which we would interoperate. So we don't take a stand against it. We just say: look, these are the complexities, and this is where


4264.9 - 4276.648 Meredith Whittaker

Signal stands with the people who rely on Signal, not with a sort of vision for some muddy middle where we're all interoperating, but we've sort of sold people out and made them susceptible to what we described with Meta.


4277.689 - 4288.376 Interviewer

Do people in the EU understand what you're talking about if you are offering these technical explanations why it's complicated? Because we have the impression that they don't really get it.


4288.516 - 4304.267 Meredith Whittaker

Is this mean or median people? Yeah. I mean, I think most people don't understand this at all, because they've got laundry to do and this isn't their area, right? Some of the politicians I've talked to seem to get it, but it's not...


4307.79 - 4334.865 Meredith Whittaker

I think that, you know, there's an inertia governing this process that means it's not clear how far these points have been absorbed into the bedrock, so to speak. I'd also want to be cautious as an American. I'm based here for now, but my instinct on the general understanding of people in the EU is not what I would rely on.


4336.065 - 4371.836 Interviewer

Which brings me to an interesting point because we are actually very interested in your view as an American, knowing how things work on that continent. What's your impression of how Europe deals with tech, these new technologies coming up and how it impacts society? Can you just give me a feeling for how this is to you? In a good way, in a bad way, whatever you feel, just to…


4372.862 - 4393.382 Meredith Whittaker

One meta observation, having been here for a bit of a duration now, and I've thought this before: it's interesting to think of Europe as one thing. Then you go to different countries and meet different people and you're like, wow, this is not one thing.


4393.762 - 4396.104 Tim

I mean, that's the same in the US, right?


4396.124 - 4402.446 Meredith Whittaker

It is in the US, yeah. I mean, you know, it's big and it's… But there is a common theme somehow.


4402.646 - 4413.091 Interviewer

I'm just focusing on what you can probably match to Europe in general or at least to the kind of discussions you have on a political level when you face EU institutions.


4413.812 - 4446.068 Meredith Whittaker

It's split in an interesting way. Because on one side, if you go to kind of the startup ecosystem, the VC ecosystem, like, that world, there's a lot of smart people and cool people doing cool things. And there's sometimes a bit of magical thinking that I see, which is really like... if we wish hard enough, if we're able to figure it out, we're going to be able to create competitors to the U.S.


4446.128 - 4450.99 Meredith Whittaker

incumbents, right? And we're going to have our own thing. Own search engine.


4453.17 - 4467.336 Meredith Whittaker

Which often just – sometimes I'm just like, okay, that's a money play, right? Like you get enough in your Series A, Series B, and then you'll get acquired and no one will do anything with it or you'll get rich or whatever. You may not necessarily believe that. Like markets float on hype and –


4468.436 - 4494.578 Meredith Whittaker

So, okay, but there is this thread where it's almost a willful misunderstanding of the reality of incumbent platforms, of the history that, you know, accrued that type of power to U.S. companies, and of the dependencies that Europe and most of the rest of the world have on these companies. Like, the three cloud companies based in the U.S. have 70 percent of the global market.


4495.984 - 4498.305 Meredith Whittaker

You have five major social media platforms.


4498.325 - 4499.365 Interviewer

Amazon, Google, and Microsoft.


4499.606 - 4522.295 Meredith Whittaker

Yeah, yeah. AWS, Azure, GCP. And then, you know, I think the other percentage is made up by U.S. companies as well, and then there are some Chinese companies. And then you have five platforms that effectively shape our global information ecosystem, like, our perception of reality. The four biggest are in U.S. jurisdiction, right?


4522.998 - 4527.239 Interviewer

Which five platforms would you... This is what's going to happen.


4527.259 - 4559.242 Meredith Whittaker

I'm going to remember four of them, because there's always, like, a last one in a list. So TikTok is a non-US one, right? And that's the one they all freaked out about recently. Because it's non-US? Well, yeah. I mean, I think, flatly, yes. Facebook, Instagram, X, and then... it's not Twitch, but anyway, there's another one. YouTube. And then there's Twitch, too.


4560.483 - 4581.251 Meredith Whittaker

For all intents and purposes, that's a huge amount of concentrated power that, again, relies on network effects, relies on economies of scale, relies on all kinds of global infrastructure. It's trillions of dollars that can't just be disrupted by investment. This is not just a social media company. Social media and platform companies, right?


4581.871 - 4583.472 Interviewer

And isn't Telegram one of them?


4584.193 - 4600.683 Meredith Whittaker

Well, Telegram doesn't, I think, run most of their own infrastructure. They don't have a cloud business model. And they also don't really have a business model. It seems like they have this crypto play, but it's not clear how that money moves. There's a lot of UAE investments.


4600.703 - 4605.926 Interviewer

So you're not talking about these big tech... Okay, you're focusing on cloud, not so much on the social media aspect.


4606.026 - 4627.398 Meredith Whittaker

And why is that difficult? Where's the normative shape of the tech industry coming from? If the cloud companies all of a sudden decided to cut off half their APIs and change their infrastructure, most startups in the entire world, including organizations like Signal, Telegram, whoever's riding on top...


4628.078 - 4651.783 Meredith Whittaker

you know, all their engineers' pagers go off, they've got to, you know, respond to that, right? It's unidirectional that way, right? If sanctions go up, you know, say, a wild dictator gets elected in the U.S. and decides that Europe is now sanctioned, right, and then Amazon can't do business with Europe, right? Like, what happens? Like, this is data in the U.S. That would never happen, yeah...


4652.484 - 4668.465 Meredith Whittaker

I make things up as a creative person. Remember, I come from literature and rhetoric. Just stories in your head. Yeah. And then you see the social media platforms, which we do know that there is a...


4669.386 - 4689.926 Meredith Whittaker

you know, a far right that has really focused some intelligence and attention on building alternative media ecosystems across those platforms, and kind of using the affordances of surveillance-advertising-driven media platforms to shape consciousness.


4690.086 - 4692.227 Interviewer

Back to the magical thinking of Europe.


4692.648 - 4703.534 Tim

I'd like to understand how you see... So in this world, there are a few Europeans that say, here's 50 million, let's compete with these guys.


4703.594 - 4712.859 Meredith Whittaker

And that's probably what the Europeans... It's like a $500 million European sovereign AI fund. And then you're like, but that's half a training run.


4713.559 - 4715 Linus

What are you buying with that?


4717.861 - 4744.468 Meredith Whittaker

Which is disturbing because it's like, okay, well, that's a lot of money also. Let's not be flip about it. And it could be going to really good things. It could be supporting interesting open projects. It could be supporting interoperable alternatives or smaller clouds for more heterodox news, open source projects. There's really cool stuff that is languishing without that money.


4745.348 - 4762.175 Meredith Whittaker

And I think, you know, where is that money going? Well, if you're talking about going into AI, it's going to one of those three cloud companies, right? It's renting infrastructure from Microsoft, Amazon, or Google for model development or for deployment, which is inference, right? And inference is really expensive.


4762.215 - 4773.86 Meredith Whittaker

You don't just train once; then, once you use a model, using it is way, way more expensive than normal information retrieval. And so, you know, it's just this massively computationally expensive thing.

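The point that inference cost scales with every single use, unlike a one-time training run or a cheap cached lookup, can be sketched with a toy cost model. Every number below is a made-up placeholder chosen purely for illustration, not a figure from the episode.

```python
# Toy cost model contrasting flat-cost information retrieval with
# per-token LLM inference. All unit costs are invented placeholders;
# the point is only that inference cost multiplies with usage.
QUERIES = 1_000_000

# Retrieval: roughly constant cost per query (index lookup).
lookup_cost = QUERIES * 0.000001

# Inference: cost per generated token, times tokens per answer.
tokens_per_answer = 500
cost_per_token = 0.00002
inference_cost = QUERIES * tokens_per_answer * cost_per_token

print(f"retrieval: ${lookup_cost:,.0f}, inference: ${inference_cost:,.0f}")
```

With these placeholder numbers the inference bill is four orders of magnitude larger than the retrieval bill for the same traffic, which is the shape of the argument: serving a model is an ongoing, usage-proportional expense, not a one-off.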

4775.461 - 4789.254 Meredith Whittaker

And, you know, you're not creating European sovereignty; you're creating a feeling of being, I don't know, like, not behind, a feeling of not being ashamed of being technologically...


4789.634 - 4794.216 Interviewer

But apart from the magical thinking, is there anything else you would stick to in Europe?


4795.116 - 4815.95 Meredith Whittaker

There's the other side, which is that I often find a much more sophisticated and clear-eyed view of these problems, right? Like, having this discussion about that concentrated power in the hands of infrastructure and media ecosystems is way easier in Europe. I mean, people feel it, right? They see it. And there's been a history of pushing back against the encroachment of U.S.


4815.99 - 4828.099 Meredith Whittaker

tech, both effectively and often very ineffectively, that I really enjoy. And particularly in Germany, there's a very high sensitivity to privacy, very often clear-eyed view.


4829.048 - 4843.288 Meredith Whittaker

on some of these debates, which doesn't always translate into policy, but there's at least... I find the intellectual environment around this stuff, when you talk to people who are knowledgeable and have thought about it, to teach me a lot and be really sophisticated.


4843.548 - 4866.905 Interviewer

Yeah, the GDPR is probably a German thing somehow in its core. For sure. So how does this affect your talks with European politicians, and how do you see the trends in regulation and trying to apply new laws and rules to this whole tech industry?


4868.362 - 4895.925 Meredith Whittaker

Yeah. I mean, you see both threads, kind of, you know, two wolves inside European politics: the one wanting its own tech industry, and the one wanting to make sure that they're not subject to a sort of U.S. tech colonialism. And I think you get, you know, some weird laws. You saw the AI Act, which went on forever and then kind of had this...


4896.545 - 4920.548 Meredith Whittaker

Last-minute brinksmanship around whether foundation models, these big LLMs that are now the trendy kind, should be included or not. And you often see bold regulatory attempts that then get kind of shaped in odd ways. Yeah. Trying to have it both ways, right? Like, how do we regulate the Americans away and get our own, right?


4920.648 - 4938.377 Meredith Whittaker

But how do we do that in a way that is reflected in principles, not in, you know, actually declaring that as an intent? And I think that is, you know, I think you're seeing a huge amount of money be spent by the U.S. companies in Brussels right now, which is also influencing things in interesting ways.


4939.138 - 4958.87 Meredith Whittaker

And then this is something I'm theorizing a lot in my intellectual work, and I think is really important. You also see what I'm calling the politics of intellectual shame be really pervasive in this conversation. And this is not just Europe. This is across the board. I mean that there is a real...


4960.689 - 4987.877 Meredith Whittaker

a real fear among a lot of people who are in decision-making positions, politicians or academics or whoever, and not even in decision-making positions, but it matters when it's them, of being stupid about tech, of being behind the ball on tech. And this plays right into patriarchal dynamics: like, men hate when someone else knows something more than they do, and particularly if that's a small woman.


4987.897 - 4989.959 Meredith Whittaker

I think we enjoy it right now.


4989.979 - 4992.241 Linus

Totally. We're having a good time.


4992.261 - 4993.542 Meredith Whittaker

Yeah. Well, you all are...


4996.625 - 5021.647 Meredith Whittaker

generally. And I don't want to gender this in such a schematic way, but there's an ego that can be very, very fragile here. The way I put it before is that it kind of turns uncertain men into yes-men. Like, they don't want to ask the dumb question. They don't want to be like, what's an LLM? What's a server? Like, how does that work? And that type of insecurity, the fear of being behind, the fear of being called, like, you know,


5022.367 - 5051.264 Meredith Whittaker

technically unsophisticated, or hampering progress, or putting your finger on the scales of science. Look, the Nobels! How could you stand in the way of all this progress, right? I think that really gives the upper hand to the companies and those who have an interest in creating products and, you know, growth and domination via these technologies. Because people really don't want to challenge them, because, you know, challenging their dominance or their plans


5051.784 - 5069.737 Meredith Whittaker

gets conflated with somehow being anti-science, or being stupid about tech, or not being smart enough to have a position on a topic. And I think that's something, because I kind of came up through Google asking every dumb question in the book, because I didn't come from that world, right? So I had to ask, like... how does a computer work? Right?


5069.837 - 5082.1 Meredith Whittaker

I'm like, can someone diagram what a function is? I don't know any of this stuff, right? But I think I have a sensitivity to that, because I also remember feeling it. I remember people being mean about it,


5082.16 - 5108.439 Meredith Whittaker

like, if I didn't know, you know... back in the day, when I was trying to learn this stuff. And I think that a discourse that collapses scientific progress into kind of the success of a handful of tech companies preys on that type of insecurity, and has created an environment in which people have no idea what AI is and are still professing boldly on how to regulate it.


5110.86 - 5122.286 Interviewer

So I read it as you think that the European positions might be slightly under-informed and probably not well thought out in the current situation? Yeah.


5122.614 - 5140.037 Meredith Whittaker

I should be clear: what I was describing, the politics of intellectual shame, is not unique to Europe, but it is present in Europe as well. And particularly among folks who feel like, you know, the Americans beat us, we've got to get ahead, right?


5140.816 - 5161.383 Meredith Whittaker

Where I see the European position being most, let's say, under-informed, or perhaps in some cases just pernicious, is in the chat control regulation and the desire, the apex of magical thinking, which is: let's rename a backdoor "client-side scanning," and then let's mandate...


5162.083 - 5180.551 Meredith Whittaker

Scanning everyone's private messages, comparing what's in those messages against some database of permissible or impermissible content, and then taking action on those in the name of protecting children, which is the justification during this instantiation of the crypto wars.


5181.091 - 5198.035 Interviewer

Let's stick to this topic, because it's still an ongoing battle right now. We are talking about this in more or less every one of our shows. And yeah, it's still totally unclear what's going to come out of it. How do you see this discussion evolving?


5199.155 - 5204.476 Meredith Whittaker

Well, I see this as an ongoing power struggle, right?


5205.256 - 5206.157 Interviewer

Between who?


5206.717 - 5228.529 Meredith Whittaker

Well, between... This is not a misunderstanding. I think a lot of the people pushing for this understand that backdoors are dangerous and understand that the pretext is flimsy. But that asymmetric power constitutes itself in part through information asymmetry.


5229.844 - 5248.159 Meredith Whittaker

And there's a deep discomfort that dates back to 1976, when Diffie and Hellman were trying to publish their paper introducing public-key cryptography, and the US government was trying to suppress it, trying to say, don't publish this, right? But then, you know, databases weren't quite big enough, networks weren't quite big enough or ubiquitous enough, for it to matter.

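For listeners who haven't seen it, the 1976 Diffie-Hellman idea she refers to fits in a few lines: two parties exchange only public values, yet derive the same shared secret. The tiny numbers below are ours purely for illustration; real deployments use very large primes and vetted cryptographic libraries.

```python
# Toy Diffie-Hellman key exchange (illustrative only; never use
# numbers this small, or hand-rolled crypto, in practice).
p, g = 23, 5          # public modulus and generator (toy-sized)

a = 6                 # Alice's secret exponent
b = 15                # Bob's secret exponent

A = pow(g, a, p)      # Alice publishes A = g^a mod p
B = pow(g, b, p)      # Bob publishes B = g^b mod p

# Each side combines the other's PUBLIC value with its own SECRET:
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob  # both derive the same shared secret
print(shared_alice)                # -> 2 with these toy parameters
```

An eavesdropper sees only `p`, `g`, `A`, and `B`; recovering the secret requires solving a discrete logarithm, which is what made the scheme so alarming to governments accustomed to reading traffic.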

5248.639 - 5267.667 Meredith Whittaker

But they were already looking at it like, oh, shit, we don't want this in public, right? And then you go through the 90s, and there's, you know, the Clipper chip and key escrow. And you have Stewart Baker writing in Wired magazine that PGP is just for terrorists, we have proof... no, no, he said PGP is for pedophiles, right? Which really echoes what we're hearing now, right?


5267.687 - 5286.537 Meredith Whittaker

Like, who even has a computer in 1994, I believe, when this op-ed is written? And then we have post-9/11, and then it's like, actually, PGP is for terrorists, right? And encryption is for terrorists. All the while, our dependency on digital infrastructures for communications is growing and growing and growing. Our dependency on digital infrastructures generally is growing.


5286.917 - 5308.171 Meredith Whittaker

And the need for encryption to protect commerce becomes existential to the internet industry. And then what do you do about communications, right? And I think... I think this has been an anxiety that is pervasive among those who, you know, law enforcement, governments, whoever, who feel that they need to constitute their power via information asymmetry.


5308.723 - 5330.94 Meredith Whittaker

And any encryption that protects people, not just commerce, is a threat to that, right? And so what I don't see is that we're going to win an argument, right? Or that we're going to win this via strength of argument. I do think we can fight. And I think we're in a position now where we're seeing it with chat control: I believe Hungary just tried to raise it and didn't get the support.


5331.411 - 5349.278 Meredith Whittaker

There was the Belgian proposal a few months ago, also didn't get the support at the last minute. And we just had the Dutch law enforcement authorities writing a memo to the government there saying, yo, don't support this. You're talking about a very dangerous backdoor that would undermine Dutch cybersecurity, right?


5349.899 - 5357.782 Meredith Whittaker

At the same time, we have reporting in the Wall Street Journal that provides a receipt for what all of us should have suspected all along, which is that


5358.862 - 5376.533 Meredith Whittaker

backdoors that were inserted in the U.S. telecommunications infrastructure for government intercept have been hacked by, you know, Chinese intelligence and maybe others, right? So, you know, I think at this moment, yeah, the facts are on our side.


5377.065 - 5386.047 Meredith Whittaker

And the fact that the facts are on our side is permeating into these discussions, and making it harder and harder for them to push it forward in the European Commission.


5386.127 - 5389.708 Tim

But that usually means they just do another attempt next year.


5389.728 - 5410.655 Meredith Whittaker

Exactly. And that's why I think we're not going to win. There's going to be another pretext if we win this one, right? There's going to be another angle if we win this one. We just have to keep building our muscle to sustain this fight probably forever. Because I don't think the will to power is going away. I think they're just going to keep trying to rearrange the reasoning.


5410.675 - 5414.977 Interviewer

But how do you deal with it, if you say the strength of your argument is not enough?


5415.217 - 5416.057 Meredith Whittaker

I do yoga every day.


5416.757 - 5417.258 Interviewer

Does it help?


5417.818 - 5418.718 Meredith Whittaker

I mean, yes, it does.


5419.719 - 5421.259 Interviewer

In terms of political discussions?


5423.34 - 5440.79 Meredith Whittaker

Well, in terms of political discussions, it helps that we're right. You know, we bring in a huge amount of evidence that a lot of people haven't seen in these political discussions. But I think, you know, our side has been on the back foot for a while.


5441.87 - 5462.102 Meredith Whittaker

In civil society, there has been a cutting of funding for privacy advocacy, you know, since around 2000. There's sort of a history here. And I think there was a move toward tech accountability that happened after the 2016 election. You know, there's the Cambridge Analytica scandal, there's all of this.


5462.142 - 5477.113 Meredith Whittaker

And it's like, okay, we need to hold tech accountable. And the way to hold tech accountable is to, you know, attack the business model, in my view, but there aren't that many pieces of legislation or proposals that actually do that.


5477.853 - 5496.028 Meredith Whittaker

Many of them sort of use the wrapping and the language of accountability, but are actually just expanding surveillance, right? It's like: we're going to hold them accountable, so we need a database. So we need to know who's logging into websites, so we can find the bad guy. We need to know what's in your


5497.695 - 5516.581 Meredith Whittaker

messaging, so that we can make sure that these tech companies aren't allowing crime on their platforms, etc., etc. So it was basically a hijacking of this, in many cases, kind of righteous moment, where people recognized that this business model was pretty harmful, to fulfill the wishes that had been pervasive since well before then,


5517.296 - 5536.014 Meredith Whittaker

at the same time that we're seeing privacy advocacy, and a lot of the things, Linus, you and I had been doing for a long time, receiving less and less support and sort of falling out of the limelight. And so I think it was in that environment that things like chat control, things like the Online Safety Bill and other


5537.22 - 5562.81 Meredith Whittaker

paradigmatic examples of this client-side-scanning-to-save-children meme grew up. There are many, many reasons I decided to move from being on the board of Signal to full-time at Signal. One of them was that I saw this, and I realized there weren't that many people fighting it.

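The client-side scanning idea she's describing can be illustrated with a toy sketch. This is entirely hypothetical, not any real proposal's algorithm: the point is only that the device checks content against a list of target fingerprints before encryption ever happens, so the check sits outside the end-to-end encrypted channel no matter how strong the encryption is.

```python
import hashlib

# Toy model of client-side scanning (illustrative only, not any real
# system): the client compares content against a set of target hashes
# *before* encryption, i.e. outside the end-to-end encrypted channel.
TARGET_HASHES = {hashlib.sha256(b"example flagged content").hexdigest()}

def flagged_before_encryption(plaintext: bytes) -> bool:
    """Return True if the device would report this message pre-encryption."""
    return hashlib.sha256(plaintext).hexdigest() in TARGET_HASHES
```

Whoever controls the target list controls what gets reported, which is the expansion-of-surveillance concern raised here.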

5563.49 - 5569.533 Meredith Whittaker

And that one of the things that I could bring was a staunch willingness to fight it.


5570.822 - 5575.844 Co-Host

And how do you do that? I mean, can you walk us through a day?


5575.864 - 5595.073 Meredith Whittaker

Open up my laptop. Obviously, it's not just me. Nothing like this is a singular thing. We work with a pretty broad coalition of folks. I'm sure many of your friends, many listeners perhaps are part of that. Signal doesn't have


5595.453 - 5621.395 Meredith Whittaker

a policy arm. It's a very lean, targeted, pretty senior organization, but we do work with people around the globe, EDRi in the EU and a number of other organizations, to keep tabs on what's happening. We're also in a good position: we're a nonprofit, and we are very committed to rigorous communication.


5621.475 - 5637.689 Meredith Whittaker

So we don't have a history of hyper-marketing, and we don't do hyper-marketing now, right? And so we're very careful that when we make a claim, when we make a statement, we're backing that with citations, and it's accurate. We're really marshalling the technical knowledge and prowess that we have


5638.49 - 5654.245 Meredith Whittaker

to, you know, I almost think of it as like clarifying the record, right? There's, you know, if there's a report that says client-side scanning is actually safe, we know it's not safe. Okay, well, there's an academic coalition that has written this letter. Signal can write a letter. We can, you know, begin to put a


5658.364 - 5685.273 Meredith Whittaker

Given the dynamics I just outlined, and then I do media, I do public speaking, I think a lot about how to tell this story in a way that isn't boring or alienating for regular people, particularly because the story on the other side is so arresting. It's like we have to save children from abuse. Yeah. And that every one of us, it like hits you in the heart, right?


5685.333 - 5706.82 Meredith Whittaker

Like you, you know, like myself, right? Like my amygdala is activated. I, you know, suddenly I just want to do something. I want to help, you know, what, give me the thing to do. How do we do that? Right. And then sitting across from that and being like, well, let me, let me tell you about a one-way function, right? Like you can't, that's not, that's not going to work. Right.

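The "one-way function" she jokes about explaining can be shown in a couple of lines. A generic sketch using SHA-256, chosen here as a familiar example rather than a description of Signal's actual protocol: computing the digest is cheap, inverting it is infeasible, and a one-character change to the input scrambles the whole output.

```python
import hashlib

# A one-way function: easy to compute forward, infeasible to invert.
d1 = hashlib.sha256(b"ten watch").hexdigest()
d2 = hashlib.sha256(b"ten watcH").hexdigest()  # one character differs

# Both digests are 64 hex characters, deterministic for the same input,
# and completely different from each other (the avalanche effect).
# Nothing about a digest lets you walk back to the input.
```

This is the kind of primitive that makes "let me tell you about a one-way function" such a hard pitch next to an emotional appeal: the guarantee is real, but it lives in the math.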

5707.56 - 5732.85 Meredith Whittaker

And so like, how do you, How do you enter into that debate in a way that isn't dismissing the very grim and real problem that is being evoked and make it clear that the solution to that problem that is being presented will not solve that problem, one, and two, will cause drastically worse problems for many people around the world? And that's the task at hand right now.


5734.232 - 5741.557 Interviewer

So basically the discussion is led by pointing the other side to the infeasibility of the approach.


5742.218 - 5766.134 Meredith Whittaker

Well, the infeasibility, the danger of the approach, a lot of evidence around the infeasibility of the approach that is either kind of willfully ignored or just not understood, and then figuring out how we explain that without Being either accidentally or genuinely callous about the concerns that have brought people to the table, right?


5766.274 - 5768.597 Interviewer

Then they will say, but we have to do something.


5769.237 - 5791.997 Meredith Whittaker

Well, how about funding social services? How about, you know, what do you do? And I mean, like, if we're going to go there, we're going to go there. Prince Andrew's walking around, right? Jimmy Savile's walking around. You know, like what are the infrastructures in place to make sure that when children are going through this, they're believed, they're protected.


5792.017 - 5809.228 Meredith Whittaker

You know, what happens when it's your priest? What happens when it's your teacher? What happens when it's your brother, right? Like these are the questions that are really hard to look in the face because they implicate social pathologies and interpersonal relationships and power dynamics that are really, really difficult and often...


5810.228 - 5827.197 Meredith Whittaker

you know, relate to emotionally challenging factors outside of that, right? Or people's past experiences or what have you, right? So you're going right into kind of, you know, very traumatic subjects. But I don't think we can have that conversation without having a real conversation, right?


5827.257 - 5855.594 Meredith Whittaker

And then, you know, when you begin to pull back the layers there, you say, oh, well, the UK has been pushing for client-side scanning, right? as a remediation to child abuse. But the UK government in 2023 funded social services at 7% of the amount recommended. So the roofs on the schools in the UK are collapsing. There isn't support for this.


5856.235 - 5882.062 Meredith Whittaker

And then if you look at, and I don't have public numbers to share, but I've had a number of personal conversations. Okay, well, how many law enforcement people, how many people are tasked with actually sort of pursuing the criminality that may be reported via online imagery? In some cases, it's two. In one case, it's two in one country, two people. Right.


5882.622 - 5900.092 Meredith Whittaker

So like you're not actually like if you begin to map this, what you see is a story that does not add up. And you see like like what you know, and this is where I get enraged because I'm like, you are fucking trading on children's pain to get your back door, whatever the fuck you want.


5901.223 - 5926.379 Meredith Whittaker

pretending that you're solving it, taking up the space for actual solutions that could actually help real children who are suffering now, and turning no attention to every glaring problem in this massive list, which is pretty obvious even for me, and I'm not an expert here, I've just sifted through this. So I think that's the dynamic we're walking into.


5927.745 - 5935.568 Interviewer

They might only have two people, but at some point in time, they might have doubled by adding another one.


5935.788 - 5956.237 Tim

I agree that, I mean, it is quite telling how much emphasis is being laid on, hey, we really need client-side scanning and then the world is going to be safe. And, you know, if you say, well, how about we, you know, we fund support or any kind of prevention, preventive activities in social care, it's like,


5956.457 - 5980.398 Meredith Whittaker

Yeah. Yeah. Well, sorry, we use the prefix online, so that's not our... And I think it's also, like, there's something... People gravitate to the abstraction, right? If this is online child abuse, then we don't have to deal with it in our real lives. It becomes an abstraction that we can almost blame on the same platforms that have been so unaccountable.


5980.438 - 5990.546 Meredith Whittaker

We can blame it as an internet phenomenon, not a phenomenon in like, oh wait, our church doesn't have the infrastructure to actually deal with this in a humane way, right? And I think that's a dynamic that we're also seeing here.


5991.169 - 6006.622 Tim

This is, by the way, one thing I find so interesting about Signal as a secure messenger. Well, it has become mainstream, but it has also managed to maintain a reputation of...


6008.383 - 6031.938 Tim

goodness, right? I mean, saying, like, okay, text me on Signal. Oh yeah, that's the secure messenger, blue symbol, looks nice, very friendly user interface. Or it would be, like, let's text on Threema. Oh, that's the complicated black one. Or, well, how about Telegram? And that's, okay, that's a completely different end of the internet.


6033.419 - 6057.612 Tim

And that makes me think of the curious case of Pavel Durov being detained in France and apparently at least charged, because they refused to cooperate in numerous cases. Why do you dare to come to Europe, Meredith?


6058.272 - 6083.493 Meredith Whittaker

Well, I'm a brave person, Linus. But to be serious, I think... This is one of the places where more public education is necessary because Telegram is actually, you said the other end of the internet, it is very, very, very different from Signal. So Telegram is a social media platform. It allows mass broadcast to millions of people. You can go viral on Telegram.


6083.513 - 6108.168 Meredith Whittaker

You can find strangers on Telegram via directories. There's a near me feature that will geolocate things happening near you. All sorts of things that are not private, are not secure, are regulated completely differently from private and secure communications like Signal, which is solely a private and secure interpersonal communications app.


6108.528 - 6109.909 Tim

What about Signal Stories, though?


6110.546 - 6119.171 Meredith Whittaker

Signal stories don't go viral, right? They go to your... It's as if I sent all the people in my contact list, one by one, a photo of something.


6119.231 - 6120.652 Tim

I don't get your stories.


6120.692 - 6124.255 Meredith Whittaker

They're so cute, Linus. I wish, I wish, like, it's, you know.


6124.275 - 6127.457 Tim

I mean, that feature was implemented. I was like, okay, where can I turn it off?


6127.497 - 6131.6 Meredith Whittaker

I'm sorry. Well, you're all missing my stories is all I'm saying, and they are pretty good.


6132.721 - 6134.483 Interviewer

Have you sent us some? No.


6134.763 - 6157.414 Meredith Whittaker

Well, have you activated them to check? But we do let you deactivate them forever and never bother you about them, which is part of the way that I think we maintain our reputation for not being shitty is we try to literally not be shitty. Yeah, thanks. Right? And when we're designing... Signal, we're actually very, very careful not to be a social media platform.


6158.315 - 6182.223 Meredith Whittaker

We think about that in the design phase so everything we do can be as encrypted as possible so that we don't know anything about you or as close to zero about the people who use Signal as possible. And what we do know is we can say, yes, this phone number did sign up for a Signal account. We know when that phone number signed up for a Signal account. And we know last time they accessed it. Right.

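The sum total she describes, per the unsealed requests published at signal.org/bigbrother, is roughly: whether a phone number registered, when, and when it last connected. A hypothetical sketch of that minimal record, with field names invented here for illustration, not Signal's actual schema:

```python
from dataclasses import dataclass

# Hypothetical illustration of the minimal account data described above.
# Field names are invented for this sketch, not Signal's real schema.
@dataclass(frozen=True)
class AccountRecord:
    registered_at: str  # account creation timestamp
    last_seen: str      # date of last connection

record = AccountRecord(registered_at="2024-01-01T00:00:00Z",
                       last_seen="2024-10-16")
# No contacts, no groups, no messages, no profile data: those are either
# end-to-end encrypted or never collected server-side.
```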

6183.503 - 6211.993 Meredith Whittaker

But we would like to even not know that if it were possible. On the other hand, Telegram is a social media platform which retains huge amounts of data, has a duty under law to cooperate in turning over that data, and has search functions, has directories so you can find new things. So it's a very different beast. And I think, one, because Durov has been very


6215.144 - 6238.953 Meredith Whittaker

I'm trying for a diplomatic word, has made statements that are not supported by fact around Telegram being private and secure, and has kind of taken on this yeoman defender-of-free-speech-and-privacy position. People often think Telegram is private and secure because it has a DMs feature, right? But Signal is just private and secure. So the TLDR on that is...


6239.813 - 6261.083 Meredith Whittaker

There's really no danger for a signal here because we are very, very far away from Telegram. And we have set ourselves up so that one, such cooperation isn't required. And two, such cooperation is not possible because we literally like you could put a gun to my head. I don't have that data. Whereas Telegram has servers and servers and servers full of that data.


6261.704 - 6275.105 Interviewer

Okay, leaving out Signal completely now, what do you think happened? Pavel Durov was put into custody, and he's now free on bail in France.


6275.64 - 6282.005 Meredith Whittaker

I have no idea. I mean, this is like... Didn't he send you like a telegram message? I haven't checked.


6283.906 - 6284.747 Tim

You missed the story.


6284.987 - 6303.816 Meredith Whittaker

Yeah. I mean, this is overlaid on the French legal system. I am not a lawyer, especially not in France, with some of the vagaries of their legal system, in which any judge can open an investigation and the basis for the charges will not be known until trial.


6303.856 - 6327.664 Meredith Whittaker

And we're looking at years and years until then, with me not speaking French well at all, with weird translation, so I want to stay away from speculating there. But what it looks like, based on the charges that were released in the press release, is that it was noncompliance with requests for data, and then a handful of other charges added on that aren't as severe as those.


6329.124 - 6331.204 Interviewer

So does Signal get these requests too?


6332.005 - 6358.534 Meredith Whittaker

You can go to signal.org/bigbrother, and every request that we have been forced to comply with (because we fight them) and have unsealed is posted there, showing exactly how close to no data we are able to turn over. And, I think this is interesting for some of your listeners, you see what the law enforcement agencies in these requests are requesting.


6359.055 - 6374.644 Meredith Whittaker

And it's often huge lists, right? Like massive amounts of data, which gives you a sense of just how much data like surveillance, like a telegram or, you know, another platform is commonly able to provide that signal is not.


6376.144 - 6384.328 Tim

All right. Oh, yeah. Oh, so it's actually every single request. I thought it would maybe be like, you know, aggregated.


6384.348 - 6390.69 Meredith Whittaker

Yeah, it's a PDF of the request. It's not a transparency report. It's transparency.


6391.971 - 6397.673 Tim

I like that.


6398.214 - 6401.515 Meredith Whittaker

You know, although it's just the ones that we can unseal. So we do have to go.


6401.555 - 6406.051 Tim

I would assume there are probably some with gag orders, right? Well, you wouldn't be able to say.


6406.492 - 6420.997 Meredith Whittaker

There's no gag order on me being able to say. And there are some, but we fight. That's the fight to unseal it. And what I don't recall the answer for right now is whether we're in one of those fights right now or not. But I'd have to check with my friends at the ACLU.


6425.759 - 6431.721 Interviewer

So Signal as a company, how does it work? I mean... It's a non-profit. Yeah.


6431.921 - 6433.202 Meredith Whittaker

So we're funded by donations.


6433.802 - 6434.462 Interviewer

Only. Only.


6434.662 - 6461.267 Meredith Whittaker

Only. Yeah. And we are thinking about in the future, maybe having a paid tier for some features, something like backups, encrypted backups, which we're building right now. Could we charge people for media storage or other expensive features? But that would be in addition to donations and squarely within the nonprofit structure that keeps us safe from pressure to surveil.


6462.512 - 6468.377 Interviewer

And how do you get your talent? How do you get people to work for Signal?


6468.638 - 6483.491 Meredith Whittaker

We pay very well. And we are... I mean, it's a really cool mission, right? So imagine the jobs in tech are kind of depressing in many cases. Not everyone wants to go optimize an ad server and then Signal...


6484.671 - 6503.74 Meredith Whittaker

You know, at Signal you can work on core infrastructure for dissent and human rights work and journalism around the world, without which a lot of those things would be deeply imperiled. It's a really cool thing to get to do and support. And we pay well.


6504.764 - 6510.673 Tim

So not only do you get respected for your six-digit salary, but for doing a good thing, earning that money.


6510.713 - 6512.736 Meredith Whittaker

It's the original Silicon Valley dream.


6515.06 - 6519.527 Interviewer

So the last place in tech where people are actually happy?


6520.782 - 6545.223 Meredith Whittaker

Well, I would never presume to speak to the consciousness of another person out there. But I think, yeah, I am very happy. I think a lot of the people who are at Signal are very happy. And I think it's also that we're part of a project, and this shows that what we have in tech, what's built in the tech industry, is not inevitable. Yeah.


6545.848 - 6564.975 Meredith Whittaker

There's a series of choices, a series of incentives, a business model that has shaped tech into the form we have now. But it does not have to be that way. Right. Like we can rewrite the stack. We can build alternatives. Nonprofits can work. Right. We need capital. We need will. We need talent. We need all of those things. But the thing we have now is not inevitable.


6565.035 - 6587.511 Meredith Whittaker

And I think, you know, I think of Signal as like a keystone species in the ecosystem, kind of like, you know, like setting the bar, kind of regulating the rest, right? Like, you know, you can have privacy. You can have, you know, the right to private communications. You can subsist outside of this paradigm. And I think the future I want is that it's not just Signal, right?


6587.811 - 6601.276 Meredith Whittaker

There are many, many other organizations and efforts sort of doing it differently, rejecting that paradigm, drawing in capital there and away from the other place and beginning to marshal.


6602.276 - 6618.38 Meredith Whittaker

the type of political will that is often very shallow, like the 500 million AI fund, but marshal it for something that is actually substantive and is actually making the kind of change to the tech ecosystem that I think we need to have a livable world.


6619.18 - 6627.502 Interviewer

And you mean not only a model for other communication companies, but also a model for any kind of technology company?


6629.236 - 6643.363 Meredith Whittaker

Yeah, well, I mean, we build a communications app, but we rely on telecom networks. We rely on server infrastructure. We rely on core libraries.


6643.503 - 6664.88 Interviewer

That's not what I meant. I understood you to mean that the modus operandi of Signal as a company might be something that other companies could also leverage and do. It's not only limited to some much-needed devices.


6665.44 - 6686.355 Meredith Whittaker

I do think that there's a model there. I'm interested right now in researching hybrid structures and tandem structures. Are there for-profit structures in areas of tech that aren't driven by surveillance?


6686.395 - 6707.751 Meredith Whittaker

Are there ways you could fund nonprofits, fund some of this core infrastructure, these libraries and other things that have been languishing for decades? How do you revitalize that? And then, are there ways to build truly independent infrastructure outside of the three-companies, five-platforms model?


6707.771 - 6710.733 Meredith Whittaker

I think it's just clearly critically dangerous at this point.


6712.554 - 6747.289 Tim

So when it's about building infrastructures in our hands, that's not going to get easier in an AI world, right? Where it's models and data that we need, huge investments into these models, first for training, then for operating them. Do you see any future for this whole AI thing in users' hands, in our hands, serving our actual privacy needs and, let's say, private and business needs?


6748.254 - 6777.451 Meredith Whittaker

Well, I think my answer to that is that that future will rely on laying an independent infrastructural bedrock and actually transforming some of the way we govern and think about digital technology generally, including being really attentive to things like how is data created? Who gets to decide what data we use to reflect our complex lives and realities? Who gets to decide


6778.348 - 6799.032 Meredith Whittaker

how patterns in that data are made sense of, what analysis is done to that data, and then what we do with the sense we make of it, right? What decisions we make, right? And so, you know, we do all that, we transform what AI is and what it means. Because no longer, you're no longer just scraping all the detritus off the stupid web.


6799.512 - 6823.904 Meredith Whittaker

which was deposited or created via this surveillance business model, packaging that in an LLM and calling that intelligence, right? You're actually having to grapple with the epistemic process by which data becomes a proxy for reality. And that proxy shapes our lives and institutions. And so I think AI itself, right now we're talking about these massive models, right?


6824.024 - 6826.965 Meredith Whittaker

These laws of scale, this sort of big,


6827.461 - 6852.604 Meredith Whittaker

American-guy dream of the largest in the world. But AI is a very slippery term. It's not a technical term of art. It can apply to many, many different things. And there are small models, there are heterodox approaches, there are expert systems, which they're now trying to bolt onto the side of generative systems, because, wait, probabilistic answers aren't true, so we need to bolt truth back on. And we're kind of repeating a lot of the


6853.385 - 6858.509 Meredith Whittaker

history of AI, kind of speed-running it in a search for a business model.


6859.629 - 6881.385 Meredith Whittaker

So my answer there is that like a lot of the things that need to be done to, you know, simply disarm and draw down the centralized power of these, you know, surveillance and infrastructure companies are the same things that would need to be done to redefine in a sense, what AI is and our relationship to how truth, how, you know,


6882.304 - 6896.431 Meredith Whittaker

decisions, how, you know, I don't want to use the word truth actually, but like how information is made via analyzing data and who gets to control that. And I think, you know, my sensitivity to data in that answer comes directly from my measurement experience, right?


6896.471 - 6919.584 Meredith Whittaker

Like where you, you know, one upgrade to the Linux kernel across our server fleet fundamentally changed the kind of data we were able to create. like, how it populated the schema and, like, meant that that data wasn't necessarily fungible with the data collected on the older version of the kernel, right?


6920.165 - 6934.797 Meredith Whittaker

And in order to solve that problem, I had to get a guy to go sit with the kernel maintainers for, like, two years to make sure that the update wasn't going to fuck up the way we got TCP dumps, basically. So that's... And that's...


6936.098 - 6960.27 Meredith Whittaker

then think about social data. Then think about data that reflects who gets access to resources. Then think about all of the other things. And I think it's actually an exciting idea to think on: how do we create systems where we're much more attentive to that, and recognize that it really matters how those choices are made, how data is created, who gets a say in it, and who gets a say in how it's used?


6961.576 - 6977.84 Tim

Would you say in that future, so there is these large gen AI models and whatnot. And I, well, from other discussions we've had, I know that you probably believe there is like a stronger...


6979.543 - 7000.569 Tim

future for specialized models, expert models, not the generative ones. Would that be your prediction of how this whole AI thing is going to evolve, not from the business-model perspective or from the political perspective, but from the actual technology and research perspective?


7000.609 - 7010.082 Tim

Do you think there is still going to be exponential improvement in the Gen AI world or do you think it's now the time for the smaller speedboats?


7010.848 - 7028.596 Meredith Whittaker

Well, I think it's definitely time for smaller speed boats. And I want to index on that word improvement. Because if we scratch the surface on some of these large models, some of which are generative, you begin to realize that a lot of the claims to improvement and accuracy are based on...


7029.396 - 7058.421 Meredith Whittaker

really narrow benchmarks and evaluations, right, that don't reflect the performance of these models. (And that's how Germans optimize their diesel engines. Exactly.) So there are things that gen-AI models can do, and I don't see them realistically going away. But I do see the struggle for a market fit that can produce the kinds of returns necessary to prop up


7059.099 - 7081.89 Meredith Whittaker

a massively energy-intensive, massively infrastructurally intensive, extraordinarily capital-intensive industry, right? So you have billions of dollars for a training run, huge amounts of energy and effort needed to create a model. But, okay, who's going to keep paying for a chatbot that's wrong? Right?


7082.45 - 7097.382 Meredith Whittaker

You know, and so I think like, there is a struggle for market fit. I think you see this with things like Microsoft Recall, where they, you know, push to implement this, you know, I don't know. Yeah. Microsoft Recall was supposed to ship with Windows 11.


7097.602 - 7101.271 Interviewer

It does, but it's not turned on by default now.


7101.556 - 7132.529 Meredith Whittaker

Yeah. And we won that little thing. But it's an AI system whose value proposition is that it will remember everything you were doing on your device for the last N months. That's a nice value proposition, I get it. I obviously don't use it, right? But how it remembers is really the key here.


7132.569 - 7145.165 Meredith Whittaker

It remembers because it's taking screenshots of your device every five seconds, you know, creating a library of those screenshots and accessing those as the data on which it, you know, is able to claim intelligent memory.

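The five-second screenshot cadence she describes implies an enormous archive. A quick back-of-the-envelope, where the 90-day window is an assumed figure for illustration:

```python
# One screenshot every 5 seconds, as described above.
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400 seconds in a day
shots_per_day = SECONDS_PER_DAY // 5      # 17,280 screenshots a day
shots_per_90_days = shots_per_day * 90    # 1,555,200 over ~3 months
print(shots_per_day, shots_per_90_days)
```

Over a million screenshots per quarter of continuous use, which is the scale of the local data store being described.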

7145.925 - 7150.167 Tim

That is not an efficient way to approach the problem.


7150.407 - 7174.078 Meredith Whittaker

And I don't need to know that I was doom scrolling. That's not a proud moment of memory for me. And to me, what that says, that's not a very useful purpose. It's probably going to be marketed to enterprises for worker surveillance, is my guess. But it shows that Microsoft is really trying to find a market for this, right? Because they clearly circumvented their QA process.


7174.118 - 7189.801 Meredith Whittaker

They clearly circumvented their security evaluation. There were a lot of things that clearly didn't happen, and didn't happen at a company that is actively, yeah. And they have OpenAI, they've invested a huge amount in Azure. They're building out infrastructure everywhere.


7189.841 - 7205.967 Meredith Whittaker

All of them are, but Microsoft is yoked to OpenAI. So they took the leader position for a second. And I think, given also that Microsoft is really trying to regain a good reputation in the security world, it


7206.488 - 7227.348 Meredith Whittaker

It's indicative of how desperate that rush to market fit and the AI exceptionalism that is driving it is that they just mess that up so egregiously. And I can hear some hacker in a Microsoft hallway being like, I don't fucking know. I just left that meeting. Because you can kind of sense how those things happen.


7227.694 - 7239.959 Interviewer

What would you say is the dividing line right now between useful applications for machine learning, expert systems, AI stuff and the hype?


7241.26 - 7251.044 Meredith Whittaker

Well, I mean, the question to dig into is useful to whom? Because hype is useful to investors, right? You know, yada, yada.


7251.064 - 7252.524 Interviewer

I'm not talking about, yes.


7252.905 - 7254.545 Meredith Whittaker

I mean, I think AI is...


7256.543 - 7267.425 Interviewer

In a computer science-y way. I mean, really making applications possible that haven't been possible before, that actually do useful stuff for people or society.


7267.805 - 7284.868 Meredith Whittaker

Yeah, I mean, one use I think about a lot, which is definitely useful to intelligence services: I'm sure we all assume that POTS telephony data is being collected en masse by every intelligence service that can, and has been for many, many, many years.


7285.948 - 7302.239 Meredith Whittaker

And that data was probably not that useful for a long time, because you're going to have to have a human review it, or something. Well, it's probably a lot more useful now that you can quickly transcribe it with AI and synthesize and search using these generative systems. Right.


7302.92 - 7315.404 Meredith Whittaker

So that's one example where I think it's probably almost certainly very, very useful and changes the calculus on like how dangerous this surveillance business model is as well. But who is that useful to? It's not me and you.


7318.708 - 7331.862 Interviewer

Yeah, you're going right into surveillance capitalism again. Yeah, no, well, that's my, I'm really good at making that turn. I'm looking for a rosy outlook into the future, like hope. Anything in store?


7335.866 - 7348.95 Meredith Whittaker

I refuse to plant my hope in delusion. Okay, thanks. And I am probably one of the more optimistic people you'll meet.


7352.711 - 7354.851 Tim

All right. Meredith, it was a pleasure.


7355.271 - 7360.573 Meredith Whittaker

So fun. Always so fun. Thank you for having me. It's been a really, really delightful conversation with you all.
