
How About Tomorrow?

How Dangerous Can AI Get, Dax is Down on DeepSeek, and AI First App Development

Mon, 03 Feb 2025

Description

Dax is finally warming up in frigid Florida, AI isn't as dangerous as everyone thinks it is, DeepSeek is full of holes, Adam's concerned about techno signatures, and what does it mean to have AI first app development?

Links:
- Lex Fridman Podcast
- Anthropic
- dax pranked by claude
- Granola
- Superhuman Email Productivity
- Dax's Remote Dev Setup video
- Bare Metal & Servers
- 9950x VPS Miami
- Cockpit Project
- Tailscale
- thdxr/environment
- tmux/tmux Wiki
- Neovim
- Syncthing
- AI Code Editor
- The editor for what's next
- Bun JavaScript Runtime

Sponsor: Terminal now offers a monthly box called Cron.

Want to carry on the conversation? Join us in Discord. Or send us an email at [email protected].

Timestamps:
(00:00) Shocking if true!
(00:28) Going for a walk without pants on
(02:22) What are the threats of AI to the world?
(17:15) Dax on DeepSeek
(27:49) Dropping Claude and assessing AI
(36:31) Techno signature follow up
(42:39) AI first app development
(55:35) Dax's remote set up video walk through

★ Support this podcast ★

Transcription

0.21 - 5.282 Dax

You're sick of it. This is our last episode ever. We're not going to do this podcast anymore. Adam doesn't want to talk to me.


29.153 - 46.469 Adam

I was just on the news. I just read about a plane crash, and that's not good. Yeah, I saw it last night, and I'm getting on a plane tomorrow, so really bad timing. Oh, man. Yeah. I guess two things collided in the air. I always worry about that. You always think, like... I don't know. Could something just run into the side of us?


46.489 - 51.172 Adam

Cause they didn't know we were here and they didn't look at the radar or whatever. Sonar. I don't know.


51.192 - 60.519 Dax

Yeah. And the situation, it was kind of crazy. It was right as a plane was landing and it was a Blackhawk helicopter that was like in the air, right over the ground in the airport.


60.799 - 66.703 Adam

Anyway, that's a damper to start out with. How are you?


66.923 - 74.108 Dax

I'm good. It's finally kind of warming up again. Like I went outside with no pants on today, which is good, but I'm still wearing, you know,


75.209 - 98.107 Adam

a long sleeve, long sleeve shirt. Yeah, I went for a walk this morning, outside, at 5:30 in the morning, because it was 50 degrees here, which is amazing. It's been so cold, and it should not be 50 before 6 a.m. That's unusual. Going outside when it's 50 degrees is, uh, it's really dangerous. You should have seen how I was dressed. Probably lighter than you're dressed right now.

99.821 - 116.556 Dax

Well, I'm going to Boston, so I have to like go and pack. Oh, no. All my like just heavy clothes from New York and my like mountaineering jacket. That's funny.


117.016 - 118.037 Adam

What are you doing in Boston?


118.177 - 126.845 Dax

Liz's friend is having an engagement party. So we're going for that. And then I'm going to visit AJ while I'm there. Nice. We're going to hang out on Friday.


127.935 - 134.864 Adam

That's awesome. Yeah. Tell him I said hi. Or if he's a listener. Hi, AJ. I'll just bypass Dax. He's a terrible middleman.


136.987 - 138.529 Dax

He is a listener. I'm sure he'll hear this.


139.23 - 141.513 Dax

Not until after we hang out. That's true.


143.379 - 146.72 Adam

Uh, I've been listening to a lot of stuff. Uh, we don't want to talk about AI more. Do we?


147.04 - 149.021 Dax

I just said, talk about whatever you want. Good.

149.121 - 180.877 Adam

Okay. Whatever. I was just listening to, uh... Oh my God. Uh, just, sorry. This reminded me of this stupid show Casey and I have been watching on Netflix, called Later Daters. And it's, like, these 50 to 60 plus... Yeah, it's these older people, 50s and 60s, dating. Like, divorcees, widowers, et cetera. And there's this woman... they have, like, a dating coach.

181.337 - 198.765 Adam

who seems to know stuff about relationships. And she encourages this lady to like open up conversations, like break the ice by talking about like a podcast you just listened to. But like the woman didn't quite understand. She doesn't seem to grasp the idea that you have to like actually talk about the podcast.


198.805 - 222.36 Adam

And she would just open all her dates with, so I was listening to this podcast with Matthew McConaughey. And that's all she would say, like that line. And like, you have to keep going. Yeah. It just made me think of it, though, when I said, I was listening to this podcast. I just wanted to break the ice with you. Yeah, pause. That's all. I was listening to the podcast.


222.44 - 235.437 Adam

I've been listening to all of Lex's stuff, because Prime's going to be on there, and I just forgot I liked his podcast. So I was listening to some of his back catalog, and he had the Anthropic CEO on. It was super interesting.

235.517 - 249.688 Dax

The Anthropic CEO seems solid. Like I don't get a sketchy vibe from him. He was always trying to like, I feel like he's trying to be really practical with how he talks about all this stuff, which is pretty different for most people in this space. So yeah.


250.373 - 267.446 Adam

Yeah, it was very illuminating. I'm not going to try and regurgitate it because it won't be as illuminating coming out of my mouth. But you should go listen to it. He's clearly very focused on safety. And it's just fun to listen to people building these things, running companies, building these models, talk about...


268.547 - 289.66 Adam

the risks and, like, the future and how it could play out. Because I always hear people talk about AI safety, and it's like, yeah, well, I don't know, it's all kind of vague and fuzzy. But he, like, talks about the specific categories of threat that they pose and, like, how to kind of mitigate those things. It's just super interesting. What is one example that you remember? Because I don't know anything about this.

290.103 - 311.211 Adam

Yeah, so he has, like... I can't remember the name of this. I think Anthropic came up with this system for categorizing the different levels of, like, threat that these models pose to society. Level two is, like, state actors could use it to further their goals. Level three is, like, normal people could use it to, like, cause harm to humanity.

311.551 - 333.698 Adam

And level four is that the AI itself, along with humans, is actually a threat. So, like... the AI can take its own actions, even like circumvent things. Like he talked about, they have to worry about, you know, they have these benchmarks or these tests that they do for safety to make sure that the model can't do certain things, like can't tell people how to make smallpox or whatever.


334.398 - 357.031 Adam

So they have these tests, but they have to worry at level four that the AI will just like sandbag and pretend that it's not smart enough, even though it is because it knows it wants to pass the test. Yeah. Which is super interesting to think about. What do you do if these models can scale to super intelligence, smarter than us? How do you control something that's smarter than us?


357.151 - 358.331 Adam

It's just super fascinating.


358.572 - 370.961 Dax

I don't really understand the lower levels, because what is... Does he talk about what, practically, is the difference between that and someone publishing a book that has instructions on how to make smallpox?

371.162 - 384.891 Adam

He didn't, no. Yeah, I guess, so what you're saying is, like, how is level three and below anything new to the world? Is it just more efficient? Like, a dumber person could figure out how to make an atomic bomb because AI is so smart?


385.091 - 400.927 Dax

Given the stakes, it's like, if you're someone that's like, oh, I want to, like, unleash smallpox on the world, but I'm too dumb, and I can't figure it out. You know what I mean? It's such an ambitious goal, so it just... Like, to be that ambitious but, like, not just figure it out without AI, you know?


401.228 - 416.485 Adam

So he speaks to that, like, the world is... The state that it is, it's mostly been safe because the overlap of people who are extremely intelligent and the people who want to do a lot of harm to people is a small overlap.


417.746 - 431.837 Adam

Generally, there's not a lot of people that fit both those things, but the fear is that AI increases that overlap because now you take people who want to do a lot of harm and you give them intelligence they didn't have. I guess that's the vague general idea. Yeah.


432.057 - 451.218 Dax

I think this is where I would disagree with the way all these people think about it, because I feel like they look at it from this really academic point of view, which is: I have, like, raw horsepower intelligence, and I have, you know, trained knowledge in something. And that's what gives me capability.

451.979 - 470.114 Dax

But in the real world, especially when it comes to violent stuff like that, none of that matters. It's all about motivation. Like if someone is really motivated, they will figure this stuff out. It's not like the thing that was blocking them was just like, oh, I'm not smart enough. You know, it's not really what the issue is.


470.414 - 479.585 Dax

I will agree that like a lot of crimes happen because they're more convenient and this would make certain things more convenient. I kind of see that point, but yeah.


479.866 - 491.51 Adam

So I remember when North Korea, there was a lot of tension with North Korea and like they were shooting a lot of rockets just to like flex their muscles. And there's a lot of talk about like how soon could North Korea develop nuclear weapons?


492.41 - 504.475 Adam

Is that not... like, that's not because they're not smart enough, but like they don't have the knowledge of how to do it, or it takes years to develop that technology. Is that not something AI could make faster?

505.205 - 528.341 Dax

Yeah, it could be fast. I mean, it was taking the current form, right? If you're someone that is trying to go from not knowing how to do this to knowing how to do this, what does North Korea have? They have motivation, for sure. This is probably like their top priority. They have enough funding to... Figure it out. So given enough time, they will. There's like no stopping that. Yeah.


528.542 - 536.89 Dax

Do certain tools help them do that faster? Definitely. The same way that Microsoft Excel probably helps them figure out stuff faster.


536.931 - 537.952 Adam

Yeah, sure. Okay.


537.972 - 568.849 Dax

So I get why this feels like really specific, but... If you're talking about that level of impact in the harm space, we should see the equivalent level of impact for people trying to do anything good, right? So I'm not like, I want to cure cancer. And I'm not like suddenly as a random person any closer to doing that. Yeah. So yeah, I think that side of it is a little overstated.


568.929 - 584.861 Dax

I think they're kind of like... I think they're just kind of in this bubble. That's kind of a little bit like, like feeding this narrative into itself. So like, yeah, that's why the whole safety thing, I don't, I don't fully get it. Like every technology makes certain things more convenient.


584.941 - 596.868 Dax

It's a lot more convenient to produce firearms today than it was a hundred years ago, like much crazier firearms. And yeah, you have to think about it, but I don't, I just don't see that happening.


598.399 - 621.47 Dax

acquisition of knowledge being the place that people get stuck. It's usually that the U.S. and, like, all the countries try to have crazy strict control over the raw material you need to make a nuclear weapon. That's probably where the bottleneck is. And even that, you know, the countries work around. There's always someone that's against the U.S. that has access.

623.17 - 625.971 Dax

Yeah, I feel like this detail is kind of irrelevant in the grand scheme of things.


626.011 - 643.559 Adam

Yeah, if I'm being honest, I don't really buy all the AI safety talk. Like, it's so hard to know what's just noise. Like, what is just posturing and, like, even competitive. Like, some of these CEOs, there's a bit of, like, pulling the ladder up, right? That's been kind of at least theorized. I don't know if it's been proven.


643.639 - 660.97 Adam

But, like, when the people that have the biggest AI companies training the big extensive models are the ones leading the charge on, we need to make this harder. Yeah. I don't know. Is there some other motive involved? But yeah, I guess like and then it seems like the other dialogue, you don't know what like is grounded in reality.


661.01 - 681.27 Adam

There's so many people that talk about AI safety that don't seem to have any idea what that looks like. It's like at the government level, like they have no idea. Like nobody has any clue what that practically looks like. So yeah, it just feels like that whole conversation is either not grounded in reality or might have other kind of like hidden agendas behind it. I'm not scared.


681.79 - 690.459 Adam

I say bring on the AI. It's like if it could solve problems and make things easier, yeah, I feel like if it gives the good guys more tools too.


691.741 - 715.712 Dax

Then yeah, what's the problem? I don't know. Yeah, it's just funny, because it's such a virtual thing. It's like, you imagine someone going to a store and buying a physical hammer and, like, smashing your head in. That's, like, so real. Whereas this is just entirely in the virtual space, and it's just, it's hard to imagine that, uh, you know. It just feels like they have a point.

716.512 - 729.861 Dax

It's not that knowledge isn't harmful or dangerous, but just compared to just physical... Buying a vehicle and ramming it through a crowd is just so much more effective than anything that's bottlenecked by your knowledge.


730.882 - 748.617 Adam

I guess on the digital front, though, there is a lot of havoc that systems could do, like banking systems. If autonomous AI stuff that had its own agenda... It could cause a lot of problems in the world, even if it's only digital and doesn't have physical form, right?


749.017 - 769.1 Dax

Even if it's not its own agenda, if there is some system that's now controlled by AI, now there's a whole set of new vectors of, well, how can someone manipulate this system? Just because it's hard enough for us to create security around deterministic systems. This is a non-deterministic system.

769.12 - 793.072 Dax

So you never know if a certain set of words in the right order will make it ignore all the safeguards you put in place. So that, to me, is a very practical application of AI safety. And that's not even, like, about the AI being capable. It's actually a flaw with it being not very capable, that it can be, like, reprogrammed by accident in these little ways.

793.132 - 794.913 Dax

So I get that side of things for sure.


795.253 - 806.155 Adam

I listened to another podcast of Lex's with, I think his name was Adam Frank. He's like some kind of a astro something, astrophysicist maybe.


806.415 - 807.176 Dax

He looks at space.


807.716 - 829.74 Adam

He looks at space, but he like, he like thinks about his job is like, I guess they just got the first grant for looking for techno... What did he call it? Techno... Signatures? Techno signatures. It's like bio signatures would be like looking at a planet and saying, is there any like... Are there gases that would prove that there's life on this planet?


830.661 - 836.562 Adam

But techno signatures are like, does this prove that there's advanced technology? So they're like actually looking at...


837.743 - 860.005 Adam

exoplanets in the habitable zone or whatever, uh, and trying to find, like, signs that they have created technology. I can't remember what some of them were. A super fascinating guy. Just go listen to Lex's podcast. What are you doing listening to us? Just go. Just listen. Like, the last five episodes are all good. I just listened to them all. We're just, we're now just a podcast that summarizes that other podcast.

862.92 - 863.741 Adam

That's probably a thing.


863.961 - 877.13 Dax

That's funny. That's pretty cool. I think there's something else. There's some clips of some other thing I was watching that was somewhat similar. So did you talk about what... What is like the primary thing they're looking for? Is it, are they looking for like Dyson spheres? Like what are they looking for?


877.17 - 884.958 Adam

No. So he did talk about Dyson spheres, which I didn't remember knowing what those were. That's wild. Which I think they proved you can't, they couldn't actually make a Dyson sphere.


884.978 - 889.583 Dax

Did you just say you didn't remember knowing what that was? Do you mean like you forgot you knew about it?


889.623 - 890.303 Adam

Yeah, I forgot I knew about it.


890.363 - 893.006 Dax

And then they remind you that you did actually know about what it is?


893.406 - 903.714 Adam

Yes. Listen, I don't have a great memory. Okay. And I know I've heard of Dyson spheres, but until I heard him talk about them on this episode, I didn't recall.


904.214 - 929.912 Adam

It's like basically this big sphere around your star, around the sun, that, like, captures all the energy from that sun. Which, that's another crazy thing that he talks about, is like the levels of civilization, the levels, whatever they are, the energy output. Yeah. But the technosphere thing, I think the main one they're looking at, what did he say? What did he say? It was not Dyson spheres.

930.092 - 952.211 Adam

It was satellites, radio waves. No, it wasn't waves. I don't remember, man. I'm sorry. It would be interesting content and conversation. I just don't remember. Oh, what would you personally look for? Well, let's see. What would I look for? If there was a civilization with technology, uh,


953.723 - 958.044 Dax

I would look for screens. I would look for... Wait, screens?


958.184 - 981.709 Adam

Oh, yeah. No, this isn't like they have images. He did talk about imaging. This is just... We're going off the rails. I could just talk about different podcast episodes forever. But he did talk about in the next however many hundred years that we'd be able to have Manhattan-size imaging, interstellar view, cities the size of Manhattan.


982.455 - 1001.771 Adam

what did he say, 26 kilometer resolution or something on exoplanets. They have this idea for, it sounds like science fiction for sure. And that's the cool thing about science fiction. That would be crazy. The way it worked is like, you send all these like sensors, cameras, I guess, way away from earth, the opposite direction from the sun.


1001.791 - 1017.806 Adam

I can't remember how far he said, in the solar system, but, like, a long ways. And they're looking at planets that are just past the sun, because of the way large bodies warp space and time. Yeah. So the sun basically, like, focuses the image, uh,

1018.527 - 1042.519 Adam

of the star just beyond the edge of the sun, and these cameras are looking at... Yeah, it's super wild. So using the sun is, like, this amplification of our ability to view exoplanets. Anyway, let's talk about something that's not on another podcast. My memory's not good enough for this exercise.
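[Editor's note: the "sun as amplifier" idea Adam is describing is the solar gravitational lens concept. A rough sketch, not from the episode, using the standard weak-field light-deflection formula from general relativity:]

```latex
% Light passing the Sun at impact parameter b is deflected by
\alpha = \frac{4 G M_\odot}{c^2 b}
% so rays grazing the solar limb (b = R_\odot) converge at roughly
d = \frac{b}{\alpha} = \frac{c^2 R_\odot^2}{4 G M_\odot}
  \approx 8.2 \times 10^{13}\,\mathrm{m} \approx 550~\mathrm{AU}
```

[That roughly 550 AU figure is why the cameras have to be sent such a "long ways" out: hundreds of times farther from the Sun than Earth is, looking back past the Sun's edge.]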

1042.539 - 1046.84 Dax

You know what's definitely on other podcasts? It's the whole DeepSeek thing from this past week.


1046.92 - 1059.644 Adam

Yeah, we didn't really talk about DeepSeek much, did we? Can you just run that on your local machine? Can I just start getting coding benefits from DeepSeek R1 without an API call?


1059.844 - 1067.667 Dax

It's not really practical. I mean, it's like a reduced version of the model, and it's very slow, and the hardware requirements are pretty crazy. So no, you can't.


1067.927 - 1073.732 Adam

Okay, so how do people use DeepSeek R1 right now? How does it exist?


1073.872 - 1087.784 Dax

Is it commercialized in any way? There's, like, a hosted one from the company, but it's in China, so people feel sketched out by that. But then, because it's open source, it's been re-hosted by a bunch of providers that you're familiar with. Like Cloudflare, I think, has a version of it.

1088.284 - 1088.825 Adam

Oh, okay.


1088.845 - 1098.643 Dax

There's been a few others. I don't think it's good. Oh, really? Well, it's like not better than anything else. It's just a... recreation of what's already there.


1099.103 - 1106.087 Adam

They did it for less. It's like the... Not Indiana Jones. What's the guy? MacGyver. They just MacGyvered it and they made it out of duct tape.


1106.107 - 1111.05 Dax

I don't believe any... I mean, it's just like... None of the information about it is true.


1111.31 - 1120.595 Adam

It's just... Oh, whoa, whoa. Stop. Hold on. Catch me up. I didn't know that it wasn't true. What are the facts that aren't true? Because I don't even know the facts around it, really.


1120.635 - 1136.246 Dax

I just heard it's cheap. They're claiming they trained the model for $5.5 million, which is, like... a crazy, uh, man, like, several orders of magnitude less than what OpenAI's models cost. Everyone was, like, dunking on OpenAI.

1137.046 - 1143.068 Adam

Is it a currency thing? Or were they talking maybe yen or something? No, that'd make it even cheaper.

1151.05 - 1155.431 Adam

And you just think, you think, why would they, oh, because it's like a competitive thing? They're trying to lie.


1155.451 - 1180.835 Dax

So the reason, it's very noisy. There is true interesting things that they did. Like, so you can't take that away from them. Like, it's impressive. But that doesn't mean what they're saying about how it was done is true either. The numbers are just like way too much of a lie. Like, there's no way that one, they're that low. Two, there's a lot of reasons for them to make it up. Right.


1182.315 - 1189.117 Adam

No one's been able to reproduce it, for one. That's not a thing. Could you make some of them explicit? Yeah, say some of the reasons, because I don't always connect dots.

1189.477 - 1194.799 Dax

Well, China's not allowed to have certain GPUs. What? Because of the export.

1194.819 - 1195.239 Adam

I didn't know this.


1195.259 - 1196.159 Dax

Because of the export controls.


1196.439 - 1201.841 Adam

Okay, you've got a lot of context here. You need to lay it all out. Spell the case out for why DeepSeek is a fraud.


1202.241 - 1208.824 Dax

On paper, NVIDIA is not allowed to export... certain levels of GPUs to China.


1209.345 - 1211.426 Adam

Okay. NVIDIA is an American company, right?


1212.867 - 1217.33 Adam

Okay. See, these are things I just don't know for sure. So you got to spell it out.


1217.77 - 1225.095 Dax

So they can't be like, hey, here's exactly what we used, if they're using a bunch of stuff they're not supposed to have. So that, like, throws a wrench into all of this.

1225.355 - 1230.679 Adam

Sorry, going back just real quick. The reason they can't export them to China is like American law?


1231.019 - 1235.062 Dax

Yeah, yeah. We banned exports of GPUs above a certain capability. Okay? Okay.

1236.082 - 1242.125 Dax

There's another interesting fact that someone pointed out recently that Singapore is 20% of NVIDIA's revenue.


1242.185 - 1245.907 Adam

Okay. Is Singapore in China? I'm so dumb. I'm so sorry.


1246.147 - 1249.948 Dax

Singapore is like a very small island nation in that area.


1250.909 - 1256.031 Dax

So it's China adjacent. Why would they be 20% of NVIDIA's revenue? That's a little weird.


1256.191 - 1261.093 Adam

Oh, so they're buying all the GPUs and then just taking them into China? Are they smuggling them?


1261.694 - 1264.255 Dax

These export controls practically are just not...


1264.995 - 1290.453 Dax

effective. Like, exactly what we were talking about earlier: there's always going to be a way, if you're sufficiently motivated, to get these things. Um, and of course there's a way around it. I could just buy a bunch of them and take them to China. There's no one in China that's going to stop me bringing them in, right? It's just that America, the U.S., is telling NVIDIA, you can't do this.

1291.798 - 1309.318 Dax

And the other thing I was thinking about was like, man, what a deal of the century. You could just be the dude in Singapore smuggling this stuff, adding a 20% whatever. And that's like, it's like 20% fee on like $20 billion of GPUs. Like that's, that is crazy. That is really wild.


1310.455 - 1331.287 Dax

So the point is, there's, like, so many reasons why, one, they wouldn't... well, sorry, one, they couldn't say what they actually did. And two, there's a lot of reasons to just make it up. And that's what they always do. They always, like, lie about the price of things to create competitive noise in the market. Yeah. It's a good strategy. It works.

1331.327 - 1344.772 Adam

Yeah. Okay. But DeepSeek put out a paper. I know this because all the software engineering nerds who act like they're smart enough to understand papers are like, oh, check out this paper. This is amazing. Like you don't know what the paper says. Just stop.


1345.772 - 1353.495 Dax

If you literally put the paper into DeepSeek and talk to it about it, you would learn more than just listening to people talking about it. Yeah, probably.


1354.055 - 1362.838 Adam

Okay, but question. Did they not have to outline in the paper what hardware they use and all that stuff? I guess they don't have to, but would they not generally do that?


1363.059 - 1379.316 Dax

They talked about their techniques, and their techniques are interesting and novel, so you can't take that away from them. But then they separately claim that we used these techniques on this hardware to achieve this outcome. But there's so many ways to lie about that.

1379.576 - 1391.941 Adam

If it's in the single digits of millions of dollars, I feel like there's somebody out there sufficiently motivated to reproduce. Can they not reproduce based off the paper or there's still some secret stuff?


1392.142 - 1416.773 Dax

The thing is, if you... Okay, let's say someone told you that, hey, I can run a SQL query that filters a trillion rows in half a second, right? You, as someone that understands this stuff, you're like, I'm not even going to waste my time reproducing that, because that makes no sense. Okay. Yeah.

1416.873 - 1434.823 Adam

So I imagine that something similar is going on here. So, so you're saying like, I have so many, I'm sorry, I keep interrupting you. I just feel like you're, you're moving a hundred miles an hour and I'm like at the stop sign still. So you're saying that like big companies just all believe this is a bunch of BS. Like it's just,


1435.543 - 1441.946 Adam

like the broader people in the know in the industry just dismiss this thing right out. And we're all excited about it, but they're like, yeah, whatever.


1441.966 - 1461.935 Dax

Yeah, I mean, just because it's such a hyped space, it's so hard to tell what's real and what's not. And also, the noise comes on both sides. So remember that we're saying novel because it's been published publicly. We don't know that OpenAI already ran across this and is using it to develop their stuff, the techniques in there.


1463.216 - 1482.872 Dax

So this might not even be a surprise to them, as much as, oh, they independently came across the same techniques, and they know that, yeah, it's not causing, like, a thousand-X decrease in training costs. But then the other noise is... and this is the part where I'm like, okay, this could be noise from the other side, but I did think about this when it came out.

1484.489 - 1503.947 Dax

OpenAI is claiming that they have proof that DeepSeek was trained on outputs of their models, or of, like, some maybe potentially, like, unauthorized access to stuff from OpenAI. Okay. And there's, like, some, like... again, this doesn't mean anything, but, like,

1504.741 - 1529.487 Dax

like the pseudoscience part of this is people were able to get DeepSeek to reply and make the exact same mistakes that O1 makes, which seems like maybe it's a coincidence. Maybe it means something. But yeah, the point here is like, it's just such a crazy hype space with a ton of money that there's like zero ability to draw any kind of, this is what's happening right now in the moment.


1529.627 - 1531.247 Dax

It's just impossible for situations like this.


1531.633 - 1544.041 Adam

Yeah, I guess has, like, OpenAI, so you said they said this thing about them using their outputs. Have, like, people like Sam Altman or any of the figures in this space come out and said anything about DeepSeek publicly?


1544.081 - 1566.14 Dax

You know how Sam Altman is. He just did the whole, like, generic, wow, it's really impressive, and I'm invigorated by the competition. You know, just like the fucking... to be honest, ChatGPT is more human than Sam Altman already. Yeah. Did you see what Claude did to me yesterday? No, what? Did you tweet about it or something? I can't believe it did this.

1566.18 - 1588.389 Dax

So I was trying to deal with, again, bringing it all back down to earth. I was trying to insert something into a Postgres database. And of course, on conflict, you want to do an update operation. Of course. I'm used to MySQL, where you can just say, on any conflict, do this operation. But in PostgreSQL, you have to specify, like, oh, when this conflicts, do that. When that conflicts, do this.

1588.47 - 1613.656 Dax

But I was like, OK, can I just on conflict on anything? Is that possible? And Claude, in a single reply, writes out, hey, yeah, you can do this. And then it writes out the query. And then right after it does that, it continues writing, being like, just kidding. That syntax doesn't exist. What? It said just kidding? Oh, my God. This tweet has 5,000 likes. I didn't even notice. What?

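[Editor's note: the upsert difference Dax is describing can be sketched with a small runnable example. SQLite is used here only because it ships with Python and shares Postgres's INSERT ... ON CONFLICT syntax; the table and column names are made up for illustration.]

```python
import sqlite3

# In-memory database standing in for Postgres; SQLite (>= 3.24) supports the
# same INSERT ... ON CONFLICT (col) DO UPDATE upsert clause.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT PRIMARY KEY, name TEXT)")

# Postgres requires naming the conflict target for DO UPDATE; there is no
# "ON CONFLICT (anything) DO UPDATE". Only DO NOTHING may omit the target.
upsert = """
    INSERT INTO users (email, name) VALUES (?, ?)
    ON CONFLICT (email) DO UPDATE SET name = excluded.name
"""
conn.execute(upsert, ("adam@example.com", "Adam"))
conn.execute(upsert, ("adam@example.com", "Dax"))  # conflicts, so it updates

row = conn.execute(
    "SELECT name FROM users WHERE email = ?", ("adam@example.com",)
).fetchone()
print(row[0])  # -> Dax
```

[MySQL's looser equivalent is INSERT ... ON DUPLICATE KEY UPDATE, which fires on any unique-key conflict; that is the "on any conflict, do this operation" behavior Dax was used to.]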

1613.896 - 1641.243 Adam

You tweeted this? I got to see this. I'm trying to do so many things. I was also trying to look up techno signatures because I feel so bad about not knowing. THDXR. Okay, so you just tweeted this recently? I tweeted it last night. Last night. Oh, wow, bro. Claude is straight up pranking me now. Can I make it do on conflict on anything? Yes, in Postgres you can use on conflict. Just kidding.


1641.263 - 1642.443 Adam

What in the world?


1646.073 - 1669.032 Dax

that's hilarious yeah what it's funny because we're so used to these models being quirky but like think about this in a traditional product like imagine you have a product and you have a button and the button is like click here to do something useful and you click it and it pops up being like just kidding we don't do that like that that would be so ridiculous to actually ship something that did that yeah uh like that's something the terminal would do

1670.233 - 1695.248 Dax

But this is, like, in Claude. And to be honest, I've just been annoyed with Claude more and more for the past couple weeks, and this to me was, like... Same. This is, like, the final straw, where I'm like, you're straight up just joking right now. Like, I'm gonna actually consider... I think I'm gonna stop paying for it. I need to, like, reassess what I'm paying for. Yeah. Uh, yeah. Because I just keep signing up for them, and then it's, like, it's easy to forget what...

1695.988 - 1718.21 Adam

The Claude thing. I heard somebody say, or somebody tweeted this the other day, that Claude was getting dumber. And he talks about it on the podcast. Apparently, Lex asked him a question from Reddit, which was like, why does Claude just keep getting dumber? And he kind of goes on to say that people report this on all the major models. This isn't just unique to Claude. Yeah, it's not.

1718.63 - 1735.483 Adam

He kind of explains like... It was kind of hand-wavy. I don't know. I didn't really take from it that I believe they don't get dumber. He said they don't intentionally... They never change the weights. They do sometimes change the system prompts and they change some other things. And I don't know. But he basically was saying like, most people, it's just like a psychology thing.

1735.624 - 1738.766 Adam

You're really impressed at first and then you just get less impressed over time.

1738.786 - 1743.83 Dax

That's what I was wondering. Is that the case? And the more you use it, the more you understand the boundaries, like...

1744.17 - 1763.877 Adam

But I do genuinely feel like it's gotten dumber in the last couple weeks. And I don't know what to do with that feeling. Because if I felt it, and then I read someone else felt it, and then I learned that Reddit feels it, there's something there, right? Because it's like things that I felt like it was doing a pretty good job... A few weeks ago, it feels like it's not doing as good.

1763.897 - 1785.234 Dax

Is it just a feeling? Yeah, I'm on the side that it's just a feeling. I mean, I think I would doubt that it's that clear cut. Like they must constantly be optimizing or like playing with the amount of compute they're allocating to inference. And there's like ways to like kind of make it more efficient for you to run.

1786.181 - 1794.352 Adam

Is that the kind of tinfoil hat theory that it's a cost thing, that they're just using less resources over time for inference?

1794.372 - 1817.59 Dax

They have to balance it. There's no way that on day one of releasing something, they nailed it and they never have to tweak that. I would be surprised if... there's not any thing where they explicitly know that, oh yeah, we did this because we made this trade-off. But I do agree that it must be a psychology thing as well, because if I really think about it, the thing that's not static is...

1818.795 - 1838.823 Dax

I'm trying to use this stuff more and more. And it's really hard to keep track. You know, it's that thing where, like, everyone's like, oh, yeah, I know what I eat every day. And, like, you know, like, I know I eat this many calories or whatever. But then you make them write it down. You realize, like, it's so different with people's, like, perception of how much they eat or what they eat is.

1838.843 - 1855.247 Dax

So I think it's kind of similar where I know that I'm using it, trying to use it more and more aggressively. And I know over time, as I get more comfortable with it or, like, becomes more and more of my workflow, I'm definitely pushing the boundaries of it more. That just happens with any tool. So it's hard to say that that's not a factor.

1857.951 - 1858.952 Adam

Oh, is that the end of your thought?

1861.281 - 1872.586 Adam

That wasn't good enough for you? No, it's good. It's good. I just thought you were, like, on a roll, and I'm, like, looking up techno signatures. Oh, my God. Just give up on techno signatures.

1872.626 - 1873.366 Dax

We've moved on.

1873.386 - 1885.831 Adam

No, I found it. I found it. So I'm going to tell you at some point. But I do want to respond to the last thing you said, which I totally knew what you were saying. Oh, I'm sorry. This has been a weird one. Yeah.

1886.411 - 1897.375 Dax

By the way, it just straight up smells like fire in my house right now. So I hope I'm not burning down. Yeah, that's not great. I think it's because Liz turned on the heater and like, you know, how's in Florida? You're not really supposed to use the heater.

1897.735 - 1908.48 Adam

Yeah, you never use it. And then when you turn it on for the first time, it does. It smells like there's an actual, like, wood-burning fire in your house. I know that smell. I did have a follow-up to what you said. Sorry, I had a question.

1910.541 - 1927.77 Adam

Do you know, when we were just talking about inference and the GPU resources allocated to inference, they have to use, I guess now, thousands of GPUs to do the training. Do you know, orders of magnitude-wise, what inference looks like compared to training resources, like infrastructure?

1927.95 - 1931.552 Dax

They still allocate most of their stuff to training, not to inference.

1931.812 - 1941.605 Adam

Okay, so if they have 10,000 GPUs, like most of the 9,000 are used for... I don't know the exact ratio, but I know it's more on the training side than on the inference side.

1941.965 - 1955.678 Dax

Okay. Yeah, I mean, it just makes sense because why when... If you don't win the model battle, the inference, the fact that people are using your product is kind of irrelevant. So it doesn't make sense to over-allocate there.

1955.738 - 1977.332 Adam

Intuitively, that made sense to me, and I figured that was the case. It's just interesting when you think about a business, the lifeblood of Anthropic or OpenAI is this huge farm of GPUs, and that huge investment in GPUs is useful for training new models. So they just always had to be training new models to get... the thing out of that huge investment. Right.

1977.372 - 1981.156 Adam

Which I guess they always will be training new models. So maybe it doesn't matter.

1981.176 - 2005.284 Dax

I mean, in the end, to me, so far, and I felt this from the beginning, this feels like the worst part of the stack to be in. It is the most difficult and the most expensive, and it is the most, like, commodified. So yeah, I mean, I think the thing that people point out with DeepSeek is like, it's impressive to create something as good as OpenAI's stuff.

2007.205 - 2023.058 Dax

It's totally realistic to assume making a model that's 1% better than OpenAI's stuff costs like $50 billion. Like, that's totally realistic. And that's, like, an argument in favor of being like, this is why OpenAI will, you know... it's not really a threat to them.

2024.019 - 2045.877 Dax

Simultaneously, it's also like condemning this entire business because it's just like if it's going to take that much capital to make these marginal improvements and it's like a crazy competitive space where the costs are being driven to zero and all these companies are competing. Yeah, it's just... I don't know. To me, it never made sense.

2045.957 - 2072.394 Dax

If I was an investor, this is not the part of... And I want to bet on this AI thing. This just feels like the worst place to put your money. It's so intense. So capital intensive, right? Yeah. When I see that, I'm like, I need to invest in someone that benefits from... having access to cheap AI models, not the people building the cheap AI models. And yeah, like VC Twitter, like it's funny.

2072.414 - 2087.497 Dax

They just go on these little things. And currently they're on this, they've swung back and forth. And currently they're all saying, oh yeah, like the application layers where you're going to make a lot of money. But like, you know, a couple of weeks ago they were saying the opposite. But I do, that does make more sense to me. Again, it's not,

2088.843 - 2106.623 Dax

I'm not taking the moonshot bet, because the moonshot bet is you invest in OpenAI and they eliminate the whole economy, which I get. And I like bets like that. It's just, for me, this one is not the one that I would go for. Yeah, something less crazy is probably going to be the outcome.

2106.903 - 2115.025 Adam

Yeah, and Sam Altman sucks. That's an easy way to not want to take that bet. Well, I mean, OpenAI or its competitors. It could be Anthropic or... Yeah, okay, sure, I guess.

2117.546 - 2123.168 Dax

One last thing on this. Yeah. I did come across something today. Do you remember Mistral?

2124.048 - 2125.428 Adam

Yeah. Whoa, yeah.

2125.648 - 2148.196 Dax

Okay, so... Where'd that go? They're like, this is maybe the worst company fundraise of all time because... They raised, like, $150 million on, like, a $300 million valuation or something. What? Like, gave up half the company? Yeah, so they, like, gave up half their company. And, like, that is nowhere near enough money to, like, play in this game.

2148.236 - 2151.816 Adam

Like, they're trying to do the frontier model thing. Like, they're on that.

2151.836 - 2162.623 Dax

Yeah, exactly. Oh, jeez. What the fuck are they going to do? I mean, maybe they're like a French company and maybe it's just like they're just going to serve the French market because I guess the company's there.

2162.744 - 2167.467 Adam

Maybe they're going to train models for a thousand times cheaper than OpenAI. Maybe they're going to go the DeepSeek route.

2168.088 - 2171.531 Dax

That's possible. But again, like, what are you going to give away if you need any more money?

2171.551 - 2172.091 Adam

Yeah, that's crazy.

2172.131 - 2178.095 Dax

If you need one million more dollars, like what? What deal are you going to make?

2178.996 - 2200.663 Adam

I didn't remember if Mistral was... There's been so many of these companies doing image stuff. I think the image space is even more messed up in my brain. And I thought they maybe were one of the image generating things, but no. I want to talk about the app layer, the AI app space, because that's also kind of top of mind for me. Maybe it's because VCs are excited about it.

2201.564 - 2230.122 Adam

And Marc Andreessen was just on Lex Fridman. And like I said, I've listened to all of his podcasts. I want to talk about my experiences and I want to hear from you how you think about those companies. But first, techno signatures. The main one that we're looking for is chlorofluorocarbons, because nature can't create those. That requires... some sophisticated technology.

2230.723 - 2250.042 Adam

And like he talked about Earth: we pumped so many into the atmosphere that we blew a hole in the ozone layer, and that would be detectable using the right instruments from far away. That seems pretty solid. I'm increasingly convinced that there's nothing out there, but... Really? Oh, because I'm increasingly convinced... I listen to a lot of sci-fi.

2250.062 - 2258.967 Adam

Now I'm increasingly convinced that it's everywhere. It's a dark forest. They're all out there. They're just being quiet. That's how I feel. Tell me why you feel that way. Why do you think there's nothing out there?

2259.447 - 2283.075 Dax

I desperately want that not to be the case. And I think in a lot of ways, it's unlikely that there's nothing out there. But man, given just the size of the universe, when I say nothing out there, I mean, even if there is, it's not in our... perceivable universe or whatever. And like, you know, the galaxies are separating faster and faster over time. Right. So like, there's no way we'd ever reach.

2283.115 - 2297.515 Dax

Yeah. So it just feels like, I don't know, I just get, like, a negative feeling towards that whole thing. It feels so impossible and unlikely, but again, not based on science, just based off of how I feel.

2297.535 - 2321.815 Adam

Yeah, just feel. I guess like, okay, I have a lot of thoughts. First, you just said that and it reminded me that I just heard how things can't travel across space and time faster than the speed of light. according to our understanding of physics. But the actual universe moves faster than the speed of light. So yeah, the galaxies moving apart are moving faster than the speed of light, right?

2322.175 - 2325.118 Dax

Because there's new space being created in between them.

2325.138 - 2325.538 Adam

I guess.

2325.859 - 2328.822 Dax

If you map that to velocity, I guess.

2328.902 - 2331.765 Adam

I mean, I'm just making this up. Now you're really losing me.

2332.946 - 2334.387 Dax

It's like if I magically created...

2335.448 - 2355.273 Adam

between you and me, more space, it's like we've moved further apart at a certain rate, right? But we didn't. Yeah. So maybe that's what he's talking about. I don't know. I just got ahead of my skis here, even just trying to think about what you're saying. But I think what Adam Frank just said on this podcast was that spacetime moves faster than the speed of light

2356.013 - 2372.373 Adam

like the expansion of it, but you can't, an object can't move across space and time faster than the speed of light. But if it is true that the galaxies are moving apart faster than the speed of light, then yeah, you could never get to another galaxy because we can only dream to ever move at the speed of light, which would be a crazy accomplishment.
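To put rough numbers on what Adam is describing (a back-of-the-envelope sketch, using a standard value of the Hubble constant): under Hubble's law, recession speed grows linearly with distance, so beyond the Hubble radius galaxies recede faster than light, not by moving through space but because new space is created in between.

```latex
% Hubble's law: recession velocity grows linearly with distance.
v = H_0 d, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}
% Setting v = c gives the Hubble radius, beyond which recession
% exceeds c purely from the expansion of space:
d_H = \frac{c}{H_0} \approx 4.3\ \mathrm{Gpc} \approx 14\ \text{billion light-years}
```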

2372.933 - 2390.021 Dax

But if it's moving faster... Yeah, unless you, like, do something crazy, like you violate or, like, you have a completely new model that, like, just... Yeah, it just totally breaks that. But, yeah, outside of that, you know, speaking, quote-unquote, practically, whatever that means in this space, like, yeah, maybe our galaxy is explorable.

2392.803 - 2396.725 Dax

And, man, like, even that just feels like... I can see there being nothing there.

2396.845 - 2404.45 Adam

So, okay, so my stance, I guess, it's that... It's about time. It's not about distance.

2404.49 - 2406.272 Dax

It's like... How long stuff has been around for.

2406.312 - 2419.288 Adam

Maybe... Yeah, maybe civilizations... And I'm stealing this from all the various science fiction writers and actual scientists that I've listened to in the last year. But... Yeah, maybe it's that like...

2420.552 - 2442.515 Adam

intelligent, uh, societies just don't last very long. So the chance that overlap is happening... like, you know, our whatever 100 years, 200 years of technological advancement here is just such a tiny little blip in the broader expanse of the universe, that the chance of that blip happening at the same time as a bunch of other blips is maybe super low.

2442.876 - 2461.514 Adam

But that maybe life is super common, just not intelligent societies that last long enough. If we can get past... Adam Frank talks about this too. If we could get past all the terrible things that could end our civilization, whether that's nuclear war, climate change, AI, whatever.

2461.534 - 2480.667 Adam

If we get past all those hurdles and we can figure out how to live for hundreds of thousands of years, millions of years as a civilization, then the chances of finding life maybe is more realistic because you're around long enough to see. I don't know. I'm just saying stuff that I don't have any credibility to say.

2481.747 - 2503.68 Dax

this is all just, like, different answers to the Fermi paradox thing. But to me, I find the problem with the Fermi paradox, which is, just to reiterate: given the size and the age of the universe, we'd expect it to be, like, full of life, even given how long stuff has been around and how much there is. And there isn't. So then you ask, okay, what are some explanations of that?

2504.4 - 2518.942 Dax

And there's a lot of good explanations. That's a problem. There's so many good explanations and they could all be true, but the result of all of them, is that life is exceedingly rare and you're unlikely to intersect with it. So that's what kind of bums me about this concept.

2520.304 - 2535.038 Dax

It bums you out because it would be nice to like... I don't want to die, but if I'm going to die, it's because of an alien invasion. I'm kind of down for that because at least I learned something deeply important for a few seconds before I get wiped out.

2535.778 - 2559.015 Dax

Interesting. Okay. Yeah. Like, I don't want to die in, like, a car accident. Like, it's done. Oh yeah, no, that's terrible. Like, yeah, like, I want... if I'm gonna die, at least give me some crazy existential moment. Okay, yeah. What's your, like, top three ways to die? What would be... I'm sorry. Yeah, yeah, no, I got you. Existential dread, and etc., etc. Okay, anyway, so

2559.896 - 2584.036 Adam

Let's talk about the AI app stuff. So this idea was seeded in my head just a few days ago when Andreessen was on Lex, and he talked about... I think the example they used was email. AI-first email. And how so many apps just have AI bolt-ons now. We have a little button in the corner that's like, ask AI. But...

2584.736 - 2605.089 Adam

companies that are started with the whole premise of like rethinking the product, the entire category of product with AI first. So he used the example of an AI company building an email client or something, which I've now, I think I've downloaded. I don't know if it's the one that they're invested in, but he kind of threw that out there and just said like all the different categories.

2605.509 - 2609.992 Adam

And then I heard you, did you tweet about this?

2610.686 - 2614.389 Dax

No, I told you something that you can't repeat. Oh, yeah. It's not public information yet.

2614.409 - 2631.283 Adam

Which I will not repeat. Thank you. Okay, that's what it was. Yeah, it was a DM. I knew some other data point hit my brain that was like, oh, the app layer of AI. That's a thing. And it's like when you learn a new word and then you start seeing it everywhere.

2631.703 - 2640.15 Adam

So could you tell me with your big brain that you've been thinking about this probably for like 10 years, could you tell me what is going on in the app AI space?

2640.35 - 2655.239 Dax

Yeah, so the way I look at it is there's a new capability. Again, I would categorize it in two categories. The boring part is what we're talking about now. And this is the bet that society will continue to be roughly the same. And this isn't like a...

2656.536 - 2681.291 Dax

know, truly disruptive, like a totally disruptive thing. Um, you're speaking to, like, the bolt-on thing? Or you're speaking to, like, the commodity of, like, foundation... You're just talking about building a traditional product, but thinking through AI. That's, like, a not very bold way of looking at all this. So part of me, like, doesn't want to engage with that, because like I said, I don't believe so far in that. It's, like, a much bigger bet

2682.071 - 2703.978 Dax

but I believe generally that's where you should put your attention, and things that kind of fall in that category. That said, let's say this ends up, you know, not being that crazy thing, and this is the direction things go. So right now we're in the era of: there's a new thing and nobody knows how to build good UX around it, right? If you imagine when, like, the iPhone came out

2704.798 - 2723.466 Dax

pull to swipe... or sorry, pull to refresh. Oh yeah. Someone had to come up with that. And the moment they did, it was so obvious. I think we're, like, in that phase where almost every single product that added AI is just a stupid-ass little button that's on top of other shit. And it's just kind of getting in your way, and you're always accidentally clicking on it.

2723.786 - 2748.654 Dax

So that's just like, that's the era we're in. But at some point we'll see stuff that is like, oh, obviously. And I think we're actually already starting to see some of that stuff. Have you seen this Granola AI product? No. Okay, so I think it's a brilliant example of what you're talking about, rethinking products from an AI lens. And they did it in a way that is very well executed.

2748.754 - 2768.623 Dax

It's not the first thing you would think of, right? But they were like, okay, problem existed forever. how do we make people who take notes for meetings, how do we make that easier? Boring problem, been around forever, years of products that do that. Bunch of AI products that do that, right?

2768.723 - 2791.62 Dax

There's a bunch of AI products that are like, I'm Bob the AI and I'm a bot and I've joined your Zoom call and I'm here to take notes. And it's just, like, weird, totally unnatural, not relating to your current habits at all. Weird social norms around it. Like, it's just not a good way to introduce this idea to people. So what this product does is it runs on your Mac.

2792.441 - 2812.087 Dax

it records all the audio from your, um, your meeting. Yeah. From anything that's happening. So we're like, we're also in this era where like no one's doing like direct integrations anymore. Cause AI can just handle a like raw input. So if you can record audio from your Mac, you now support every single app.

2812.607 - 2813.587 Adam

That makes so much sense.

2813.707 - 2829.776 Dax

The box, right? Yeah. This shows up in a bunch of different places when people are putting out AI products. It's totally invisible and it's totally out of your way. They give you a typical notepad you take notes on. Okay. You take your shitty little notes, you know, your comments here and there, whatever.

2830.857 - 2851.477 Dax

When the meeting is done, AI will go through your notes and augment them with what it knows about the meeting. So if you're like, oh, priority, it knows what you were talking about when you said this is a priority, and it'll, like, make your notes much nicer, in just a one-step process. So it doesn't feel like an AI product. It just feels like a magically good product.

2851.657 - 2869.514 Dax

I take notes with the same habits that I've had forever. And then at the end, I just get much better notes than I would do with any other app. And I think this is kind of what you're talking about where they're re-imagining it and they've done it in a way where it's not like, you need to chat with my AI bot, right? It's like totally invisible. Super smart.

2869.534 - 2887.547 Dax

So I think we'll start to see products that they are technically powered by AI, but it's invisible. The only way you can tell is the outcome or the quality of the product is much higher just because all of these structuring unstructured data problems are like effectively solved now.

2888.194 - 2909.08 Adam

Man. Does that make sense? Yeah, it makes a ton of sense. I've already downloaded Granola now. I feel like this is very exciting as a person who has an entrepreneurial side. It just kind of makes you want to build like a million companies. Not a million. Let's build one. Yeah, just like one company. It just makes you want to build something, doesn't it? It feels like the Wild West.

2909.14 - 2917.582 Adam

It's like starting over. All the digital products we use just could be reimagined. And there's so many categories of those. And it kind of makes you just want to build some of them.

2917.802 - 2932.719 Dax

I do think, though, that people should be aware that this isn't a reset to, like, 2010. Because in 2010... What was 2010? Like, you know, it was a similar situation. Like, nothing was built and there was, like, all these opportunities to build these pretty, like, basic, straightforward applications.

2932.899 - 2937.965 Adam

Wait, 2010, what was the new thing that enabled, like, mobile? What are you talking about?

2938.005 - 2963.986 Dax

Just, like... More internet, more web, more capability of like SaaS, like was kind of created in that era, all that stuff. Gotcha. In that time, you were shifting people from not using computers to using computers to solve this problem. So as much as it feels like, oh, we're in a reset and there's always a new opportunity, it's not the same because you can't just deliver an MVP. Oh, sure.

2964.007 - 2976.255 Dax

You can deliver an MVP in 2010. But if you want to build a new email AI product, you need to build something as good as Superhuman, as a floor. And then you can do...

2977.472 - 2978.832 Adam

The stuff that's extra stuff.

2978.892 - 2979.713 Dax

Innovative, right?

2980.613 - 3002.279 Dax

Okay. It's still going to be quite hard, just because the bar is very high to get someone to switch from something where all the normal app features are pretty exhaustive and work pretty well. That said, that side of things has also just gotten easier to do as well. But yeah, I am feeling this with Radiant, because, yeah, categorizing financial transactions was very, very difficult

3003.799 - 3021.383 Dax

like prior to AI. And now it can do like a really good job. Even like a shitty thing I implemented, I like was able to go through my stuff with like, and I've done this for years, right? Like I manage all my business transactions. I've gone through every single one of them for years and years. And just having AI do a first pass and then me doing a second pass, it's much better.
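The first-pass/second-pass workflow Dax describes can be sketched in a few lines. This is purely illustrative: `ai_guess` is a stand-in for a model call (here just a trivial keyword heuristic), and the category names are made up.

```python
# Sketch of the "AI first pass, human second pass" categorization flow.
def ai_guess(description):
    """First pass: a stand-in for an AI model. Returns a category,
    or None when it has no confident answer."""
    rules = {"aws": "infrastructure", "stripe": "revenue", "flight": "travel"}
    for keyword, category in rules.items():
        if keyword in description.lower():
            return category
    return None  # low confidence: leave this one for the human pass

def categorize(transactions, human_review):
    """Second pass: the human only touches what the first pass missed."""
    categorized = {}
    for tx in transactions:
        guess = ai_guess(tx)
        categorized[tx] = guess if guess is not None else human_review(tx)
    return categorized
```

So `categorize(["AWS bill", "Lunch"], human_review=lambda tx: "meals")` hands only "Lunch" to the human, which is the point: AI shrinks the pile rather than replacing the review.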

3021.564 - 3038.111 Dax

And this is just the beginning of all this stuff. But we still have to build like the entirety of a straightforward app. And you have to do that while the incumbent fails to do the new thing, which I think will happen. It's just, you know, not as easy as it seems.

3038.692 - 3054.441 Adam

Yeah, there's like the table stakes part that's kind of boring where you just have to have all the features that people expect from an app like that. in order to unlock the new way of thinking about it. So for the Granola case, it was like they had to build an actual note-taking app and all that comes with it.

3054.662 - 3068.75 Dax

That's a good example of something that works because those table stakes scope is really small. And they benefit from this new dynamic of not having to do 100 integrations with every single, like we support Zoom, we support Google Meet, we support whatever.

3068.79 - 3079.403 Adam

And how did you explain how that dynamic came to be? Because I get it for, like, recording audio: it just works for everything. But what you're saying, there's this whole era of not integrating directly with stuff. What's that about?

3079.463 - 3095.154 Dax

Yeah, so let's say you're like, I mean, let's say we're not actually doing this, but for Radiant, there's 5,000 financial accounts that we need to support for all the various places people have their data. We could just send AI to go like,

3095.874 - 3123.123 Dax

visit the site for you and, like, figure out how to pull out your information, instead of manually doing an integration with each thing. Because AI can operate, like, one level down. Like, it doesn't need an API. A developer needs an API; an AI agent, like, in theory, doesn't need one. So you can kind of, like, give it a general set of instructions that'll work on any raw input. So anywhere where you, like, needed all these nice, clean integrations, you can probably make do with

3124.051 - 3128.517 Dax

a much messier unsanctioned integration. Interesting.

3129.269 - 3148.318 Adam

Okay, that didn't really answer my question. I mean, I don't feel satisfied. Maybe it did, but I think it's like there's another example and I can't remember. I feel like there is another company where it was like, oh, that's a clever way of integrating with everything. Oh no, it was the conversation we had about like an AI tool that just looks at the file system.

3148.518 - 3163.727 Adam

Use that as a source of truth and you don't have to integrate with every editor. You just interact with the file system. Yeah, so that's a very clever way to get around this thing on your landing page where you have all the things you support. Yeah. It's like, what's the common denominator?
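The file-system trick they're describing can be sketched in a few lines: instead of a plugin per editor, watch files for changes and treat any save, from any tool, as the integration event. A minimal polling sketch (the function name and the notes-directory setup are hypothetical):

```python
import os

def changed_files(directory, seen):
    """Return paths whose modification time moved since the last call,
    updating `seen` in place. Any editor that saves a file shows up
    here, so the file system acts as the single integration point."""
    changed = []
    for entry in os.scandir(directory):
        if not entry.is_file():
            continue
        mtime = entry.stat().st_mtime
        if entry.path in seen and seen[entry.path] != mtime:
            changed.append(entry.path)
        seen[entry.path] = mtime
    return changed
```

In practice you'd call this in a polling loop, or use OS-level notifications (inotify on Linux, FSEvents on macOS), but the common-denominator idea is the same: no per-editor integrations, just files.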

3163.867 - 3180.46 Dax

The other side of this, though, is if you look at a lot of these products like Granola... like, there was the other one that I forgot the name of. It records everything. It takes a screenshot every three seconds and then, like, has AI index it. And you can ask it, like, hey, what was that thing I read the other day about whatever?

3180.48 - 3191.378 Dax

So you see how all of these things are native apps at the OS level. It just brings up the question like, isn't Microsoft and Apple just going to bake these in?

3191.618 - 3194.959 Adam

Oh, yeah. If you're building that kind of stuff, it's scary.

3195.159 - 3205.962 Dax

Yeah, if you think about this stuff, we're getting these one-off solutions that people come up with, but at the end of the day, if it was just integrated at the OS level, it would just work everywhere and kind of be just a lot more awesome.

3206.482 - 3211.404 Adam

Yeah. So it feels like that should be the ultimate thing. The Apple intelligence kind of thing, like...

3212.424 - 3235.329 Dax

Apple Intelligence should do that stuff, if it ever actually does anything. Sucks. Apple Intelligence sucks. But in theory... This doesn't even work yet. Like, I don't even think... Did they, like, turn it off because it was doing bad things? I've had it for a while and I have not used it once. I think somehow it's made things even worse. I feel like I use it even less now than I used to. I don't know.

3235.349 - 3243.718 Adam

I don't know what they're doing. It's pretty bad. Hopefully they do that thing where they catch up really fast because I would like Apple software to be good because I love their hardware.

3243.738 - 3272.351 Dax

Yeah, we'll see. But I will say this. This type of thinking is new for me where I'm like, see how I described a very clearly good opportunity. And then the ideal, which would be like, you know, Apple, Microsoft integrating, but that ideal might be 10 years away. So there's still plenty of time to make money, you know, successful in that time. Yeah. But I've like shifted to like not.

3273.892 - 3296.051 Dax

If I can see the ideal and it's not aligned with what I'm doing, I just don't want to work on it. It just feels bad to me now. Even if it's 10 years, you just don't want to invest in that idea. Yeah, I want to have a real shot of... building the ultimate thing. Even if that means, even if the opportunity is great, otherwise.

3296.272 - 3306.138 Adam

Are you quitting terminal? Is that what you're saying? Is it not AI enough for you? It's not AI enough. You missed the meeting yesterday.

3306.158 - 3325.951 Dax

I'm just saying. I was the only one that remembered the meeting. Yeah, that's true. That's the funny part. There was no meeting. We have weekly Wednesday meetings, and I was like, oh, I can't make it. So I posted at 2:30, when we have the meeting: hey guys, I can't make the meeting. And nobody else said anything. The meeting didn't happen. So everyone missed it.

3326.031 - 3328.251 Dax

I was the only one that actually remembered that it was supposed to happen.

3328.271 - 3333.033 Adam

It only would have happened if you started it. But the fact that you didn't start it because you weren't going to make it.

3333.053 - 3339.075 Dax

It's funny. There's something else I wanted to talk about. It's totally unrelated to all this.

3339.095 - 3341.516 Adam

Totally unrelated to AI and apps and aliens.

3342.496 - 3364.923 Dax

Yes. Uh, I posted a video last week. Or was it earlier this week? No, it was on Sunday. Posted on Sunday. Best video I've ever made in terms of... Oh, really? Views, yeah. I gotta check out the SST YouTube again. I think it's really not the execution of the video. I think we're just picking some pretty good topics. What's your handle? I did it. I did it just...

3366.704 - 3369.446 Adam

Nope, that's something Korean. That's definitely not it.

3369.746 - 3370.406 Dax

What? Really?

3372.327 - 3372.887 Dax

I mean, I guess.

3374.648 - 3375.449 Adam

SST dev.

3375.609 - 3376.729 Dax

You don't have to look it up. I'll just tell you.

3377.029 - 3394.979 Adam

I got it. No, I got it. I don't use my computer. Is it that one? Yeah, so I made a video on my remote dev setup. Oh, I've been wanting this video. I can't believe I didn't see it. How do I not see? This is how big the world is. Anytime you think everyone just sees all your stuff, if anyone sees your videos, I should see your videos.

3395.279 - 3423.048 Adam

Right, that's true. Yeah. And I didn't know you made this video. Like, are you ever on YouTube? No, no. I go to YouTube from Twitter links that people post. Would you see it? Oh, because you think I would see... I mean, sometimes. You would think I would see your tweets. I don't know. That's true. I feel like we're friends and I should know when you make a good video that I really want to see. And this is one I've wanted you to outline, because I didn't want to bug you too much and be like, hey, could you tell me how you do the remote tmux thing? But now you've just made the video and I can watch it like every other normie. This is awesome.
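(For anyone who wants the gist before watching the video: the remote setup being discussed boils down to SSHing into an always-on server and doing all work inside a persistent tmux session. A minimal sketch, where the hostname is a placeholder, not the actual server from the video:)

```shell
# SSH into the always-on dev box (hostname is a placeholder)
ssh dev-box

# Attach to a persistent tmux session, creating it if it doesn't exist;
# closing the laptop kills nothing, you just re-attach later
tmux new-session -A -s dev
```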

3423.348 - 3433.257 Dax

Yeah, it was. I think a lot of people were waiting for it, which is why I think it did pretty well. So this is our best performing video ever, which we're really happy about. I love the title. I don't use my computer. Yeah.

3433.437 - 3434.578 Adam

I mean, the thumbnail.

3435.379 - 3462.536 Dax

Yeah. So, YouTube comments. Let's talk about YouTube comments real quick. Oh, yeah. For me personally, this is where I experience just, like, the dumbest of all humanity. I think it is really wild. Like, I've been on Twitter a long time. Of course I get dumb, annoying comments there, but YouTube somehow just consistently tops it. It surfaces a persona that I run into a lot on the Internet.

3462.817 - 3482.488 Dax

And to me, it's like a very miserable persona. It's a persona of someone that thinks that every single thing they interact with is a scam somehow. Like they're like, they're so eager to be like, I think what's driving them is they want to feel like they're smart and they like picked up on something that everyone else is falling for.

3482.508 - 3505.268 Dax

But they're so desperate for that moment that every single thing they perceive, they project onto it that, oh, this is a scam somehow. Yeah. So a bunch of people were just like, this is an ad. Or they were talking about how, like, I only do this because it's free. Because I mentioned that my server that I use now is sponsored. But, like, I've been doing this for years now.

3505.288 - 3506.829 Adam

Shout out to ReliableSite.com.

3506.869 - 3508.15 Dax

ReliableSite, yeah.

3508.291 - 3509.291 Adam

It's very reliable.

3511.413 - 3531.805 Dax

But, like, I literally was paying for it before I got that deal. Mm-hmm. And also in the video, I outline how you can start really small and the entry price for this. Again, people love saying $5 VPS. It's just a $5 VPS. Realistically, maybe more like 15 for something that's decent, but reasonable price.

3531.825 - 3544.591 Dax

But everyone was just like, as soon as their brains work together, like, oh, this is the angle. A bunch of comments were around talking about how like, I was trying to trick them into doing this because it's expensive.

3546.313 - 3556.107 Dax

And I'm just like, how, like, how do you go through life like this? Like everything must be so miserable if you're just perceiving it as like every person you interact with is trying to rip you off somehow, you know?

3557.049 - 3569.578 Adam

Yeah. The internet kind of sucks. It's kind of amazing, but it also kind of sucks. I'm just reading YouTube comments now. I wish I hadn't. Sorry, would you just not, just don't remind me that YouTube exists and I'll be a happier person.

3570.578 - 3573.979 Dax

That's funny. What's even at the top right now? I think one of those is probably at the top.

3574.019 - 3590.685 Adam

So it's funny. I just saw Kevin Naughton commented, "an excuse to not do any work for the next three or four weeks." I really do need to spend like two days and just copy all your NeoVim setup, and my NeoVim is so bad right now. I know. I just need to do all that work. And it's just so hard to take a time out.

3590.785 - 3610.572 Adam

It's that stupid meme that I do hate because I resonate with it of like the cavemen with like the square wheels. And they're like too busy. Leave me alone. And the guy's like, but here's a wheel. It's that, but it's just so hard. Maybe you should just go use Cursor. You know what? I've actually thought about downloading it. I'm doing it right now. I do want to... I have it downloaded.

3610.612 - 3617.436 Adam

Yeah, I want to download it. Like, why have I... It's like, all this stuff is free. Paid for by VCs. Why am I not using all of it? Just play around with it.

3617.676 - 3619.176 Dax

It's not free, but like... It's not free?

3619.196 - 3619.857 Adam

I don't think it's free.

3619.877 - 3651.176 Dax

You have to pay for... It's not crazy. Yeah, it's not that expensive. I just assumed it was free. It's just so miserable for me going... Yeah, this is like another point of distress for me, and I'm being very dramatic. Distress around my editor. I really like NeoVim and it is truly incredibly productive. But this Cursor style of thing, if it continues to get better...

3652.352 - 3672.764 Dax

That's just going to be the most productive thing. But it doesn't address the parts that I particularly find annoying. I hate the clunkiness and the slowness of VS Code and navigating and stuff. And yes, you're doing all that less with this type of thing, but it's not taking it to zero. I don't see why NeoVim would get something that's equivalent. I've seen the current effort for it.

3673.385 - 3683.171 Dax

And I go visit the GitHub and I read it once a week. And I'm just like, this just doesn't feel like... it's going to be good. And there's so much setup involved.

3683.191 - 3698.284 Adam

Yeah. It's the we have cursor at home and it's like cursor at home is like four libraries duct taped together and like socks. Like why am I installing something like that on my machine? What is going on? It's like there's too much. There's too much steps. Too many steps.

3698.384 - 3716.216 Dax

I don't mind switching editors. I just wish the foundation this new stuff was built on was not VS Code, because VS Code sucks. That said, I think Zed will probably... because they're in this hyper-competitive mode. Wait, you think they will what? I think their AI stuff will get as good as Cursor's, if not better.

3716.336 - 3718.258 Adam

So they are working on AI stuff then?

3718.278 - 3719.158 Dax

They have to be.

3719.338 - 3734.327 Adam

Because I just had the thought in my sleep last night, which is just an indictment on my sleep. I had the thought like, oh, poor Zed. How does Zed have a chance when there's all these AI things now? But they're doing the AI thing. It's like there's so many editors already. If you're not an AI editor, good luck.

3735.529 - 3762.177 Dax

Right? Yeah, no, it's true. They have a tough battle. It kind of goes in two directions. On one hand, yeah, it was way faster to ship Cursor by building on VS Code. On the other hand, I've just found as I get older that doing the more extreme thing always ends up having a good benefit that you can't predict. So them going ground up, building a new editor,

3763.703 - 3772.522 Dax

is way harder. All the ship-fast mindset would be like, that's a waste of time, just focus on the part that differentiates, the AI part. But I can see how

3773.661 - 3796.378 Adam

actually, no, like, this is going to end up being the thing that wins. So to me it's plausible. Like, I don't think they're screwed, and they are going to do AI stuff. Yeah, I just didn't even know they were working on it. If they're working on the AI stuff, then yeah, good for them. And they're not built on... I have no idea. I don't keep up on this stuff. I've just... I mean, I use NeoVim because someone said use NeoVim, so I do. I mean, they say AI in their, um...

3796.818 - 3815.103 Dax

Integrate upcoming LLMs into your workflow. Generate, transform, analyze code. So... And Cursor is not a lot of features. It's like a really small set of features, to be honest. I've never played with it. I'm literally setting it up right now. But yeah, so I'm like, okay, that gives me some hope, because maybe the editor experience won't suck. But then it's not in the terminal anymore, so then my whole setup

3816.283 - 3822.346 Dax

is now like a lot more confusing. Like I like having everything in a single terminal and switching between it.

3822.586 - 3845.126 Adam

Yeah, all my muscle memory is around switching between tmux panes and doing all this stuff. And if I'm just in some editor now, I guess I can get the Vim experience in the actual files I'm modifying, but... Okay, can I go back to something, just on behalf of the normies that listen to us? Why is VS Code bad again? I know we all hate VS Code, but someone remind me.

3845.166 - 3845.946 Adam

Why is it bad?

3845.966 - 3850.91 Dax

Whenever I try to use it, it's like a slow piece of shit and the Vim emulation is like really bad.

3851.23 - 3851.91 Adam

So it's slow.

3851.97 - 3853.872 Dax

Yeah, to me it feels bad to use.

3853.912 - 3860.517 Adam

Okay. I just take everyone's word for it when everyone's like making fun of VS Code. I'm like, yeah, VS Code. But I didn't actually know why.

3861.077 - 3866.281 Dax

It just doesn't feel good to use. That's really all it comes down to for me. Okay, okay.

3867.418 - 3877.186 Adam

Oh, I'm going to try Cursor. I'm going to give it a go. I hope it doesn't botch the whole Terminal code repo. YOLO.

3877.246 - 3895.498 Dax

Here we go. Yeah. Zed does have this. They have their own remote protocol thing, so I could continue to effectively host Zed on my server, even though the front end of it is running on my machine. That's cool. But again, then I have to like... have like a separate terminal window, unless my terminals run inside of Zed.
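(The remote protocol mentioned here is driven from the Zed client over SSH; a sketch of what that looks like, where the user, host, and path are placeholders, and the exact CLI form may differ across Zed versions:)

```shell
# Open a remote project; Zed keeps the UI local and runs a headless
# server process on the box (user, host, and path are placeholders)
zed ssh://me@dev-box/~/projects/app
```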

3895.858 - 3904.383 Adam

Ah, just use the integrated terminal. I hear it's good. Skeptical, but... Skeptical. I'm going to give it a shot. I'll let the listeners know if Cursor's good. They probably already know, but I'll let you know.

3904.423 - 3909.086 Dax

No, use Cursor and use Zed. And then go fix your NeoVim. Yeah, I need to fix my NeoVim.

3909.659 - 3918.065 Adam

Okay, I'll try Zed. If Zed has AI stuff, I'll start there, actually, because I'd rather use the thing that you think is good, generally, in life.

3918.685 - 3922.067 Dax

Introducing Zed AI. This was, like, in August. They're definitely stuff.

3922.127 - 3930.773 Adam

Definitely stuff. Zed. What is it? Zed.dev. The editor for what's next. Humans and AI.

3931.654 - 3955.999 Dax

Let's go. I had this thought the other day. I was like, if you're a VC-funded company, you've probably shifted towards AI. If you look at everyone's websites, no matter how random the product is, they seem to really focus on AI. Most of them just took their existing slogan and added "and AI" to it. Wait, is that literally what Zed did? Maybe. Yeah, with humans and AI. And AI.

3956.38 - 3977.365 Dax

I saw something the other day. I'm looking at Turso's website, and at the bottom now they have unlimited databases, personalized at scale, supercharge your LLM applications, which, you know, probably wasn't there before. So there's that. Okay. We've all observed this, you know, whatever.

3977.385 - 4004.972 Dax

But then I think about, okay, are there VC-funded companies at this stage that had not done this at all? The guys at SST haven't done this, but, like, ignoring us... And I'm like, what is that like? I'm like, yeah, like, Bun didn't go and add "the best way to run JavaScript for humans and AI," you know? That's a good point. I'm not making fun of Zed, because with Zed it actually makes sense.

4005.232 - 4022.424 Dax

But a lot of just general-purpose things have now added "and AI" to it. So I'm like, how are they thinking about this stuff? Like, are they just, in a way, heads-down ignoring it? I'm sure they're not actually, but, like, you know, their strategy is heads-down ignoring it. Yeah. Oh, what?

4023.244 - 4035.812 Dax

All right, this is probably a coincidence, but I went to Bun's site, and they have a "used by" section, and one of them is Midjourney. Oh, so they also kind of... Maybe that's their, like... That's their little tip. It's just a coincidence.

4036.372 - 4047.019 Adam

Tip of the hat to AI. Used by X, Typeform, Midjourney, and Tailwind. That's an interesting collection of companies. You know who else uses it? Terminal. Terminal. We've got to get the Terminal logo on Bun's site.

4047.059 - 4054.905 Dax

Terminal uses it. Let's go. I... I think I might be the number one bun user. You might be. I think I'm the number one bun user because I use it everywhere.

4054.925 - 4059.168 Adam

Actually, I've been bun-pilled. I'm enjoying bun quite a lot because I just copy everything you do.

4060.169 - 4083.832 Dax

I cannot stop talking about how good their product execution is. It is so good. Yeah, they're incredible. Every single time they put out a feature, I've been like, eh, I don't get it. And then fast forward three weeks later, I'm using it. It just invisibly snuck into every little piece. So we're launching a new update in the SST console.

4084.252 - 4103.083 Dax

We have this workflow section that's the config where you can like set up your CI steps. And before we didn't let that be configured. So most people don't have to. The defaults make sense. But if you want to configure it. We were like, okay, how do we like let you run shell scripts, but like in JavaScript and have your own JavaScript conditionals. And we're like, okay, fuck it.

4103.103 - 4113.705 Dax

We're just going to drop bun shell in there. So now the config is just like your workflow is just bun shell. And they already figured out all of that stuff. So really great product execution. Amazing.

4113.985 - 4119.626 Adam

There's nothing better than that. Like, await dollar sign and then put your shell command in there. That feels so good.
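(The `await $` pattern being praised is Bun Shell. A minimal sketch of what shell-in-JavaScript config can look like, assuming the Bun runtime; the workflow shape below is illustrative, not SST's actual console API:)

```javascript
import { $ } from "bun";

// Shell commands are tagged template literals; each returns a promise,
// and stdout can be captured as text
const branch = (await $`git rev-parse --abbrev-ref HEAD`.text()).trim();

// Plain JavaScript conditionals wrap shell steps, which is the appeal
// over a YAML-style CI config
if (branch === "main") {
  await $`bun run build`;
} else {
  console.log(`skipping build on ${branch}`);
}
```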

4120.486 - 4122.707 Dax

Yep. Yep. Yep. Cool. All right.

4122.727 - 4126.676 Adam

I got to go for non-biological reasons. Okay, no, they're biological.

4126.716 - 4128.397 Dax

No one believes you.

4128.457 - 4136.581 Adam

I gotta go, Dax. When I say I gotta go and you're like, one more thing and then you have like four more things. We could pause if you want to do a two-hour episode.

4137.582 - 4140.283 Dax

Okay, no, that's fine. You can go. You don't want to talk to me. It's fine.

4140.483 - 4143.585 Adam

I want to talk to you. You don't want to talk to me. It's fine. I'm going to pee myself.

4143.605 - 4149.588 Dax

This is our last episode ever. We're not going to do this podcast anymore. Stop it. Adam doesn't want to talk to me.

4149.888 - 4152.79 Adam

Okay, I'm going. Goodbye.
