Michael Truell
So around that time, for some of us, there were a lot of conceptual conversations about what this was going to look like: what's the story going to be for all these different knowledge worker fields about how they're going to be made better by this technology getting better?
And then I think there were a couple of moments where the theoretical gains predicted in that paper started to feel really concrete. And it started to feel like a moment where you could actually go and not do a PhD if you wanted to do useful work in AI. It actually felt like now there was this whole set of systems one could build that were really useful.
And I think that the first moment we already talked about a little bit, which was playing with the early bit of Copilot, that was awesome and magical. I think that the next big moment where everything kind of clicked together was actually getting early access to GPT-4. So it was sort of end of 2022 was when we were tinkering with that model. And the step up in capabilities felt enormous.
And previous to that, we had been working on a couple of different projects. Because of Copilot, because of scaling laws, because of our prior interest in the technology, we had been tinkering around with tools for programmers, but things that were very specific.
So, you know, we were building tools for financial professionals who have to work within a Jupyter notebook, or, you know, playing around with, can you do static analysis with these models? And then the step up in GPT-4 felt like, look, that really made concrete the theoretical gains that we had predicted before. It felt like you could build a lot more just immediately at that point in time.
And also, if we were being consistent, it really felt like this wasn't just going to be a point solution thing. This was going to be all of programming was going to flow through these models. And it felt like that demanded a different type of programming environment, a different type of programming. And so we set off to build that sort of larger vision around that.
Technically incorrect, but one point away. Aman was very enthusiastic about this stuff. Yeah. And before, Aman had this Scaling Laws t-shirt that he would walk around with, where it had the charts and the formulas on it.
...of VS Code that are doing sort of AI type stuff, what was the decision like to just fork VS Code?

So the decision to do an editor seemed kind of self-evident to us, at least for what we wanted to do and achieve. Because when we started working on the editor, the idea was these models are going to get much better, their capabilities are going to improve, and it's going to entirely change how you build software, both in that you will have big productivity gains, but also radically: the act of building software is going to change a lot.