The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch
20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside
Eiso Kant
Pretty much what we've seen consistently with every two-year generation from NVIDIA is about a 2x performance increase in training. On inference, though, there's a lot of hope for Blackwell, because it looks like Blackwell might potentially unlock a much, much larger gain.