The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch
20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside
Eiso Kant
You have a multi-trillion parameter model that is often architected as an MoE (mixture of experts), meaning that not all of those parameters activate during inference time, but they're still very large. It's too expensive. Every request you make to that model costs more than a couple of cents. And so you have to find a way to build models that you can actually run for customers.
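A minimal sketch of the idea Eiso is describing, not Poolside's actual architecture: in a mixture-of-experts layer, a router picks only the top-k experts for each token, so most of the model's parameters sit idle on any given request even though all of them must stay loaded. The sizes, names, and routing scheme below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 64, 8, 2          # toy sizes for illustration only
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route a single token vector x through only top_k of n_experts."""
    logits = x @ router_w                      # router scores, shape (n_experts,)
    chosen = np.argsort(logits)[-top_k:]       # indices of the top-k experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()                   # softmax over the chosen experts only
    # Only the chosen experts' weight matrices are multiplied for this token;
    # the remaining experts contribute nothing to the compute for this request.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.standard_normal(d_model)
out = moe_forward(token)
# Active expert parameters per token: top_k / n_experts (here 2/8 = 25%), yet all
# experts must still be held in memory and on hardware, which keeps serving costly.
print(out.shape, f"active expert fraction: {top_k}/{n_experts}")
```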