
The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside

655.208 - 670.63 Eiso Kant

We are taking huge amounts of data and we're forcing this generalization of learning to happen in a very small space. And this is why we essentially see these differences in capabilities. For larger models, essentially, it's easier to generalize because we're not forcing so much data into such a small compression space.
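[Editor's note: as an illustrative sketch of the idea Kant gestures at, and not a formula quoted in the episode, a Chinchilla-style scaling law writes validation loss as a function of parameter count N and training tokens D. The constants below are empirical fits from the scaling-law literature, not values from this conversation.]

% Chinchilla-style scaling law (Hoffmann et al., 2022), shown for illustration only.
% L(N, D): loss for a model with N parameters trained on D tokens.
% E: irreducible loss; A, B, \alpha, \beta: empirically fitted constants.
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
% For a fixed amount of data D, increasing N shrinks the A / N^{\alpha} term:
% a larger model is not forced to compress as much data into as small a
% parameter space, which is one reading of the generalization point above.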
