
The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: Raising $500M To Compete in the Race for AGI | Will Scaling Laws Continue: Is Access to Compute Everything | Will Nvidia Continue To Dominate | The Biggest Bottlenecks in the Race for AGI with Eiso Kant, CTO @ Poolside

351.536 - 370.24 Eiso Kant

But in areas where we have very little data, models really struggle to become truly more capable. And by that I mean improvements in reasoning, improvements in planning capabilities, improvements in deep understanding of things. While as humans we don't require so much data, the way to think about models is that they require orders of magnitude more data to learn the same thing.
