
The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch

20VC: AI Scaling Myths: More Compute is not the Answer | The Core Bottlenecks in AI Today: Data, Algorithms and Compute | The Future of Models: Open vs Closed, Small vs Large with Arvind Narayanan, Professor of Computer Science @ Princeton

1082.354 - 1104.128 Arvind Narayanan

With respect to training costs, if you want to build a smaller model at the same level of capability, or without compromising capability too much, you have to train it for longer. So that increases training costs. But that's maybe okay, because you have a smaller model: you can push it to a consumer device, and even if it's running in the cloud, your server costs are lower.
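The tradeoff described here can be made concrete with the common back-of-the-envelope FLOP estimates (roughly 6·N·D for training and 2·N per generated token, where N is parameter count and D is training tokens). The sketch below uses hypothetical model sizes and token counts, not figures from the episode:

```python
# Illustrative sketch of the small-model tradeoff: train a smaller model on
# more tokens (higher training cost) in exchange for cheaper per-token
# inference. Uses the standard approximations:
#   training FLOPs  ~ 6 * N * D
#   inference FLOPs ~ 2 * N   (per generated token)

def training_flops(params: float, tokens: float) -> float:
    return 6 * params * tokens

def inference_flops_per_token(params: float) -> float:
    return 2 * params

# Hypothetical pair of models at comparable capability (made-up numbers):
large = {"params": 70e9, "tokens": 1.4e12}   # larger model, fewer tokens
small = {"params": 13e9, "tokens": 10e12}    # smaller model, trained longer

for name, m in [("large", large), ("small", small)]:
    print(name,
          f"train={training_flops(m['params'], m['tokens']):.2e} FLOPs",
          f"infer/token={inference_flops_per_token(m['params']):.2e} FLOPs")
```

With these made-up numbers the smaller model is more expensive to train (7.8e23 vs. 5.9e23 FLOPs) but roughly 5x cheaper per generated token, which is the cost that dominates at serving scale.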
