
All-In with Chamath, Jason, Sacks & Friedberg

In conversation with Sam Altman

221.596 - 234.801 David Friedberg

Does that mean that there aren't going to be long training cycles, and it's continuously retraining or training submodels, Sam? And maybe you could just speak to us about what might change architecturally going forward with respect to large models.
