All-In with Chamath, Jason, Sacks & Friedberg
In conversation with Sam Altman
David Friedberg
Does that mean that there aren't going to be long training cycles, and that it's continuously retraining or training submodels, Sam? And maybe you could just speak to what might change architecturally going forward with respect to large models.