All-In with Chamath, Jason, Sacks & Friedberg
DeepSeek Panic, US vs China, OpenAI $40B?, and Doge Delivers with Travis Kalanick and David Sacks
Chamath Palihapitiya
Yeah. So when you have a big, large-parameter model, the way that you get to a smaller, more usable model, along the lines of what Sacks mentioned, is through this process called distillation, where the big model feeds the little model. The little model asks questions of the big model, you take the answers, and you refine. And by the way, you can see this, Nick, I sent you a clip.
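[For readers who want the mechanics behind what Chamath is describing, below is a minimal sketch of a teacher-to-student distillation loop. The model sizes, random "prompts," temperature, and KL-divergence loss here are illustrative assumptions for a toy setup, not a description of how DeepSeek, OpenAI, or any particular lab actually runs distillation.]

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins: in practice the "teacher" would be a large
# pretrained model and the "student" a much smaller one.
teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 1000))
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1000))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
temperature = 2.0  # softens the teacher's output distribution

for step in range(100):
    # "Asking questions of the big model": feed inputs (random here, real
    # prompts in practice) to the teacher and record its answers as soft targets.
    prompts = torch.randn(32, 128)
    with torch.no_grad():
        teacher_logits = teacher(prompts)

    # "The big model feeds the little model": train the student to match
    # the teacher's output distribution via KL divergence.
    student_logits = student(prompts)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

[The key idea is that the student never sees the original training data directly; it only sees the teacher's answers, which is why a much smaller model can recover a surprising amount of the larger model's behavior.]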