Lex Fridman Podcast
#434 – Aravind Srinivas: Perplexity CEO on Future of AI, Search & the Internet
Aravind Srinivas
But now you've got this new transformer model that's 100X more efficient at getting to the same performance, which means if you run the same job, you'd get something way better applying the same amount of compute. And so they just trained a transformer on all these books, storybooks, children's storybooks, and that got really good.