
Lex Fridman Podcast

#459 – DeepSeek, China, OpenAI, NVIDIA, xAI, TSMC, Stargate, and AI Megaclusters

7670.606 - 7690.874 Nathan Lambert

Pre-training is all about flops, right? It's all about flops. There are things you do, like mixture of experts that we talked about, to trade off interconnect, or to trade off other aspects and lower the flops and rely more on interconnect and memory. But at the end of the day, flops are everything, right? We talk about models in terms of how many flops they are, right?
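To make the "models in terms of flops" framing concrete, here is a minimal back-of-the-envelope sketch using the widely cited approximation that training compute is about 6 FLOPs per parameter per token (FLOPs ≈ 6ND). The model sizes, token counts, and the 10% active fraction below are illustrative assumptions, not figures from the episode.

```python
# Rough training-compute estimate via the common FLOPs ~= 6 * N * D rule,
# where N = parameter count and D = number of training tokens
# (roughly 2 FLOPs/param/token forward, 4 backward).

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total pre-training FLOPs for a dense model."""
    return 6 * n_params * n_tokens

if __name__ == "__main__":
    # Hypothetical dense 70B-parameter model trained on 15T tokens.
    dense = training_flops(70e9, 15e12)
    print(f"dense: {dense:.2e} FLOPs")  # ~6.3e24

    # A mixture-of-experts model with the same total parameters but only
    # a fraction active per token spends proportionally fewer FLOPs per
    # token, shifting the bottleneck toward interconnect and memory,
    # which is the trade-off described above.
    active_fraction = 0.1  # assumed: 10% of parameters active per token
    moe = training_flops(70e9 * active_fraction, 15e12)
    print(f"moe:   {moe:.2e} FLOPs")  # ~6.3e23
```

The one-line takeaway matches the quote: for a fixed token budget, total FLOPs scale with (active) parameter count, which is why model scale is so often quoted in FLOPs.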
