Decoder with Nilay Patel
DeepSeek, Stargate, and the new AI arms race
Decoder Host
And no matter what, maintaining those models once they've been trained and serving them to millions of customers costs quite a lot. Just because you can train a model cheaply doesn't mean you can run inference on it at the scale OpenAI, Google, and Anthropic do. In other words, we might need those chips and data centers after all.