
Acquired

Nvidia Part III: The Dawn of the AI Era (2022-2023)

2471.348 - 2505.616 David Rosenthal

So to give people a sense of why we're saying that the idea of training on very, very, very large amounts of data here is crazy expensive: GPT-1 had roughly 120 million parameters that it was trained on. GPT-2 had 1.5 billion. GPT-3 had 175 billion. And GPT-4, OpenAI hasn't announced, but it's rumored that it has about 1.7 trillion parameters that it was trained on.
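To put those numbers in perspective, here is a rough back-of-the-envelope sketch (not from the episode) of how much memory the raw weights alone would take at an assumed 2 bytes per parameter (fp16); the GPT-4 figure uses the rumored 1.7 trillion parameters mentioned above.

# Illustrative only: assumed fp16 precision, parameter counts from the transcript
BYTES_PER_PARAM = 2  # 2 bytes per parameter at fp16 (assumption)

models = {
    "GPT-1": 120e6,             # ~120 million parameters
    "GPT-2": 1.5e9,             # 1.5 billion
    "GPT-3": 175e9,             # 175 billion
    "GPT-4 (rumored)": 1.7e12,  # ~1.7 trillion, unconfirmed
}

for name, params in models.items():
    gigabytes = params * BYTES_PER_PARAM / 1e9
    print(f"{name}: ~{gigabytes:,.1f} GB just to hold the weights")

Under those assumptions the weights alone go from about 0.2 GB for GPT-1 to roughly 3.4 TB for a 1.7-trillion-parameter model, before counting optimizer state, activations, or the compute needed to train on the data itself.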
