Acquired
Nvidia Part III: The Dawn of the AI Era (2022-2023)
Ben Gilbert
So while memory has scaled up, I mean, flashing all the way forward to the H100, on-chip RAM is about 80 gigabytes. The memory hasn't scaled up nearly as fast as the models themselves have scaled in size. The memory requirements for training AI are just obscene.
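To put rough numbers behind that claim, here is a small back-of-envelope sketch (not from the episode; it assumes the commonly cited figure of about 16 bytes of training state per parameter for mixed-precision training with Adam, and it ignores activation memory, so the true footprint is even larger):

```python
# Back-of-envelope estimate of training-state memory, assuming mixed-precision
# Adam: fp16 weights (2 B) + fp16 gradients (2 B) + fp32 master weights,
# momentum, and variance (12 B) = ~16 bytes per parameter. Activations are
# not counted here.

BYTES_PER_PARAM = 2 + 2 + 12   # fp16 weights, fp16 grads, fp32 optimizer states
H100_MEMORY_GB = 80            # memory on a single H100, per the episode

def training_memory_gb(num_params: float) -> float:
    """Approximate training-state memory in gigabytes for a model of the given size."""
    return num_params * BYTES_PER_PARAM / 1e9

for params in (7e9, 70e9, 175e9):
    needed = training_memory_gb(params)
    print(f"{params / 1e9:>5.0f}B params -> ~{needed:,.0f} GB "
          f"(~{needed / H100_MEMORY_GB:.0f}x one H100's {H100_MEMORY_GB} GB)")
```

Under these assumptions, even a 70-billion-parameter model needs on the order of a terabyte of training state, which is why training runs are sharded across many GPUs rather than fitting on any single card.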