Acquired
Nvidia Part III: The Dawn of the AI Era (2022-2023)
Ben Gilbert
Now, with transformers, even if the string of text you're inputting is a thousand words long, it can happen just as quickly in human-measurable time as if it were ten words long, assuming there are enough cores in that big GPU. So the big innovation here is that you could now train sequence-based models in a parallel way.
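The point being made here can be illustrated with a minimal sketch of self-attention, the mechanism inside a transformer. This is not Nvidia's or any particular library's implementation, just a toy NumPy version: note that the attention scores for every pair of tokens come from a single matrix multiply, so there is no token-by-token recurrence for the GPU to wait on, whether the input is ten tokens or a thousand.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a whole sequence at once.

    x: (seq_len, d) array of token embeddings.
    Returns: (seq_len, d) array of attention outputs.
    """
    # Project every token into query/key/value vectors in one shot.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Scores for ALL token pairs in a single matrix multiply --
    # no step-by-step loop over the sequence, which is what lets
    # a GPU with enough cores process long inputs in parallel.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
d = 16  # toy embedding size, chosen arbitrarily for the sketch
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))

# A 10-token and a 1000-token input go through the exact same
# fixed number of matrix operations -- only the matrices are bigger.
short_out = self_attention(rng.normal(size=(10, d)), w_q, w_k, w_v)
long_out = self_attention(rng.normal(size=(1000, d)), w_q, w_k, w_v)
print(short_out.shape, long_out.shape)
```

Contrast this with an RNN, which must feed each token's output into the next step: there, a thousand-token input forces a thousand sequential steps that no number of GPU cores can collapse.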