Acquired
Nvidia Part III: The Dawn of the AI Era (2022-2023)
Ben Gilbert
And basically, the algorithms that other people had been trying over the years just weren't massively parallel in the way that a graphics card enables. So if you actually can consume the full compute of a graphics card, then perhaps you could run some unique novel algorithm and do it in, you know, a fraction of the time and at a fraction of the expense that it would take in these supercomputer laboratories.
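To make the "massively parallel" point concrete, here is a minimal, illustrative CUDA sketch (not from the episode, and not any specific algorithm Ben is referring to): a trivially data-parallel operation where each of thousands of GPU threads handles one element, which is the shape of work that maps well onto a graphics card instead of a serial CPU loop.

```cuda
// Illustrative sketch only: per-element work with no coordination between
// elements, launched across many GPU threads at once.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale_add(const float* x, float* y, float a, int n) {
    // Global thread index; one thread per array element.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        y[i] = a * x[i] + y[i];  // independent per-element computation
    }
}

int main() {
    const int n = 1 << 20;  // ~1M elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements in parallel,
    // rather than looping over them one at a time on a CPU core.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_add<<<blocks, threads>>>(x, y, 3.0f, n);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```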