
Acquired

Nvidia Part III: The Dawn of the AI Era (2022-2023)

630.541 - 652.638 Ben Gilbert

And basically, the algorithms that other people had been trying over the years just weren't massively parallel the way that a graphics card sort of enables. So if you actually can consume the full compute of a graphics card, then perhaps you could run some unique novel algorithm and do it in, you know, a fraction of the time and expense that it would take in these supercomputer laboratories.
