Acquired
Nvidia Part III: The Dawn of the AI Era (2022-2023)
Ben Gilbert
And then a second thing happens after this unsupervised pre-training step, where you then have supervised fine-tuning. The unsupervised pre-training used a large corpus of text to learn general language, and then the model was fine-tuned on labeled datasets for the specific tasks that you actually want the model to be useful for.
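A toy sketch of that two-phase recipe, for intuition only: the class name `TinyBigramLM` and all the data here are invented, and a real language model is a transformer trained by gradient descent, not a bigram counter. The point is just the shape of the pipeline: first train on raw unlabeled text, then continue training on curated task-specific examples.

```python
from collections import defaultdict, Counter

class TinyBigramLM:
    """Toy next-word model: counts word bigrams (a stand-in for a real LM)."""
    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, texts):
        # "Training" here is just counting which word follows which.
        for text in texts:
            words = text.split()
            for prev, nxt in zip(words, words[1:]):
                self.counts[prev][nxt] += 1

    def predict(self, word):
        # Return the most frequently observed next word, if any.
        if not self.counts[word]:
            return None
        return self.counts[word].most_common(1)[0][0]

model = TinyBigramLM()

# Phase 1: unsupervised pre-training on a raw, unlabeled corpus.
model.train([
    "the cat sat on the mat",
    "the dog sat on the rug",
])

# Phase 2: supervised fine-tuning -- continue training on curated,
# labeled examples formatted as text for a specific task.
model.train([
    "Q: sentiment of great movie A: positive",
    "Q: sentiment of awful movie A: negative",
])

print(model.predict("sat"))  # behavior learned in pre-training
print(model.predict("A:"))   # behavior learned in fine-tuning
```

Supervised fine-tuning really does reuse the same training objective as pre-training; what changes is the data, which is why the sketch can express both phases as calls to the same `train` method.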