Lex Fridman Podcast
#416 – Yann LeCun: Meta AI, Open Source, Limits of LLMs, AGI & the Future of AI
Yann LeCun
Right. And the thing is, those self-supervised algorithms that learn by prediction, even in representation space, learn more concepts when the input data you feed them is more redundant. The more redundancy there is in the data, the more they're able to capture its internal structure.
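A toy way to see this point, not LeCun's actual method but a minimal sketch: train a simple next-symbol predictor on two sequences, one highly redundant (a repeating pattern) and one with no structure (i.i.d. random symbols). The predictor, sequence lengths, and alphabet here are all illustrative choices; the point is only that prediction-based learning can extract structure when redundancy exists, and nothing when it doesn't.

```python
import random
from collections import Counter, defaultdict

def next_symbol_accuracy(seq, train_frac=0.8):
    """Fit a bigram predictor on a prefix of seq, score it on the rest.

    The 'model' just remembers, for each symbol, the most frequent
    successor seen during training — a stand-in for learning by prediction.
    """
    split = int(len(seq) * train_frac)
    train, test = seq[:split], seq[split:]
    counts = defaultdict(Counter)
    for a, b in zip(train, train[1:]):
        counts[a][b] += 1
    correct = 0
    for a, b in zip(test, test[1:]):
        if counts[a] and counts[a].most_common(1)[0][0] == b:
            correct += 1
    return correct / max(1, len(test) - 1)

random.seed(0)
# Redundant data: a repeating pattern, so the next symbol is fully determined.
redundant = list("abcd" * 500)
# Unstructured data: same alphabet, but i.i.d. — nothing to predict.
noise = [random.choice("abcd") for _ in range(2000)]

acc_redundant = next_symbol_accuracy(redundant)
acc_noise = next_symbol_accuracy(noise)
print(acc_redundant, acc_noise)
```

On the redundant sequence the predictor becomes near-perfect, while on the unstructured one it can do no better than chance (about 1/4 here), which mirrors the claim: redundancy in the input is what gives a predictive learner structure to capture.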