
Lex Fridman Podcast

#416 – Yann LeCun: Meta AI, Open Source, Limits of LLMs, AGI & the Future of AI

2390.194 - 2409.82 Yann LeCun

Right. And the thing is, those self-supervised algorithms that learn by prediction, even in representation space, they learn more concepts if the input data you feed them is more redundant. The more redundancy there is in the data, the more they're able to capture some internal structure of it.
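
To make the idea concrete, here is a minimal sketch (not LeCun's actual architecture, and all module sizes and names are illustrative assumptions) of "learning by prediction in representation space": a context encoder's output is used to predict a target encoder's output, and the loss is computed on the representations rather than on raw inputs. The redundancy between the two views of the same signal is what makes the prediction task informative.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy encoder mapping raw inputs to a lower-dimensional representation."""
    def __init__(self, dim_in=128, dim_rep=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, 256), nn.ReLU(), nn.Linear(256, dim_rep)
        )
    def forward(self, x):
        return self.net(x)

encoder = Encoder()              # shared encoder for context and target views
predictor = nn.Linear(64, 64)    # predicts the target representation from the context one
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(predictor.parameters()), lr=1e-3
)

def training_step(context, target):
    # `context` and `target` are two redundant views of the same underlying signal
    # (e.g., overlapping crops). The loss lives in representation space, not pixel space.
    z_context = encoder(context)
    with torch.no_grad():        # stop-gradient on the target branch, a common anti-collapse trick
        z_target = encoder(target)
    loss = ((predictor(z_context) - z_target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage: two noisy views of the same vector share redundant structure.
base = torch.randn(32, 128)
loss = training_step(base + 0.1 * torch.randn_like(base),
                     base + 0.1 * torch.randn_like(base))
```

The more internal structure the two views share, the more the predictor can learn about that structure rather than about noise, which is the point the quote is making about redundancy.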
