
Lex Fridman Podcast

#430 – Charan Ranganath: Human Memory, Imagination, Deja Vu, and False Memories

10497.697 - 10526.947 Lex Fridman

I mean, the process of allocating attention across time seems to be a really important process. Even the breakthroughs that you get with machine learning mostly have to do with attention: "Attention Is All You Need." It's about attention. Transformers are about attention. So attention is a really interesting one. But then, like, yeah, how you allocate that attention
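For context on the reference here: "Attention Is All You Need" is the paper that introduced the Transformer architecture, whose core operation is scaled dot-product attention. The sketch below is a rough illustration of that mechanism only, not something walked through in the episode; the shapes, values, and function name are illustrative assumptions.

    # Minimal sketch of scaled dot-product attention (illustrative, not from the episode).
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Weight every value by how well its key matches each query.

        Q, K, V: arrays of shape (sequence_length, d_model).
        Returns the attended output and the attention weight matrix.
        """
        d_k = K.shape[-1]
        # Similarity of every query to every key, scaled for numerical stability.
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax turns scores into a distribution: how attention is allocated.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V, weights

    # Toy usage: self-attention over 4 tokens with 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    out, attn = scaled_dot_product_attention(x, x, x)
    print(attn.round(2))  # each row sums to 1: attention allocated per token

Each row of the weight matrix is a learned allocation of attention over the sequence, which is the sense in which the conversation says Transformers are "about attention."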
