Lex Fridman Podcast
#430 – Charan Ranganath: Human Memory, Imagination, Deja Vu, and False Memories
Lex Fridman
I mean, the process of allocating attention across time seems to be a really important process. Even the breakthroughs that you get with machine learning mostly have to do with attention: "Attention Is All You Need." It's about attention. Transformers are about attention. So attention is a really interesting one. But then, like, yeah, how you allocate that attention...
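[Editor's note: the "attention" Lex refers to here is the scaled dot-product attention at the core of the Transformer architecture, introduced in "Attention Is All You Need" (Vaswani et al., 2017). A minimal NumPy sketch of that operation, with toy array shapes chosen purely for illustration:]

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: each query attends over all keys,
    producing an attention-weighted average of the values."""
    d_k = K.shape[-1]
    # Similarity of every query to every key, scaled to keep softmax stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into attention weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: each query's attention-weighted mix of the values.
    return weights @ V

# Toy example: 3 queries attending over 4 key/value pairs of dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

[The attention weights are, in effect, the model's learned answer to the question Lex raises: how to allocate attention across the inputs.]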