
All-In with Chamath, Jason, Sacks & Friedberg

"Founder Mode," DOJ alleges Russian podcast op, Kamala flips proposals, Tech loses Section 230?

3948.737 - 3967.464 Chamath Palihapitiya

And the reason is that, if you go back to the actual models themselves, the way they're architected, right? If you look inside of a transformer, what is it? There's a neural network part and then there's a self-attention part. What is the self-attention part trying to do? It's trying to figure out the momentum and the importance of a given input.
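The self-attention mechanism being described can be sketched as scaled dot-product attention: each token's query is compared against every token's key to produce a weight for "how important is that input here." This is a minimal illustrative sketch, not the speaker's code; the projection matrices `Wq`, `Wk`, `Wv` are hypothetical random weights standing in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X (seq_len x d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # pairwise relevance scores: how much each token should attend to every other
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # each row is a distribution over the inputs -- the "importance" weighting
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
d = 4
X = rng.standard_normal((3, d))                          # 3 tokens, dimension 4
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)             # out: (3, 4), weights rows sum to 1
```

In a full transformer block, this attention output then feeds the feed-forward neural network part mentioned in the quote.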
