
Lex Fridman Podcast

#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity

17814.77 – 17824.32 s (≈ 4:56:55 – 4:57:04)

And so I think that's, if you want the deepest reason why we want to have interpretable monosemantic features, I think that's really the deep reason.
