Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Lex Fridman
Yeah, you've written about this as a kind of organs question. Yeah, exactly. If we think of interpretability as a kind of anatomy of neural networks, most of the circuits work involves studying tiny little veins, looking at the small scale of individual neurons and how they connect. However, there are many natural questions that the small-scale approach doesn't address.