Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Yeah, well, maybe I want to distinguish two things. So one is the complexity of the feature or the concept, right? And the other is how subtle the examples we're looking at are. So when we show the top dataset examples, those are the most extreme examples that cause that feature to activate. And so it doesn't mean that it doesn't fire for more subtle things.
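(For readers unfamiliar with the term: "top dataset examples" for a feature are simply the dataset inputs ranked by how strongly that feature activates. A minimal sketch of that ranking is below; the names `examples`, `activations`, and `top_dataset_examples` are illustrative assumptions, not from any actual Anthropic codebase, and the key point is that the feature can still fire, more weakly, on subtler inputs below the top-k cutoff.)

```python
import numpy as np

# Hypothetical setup: `activations` holds one feature's activation value
# on each snippet in `examples`. All names here are illustrative.

def top_dataset_examples(examples, activations, k=10):
    """Return the k examples on which this feature fires most strongly.

    These are the *extreme* activations; the feature may still fire,
    more weakly, on many subtler examples below this cutoff.
    """
    order = np.argsort(activations)[::-1]  # strongest activation first
    return [(examples[i], float(activations[i])) for i in order[:k]]

# Toy usage: the feature fires hard on the first snippet, weakly on the last.
examples = ["the golden gate bridge", "a bridge in fog", "a cat on a mat"]
activations = np.array([9.7, 4.2, 0.1])
for text, act in top_dataset_examples(examples, activations, k=2):
    print(f"{act:5.1f}  {text}")
```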