Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Chris Olah
So in some ways it's like how I map out the model. I think that people focus a lot on these quantitative evaluations of models. And this is a thing that I've said before, but I think in the case of language models, a lot of the time, each interaction you have is actually quite high information. It's very predictive of other interactions that you'll have with the model.