Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Chris Olah
So if I looked at this, I try to almost see myself from the position of the model and ask: what is the exact case that I would misunderstand, or where I would just not know what to do? Then I give that case to the model and see how it responds. And if I think I got it wrong, I add more instructions, or I even add that case in as an example.
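The loop described here (probe the model with a tricky case, and if its answer is unsatisfactory, fold that case back into the prompt as instructions or a worked example) can be sketched in Python. The `model_respond` function below is a hypothetical stand-in for a real model API call, and the failure check is simplified to a string match; this is a sketch of the workflow, not any particular implementation.

```python
def model_respond(system_prompt: str, case: str) -> str:
    # Hypothetical stand-in for a real model API call.
    # Here it only "handles" a case already covered by the prompt.
    return "handled" if case in system_prompt else "unsure"


def refine_prompt(system_prompt: str, edge_cases: list[str]) -> str:
    """Probe the model with each edge case; when it fails, fold the
    case back into the prompt as an explicit worked example."""
    for case in edge_cases:
        answer = model_respond(system_prompt, case)
        if answer == "unsure":  # model didn't know what to do
            system_prompt += f"\n\nExample: {case} -> <desired behavior>"
    return system_prompt


prompt = refine_prompt(
    "You are a helpful assistant.",
    ["ambiguous pronoun reference",
     "request with conflicting constraints"],
)
```

In practice the "did it get it wrong" judgment is made by a human reading the response, not by a string match, but the shape of the iteration is the same: each failure case becomes new material in the prompt.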