Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Dario Amodei
It's just very hard to control the behavior of the model, to steer the behavior of the model in all circumstances at once. There's this whack-a-mole aspect where you push on one thing and these other things start to move as well that you may not even notice or measure.