Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Chris Olah
I think it's a thing that has to be handled with extreme care, for many reasons. One is, for example, if you have the models changing like this, you probably don't want people forming long-term attachments to something that might change with the next iteration.