
Lex Fridman Podcast

#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity

1861.307 - 1875.133 Dario Amodei

On the inside, there's no reason why the models should be designed for us to understand them, right? They're designed to operate. They're designed to work, just like the human brain or human biochemistry. They're not designed for a human to open up the hatch, look inside, and understand them.
