Lex Fridman Podcast
#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity
Dario Amodei
Yes. All of those. In particular, linear scaling up of bigger networks, bigger training times, and more data. All of these things together are almost like a chemical reaction: you have three ingredients in the chemical reaction, and you need to linearly scale up the three ingredients. If you scale up one and not the others, you run out of the other reagents and the reaction stops.
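The "run out of reagents" intuition maps onto published compute-optimal scaling fits. Below is a minimal sketch, assuming a Chinchilla-style loss curve with the fitted constants from Hoffmann et al. (2022) rather than anything stated in the conversation, showing how scaling parameters alone plateaus while balanced scaling keeps improving.

```python
# A toy Chinchilla-style loss model L(N, D) = E + A / N**alpha + B / D**beta,
# where N is parameter count and D is training tokens. Constants are the
# illustrative fits reported by Hoffmann et al. (2022), used here only to
# show the bottleneck effect described above.

def toy_loss(n_params: float, n_tokens: float) -> float:
    E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

base_n, base_d = 1e9, 2e10  # hypothetical starting point: 1B params, 20B tokens

for scale in [1, 10, 100, 1000]:
    balanced = toy_loss(base_n * scale, base_d * scale)      # all ingredients scaled
    params_only = toy_loss(base_n * scale, base_d)           # data held fixed
    print(f"{scale:>5}x  balanced={balanced:.3f}  params-only={params_only:.3f}")

# Scaling only the parameters plateaus near E + B / base_d**beta:
# the fixed data budget is the reagent that runs out.
```

Running this, the balanced column keeps dropping with every 10x of scale, while the params-only column stalls once the data term dominates, which is the "reaction stops" behavior in the analogy.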