
Lex Fridman Podcast

#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity

13921.143 - 13937.548 Chris Olah

So people, when it comes to AI alignment, will ask things like, well, whose values should it be aligned to? What does alignment even mean? And there's a sense in which I have all of that in the back of my head. I'm like, you know, there's like social choice theory. There's all the impossibility results there.
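The "impossibility results" mentioned here refer to social choice theory's classic negative findings (Arrow's theorem and related results) about aggregating individual preferences into one collective ranking. A minimal sketch of the simplest such obstacle, the Condorcet paradox, in Python; the voter profiles are illustrative, not from the episode:

```python
from itertools import combinations

# Condorcet paradox: three voters, each with a perfectly consistent
# ranking, yet pairwise majority voting produces a cycle -- so there is
# no coherent "majority preference" to align to.
voters = [
    ["A", "B", "C"],  # voter 1: A > B > C
    ["B", "C", "A"],  # voter 2: B > C > A
    ["C", "A", "B"],  # voter 3: C > A > B
]

def majority_prefers(x, y):
    """True if a strict majority of voters rank x above y."""
    wins = sum(1 for ranking in voters if ranking.index(x) < ranking.index(y))
    return wins > len(voters) / 2

# Each candidate beats one rival and loses to another: A > B, B > C, C > A.
for x, y in combinations("ABC", 2):
    winner, loser = (x, y) if majority_prefers(x, y) else (y, x)
    print(f"majority prefers {winner} over {loser}")
```

Running this shows the cycle A > B > C > A: aggregating even three well-behaved value rankings can yield no consistent collective ranking at all, which is part of why "whose values should it be aligned to?" has no clean technical answer.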
