Lex Fridman Podcast
#433 – Sara Walker: Physics of Life, Time, Complexity, and Aliens
Sara Walker
In some sense, AGI is a universal explainer, but it might be that a computer is much more efficient at doing, I don't know, prime factorization or something than a human is. That doesn't mean it's necessarily smarter, or that it has a broader reach in the kinds of things it can understand than a human does. And so I think we really have to think about: is it a level shift