Lex Fridman Podcast
#387 – George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God
George Hotz
Every layer of the stack. Every layer. At every layer of the stack, removing Turing completeness allows you to reason about things, right? That's the reason you need to do branch prediction in a CPU, and the reason it's *prediction*. Branch predictors are, I think, like 99% accurate on modern CPUs. Why do they get 1% wrong? Well, they get 1% wrong because you can't know. Right?
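The point about predictors being right ~99% of the time but never 100% can be illustrated with a small simulation. The sketch below (not from the conversation; a common textbook model) simulates a 2-bit saturating-counter branch predictor on two branch streams: a regular loop-exit branch, where the predictor approaches 99% accuracy, and a data-dependent branch on random values, where no predictor can do much better than a coin flip.

```python
import random

def simulate_2bit_predictor(branches):
    """Simulate a 2-bit saturating-counter branch predictor.

    States 0-1 predict not-taken, states 2-3 predict taken.
    Returns the fraction of branches predicted correctly.
    """
    state = 2  # start weakly predicting "taken"
    correct = 0
    for taken in branches:
        predicted_taken = state >= 2
        if predicted_taken == taken:
            correct += 1
        # Update the counter toward the actual outcome, saturating at 0 and 3.
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return correct / len(branches)

# A loop-exit branch: taken 99 times, not taken once, repeated.
# The predictor only misses the single exit each cycle -> ~99% accuracy.
loop_branches = ([True] * 99 + [False]) * 100

# A branch that depends on effectively random data: inherently unpredictable.
rng = random.Random(0)  # seeded for reproducibility
random_branches = [rng.random() < 0.5 for _ in range(10000)]

print(f"loop branch accuracy:   {simulate_2bit_predictor(loop_branches):.3f}")
print(f"random branch accuracy: {simulate_2bit_predictor(random_branches):.3f}")
```

The loop-exit case lands right around the 99% figure mentioned above; the random-data case hovers near 50%, which is the "you can't know" part: the 1% of mispredictions comes from branches whose outcome carries genuine information the hardware cannot anticipate.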