
Lex Fridman Podcast

#387 – George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God

3056.824 - 3076.468 George Hotz

Every layer of the stack. Every layer of the stack, removing Turing completeness allows you to reason about things, right? That's why you need to do branch prediction in a CPU, and why it's prediction: branch predictors are, I think, like 99% accurate on CPUs. Why do they get 1% wrong? Well, they get 1% wrong because you can't know. Right?
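A minimal sketch of the point being made, not anything from the episode itself: a toy 2-bit saturating-counter predictor (one common textbook scheme, assumed here for illustration) learns a regular loop-closing branch almost perfectly, but a branch driven by unpredictable data stays near coin-flip accuracy, because its direction genuinely can't be known ahead of time.

```python
import random

def simulate(branch_outcomes):
    """Simulate a 2-bit saturating-counter predictor on one branch.

    Counter states 0-1 predict not-taken, 2-3 predict taken.
    Returns the fraction of correct predictions.
    """
    counter = 2  # start in "weakly taken"
    correct = 0
    for taken in branch_outcomes:
        predicted_taken = counter >= 2
        if predicted_taken == taken:
            correct += 1
        # Move the counter toward the actual outcome, saturating at 0 and 3.
        counter = min(counter + 1, 3) if taken else max(counter - 1, 0)
    return correct / len(branch_outcomes)

# A loop-closing branch: taken 99 times, then not taken once, repeated.
loop_branch = ([True] * 99 + [False]) * 100
print(f"loop branch:   {simulate(loop_branch):.1%} correct")   # roughly 98-99%

# A branch whose direction depends on effectively random data: nothing to learn.
random.seed(0)
data_branch = [random.random() < 0.5 for _ in range(10_000)]
print(f"random branch: {simulate(data_branch):.1%} correct")   # roughly 50%
```

The first branch is the kind predictors handle at ~99%; the second is the residual 1%-style case where, as Hotz says, you can't know.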
