Lex Fridman Podcast
#387 – George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God
George Hotz
Only loads and stores that are known before the program runs. And you look at neural networks today, and 95% of neural networks fit the DSP paradigm. They are just statically scheduled adds and multiplies. So tinygrad really took this idea, and I'm still working on it, to extend this as far as possible. Every stage of the stack has Turing completeness.
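A minimal sketch of the point being made (illustrative only, not tinygrad code): a dense neural-network layer is nothing but statically scheduled multiplies and adds. Every memory access is determined by the tensor shapes, which are known before the program runs, so there is no data-dependent control flow anywhere in the computation.

```python
# Sketch, not tinygrad's actual implementation: a neural-net layer as
# statically scheduled adds and multiplies. The full schedule of
# operations is fixed by the shapes alone -- no branches, no loads or
# stores whose addresses depend on the data values.
import numpy as np

def dense_layer(x, w, b):
    # One matmul plus a bias add; the op sequence is identical for
    # every possible input value of the same shape.
    return x @ w + b

x = np.array([[1.0, 2.0]])    # shape (1, 2), known ahead of time
w = np.array([[3.0], [4.0]])  # shape (2, 1)
b = np.array([0.5])
y = dense_layer(x, w, b)      # 1*3 + 2*4 + 0.5 = 11.5
```

Because the schedule is static, a compiler can plan every load, store, multiply, and add up front, which is exactly the DSP-style property being contrasted with Turing-complete general-purpose code.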