
Lex Fridman Podcast

#387 – George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God

3253.069 - 3274.926 George Hotz

Yeah, ReLU. Almost all activation functions are unary ops, and any combination of unary ops is still a unary op. Then you have binary ops. Binary ops are pointwise addition, multiplication, division, compare: they take in two tensors of equal size and output one tensor. Then you have reduce ops.
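A minimal sketch of the op taxonomy George describes, written against the tinygrad Tensor API (the method names here, like relu, exp, and sum, are taken from tinygrad's public Tensor interface; verify against the repo for your version):

```python
# Sketch of the three op categories: unary, binary, reduce.
from tinygrad.tensor import Tensor

a = Tensor([[1.0, -2.0], [3.0, -4.0]])
b = Tensor([[0.5, 0.5], [0.5, 0.5]])

# Unary op: one tensor in, one tensor of the same shape out.
# ReLU, like almost all activation functions, is unary.
relu_out = a.relu()

# Composing unary ops still yields a unary op (e.g. exp after relu).
composed = a.relu().exp()

# Binary ops: pointwise add / multiply / divide / compare on two
# tensors of equal size, producing one tensor of that size.
added = a + b
multiplied = a * b

# Reduce op: collapses a tensor to fewer elements (here, to a scalar).
total = a.sum()

print(relu_out.numpy(), added.numpy(), total.numpy())
```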
