Lex Fridman Podcast
#387 – George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God
George Hotz
Yeah, ReLU. Almost all activation functions are unary ops, and any combination of unary ops together is still a unary op. Then you have binary ops. Binary ops are things like pointwise addition, multiplication, division, and compare; they take in two tensors of equal size and output one tensor. Then you have reduce ops.
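A minimal sketch of the three op categories he describes, using NumPy as a stand-in for tinygrad's Tensor (the variable names here are illustrative, not tinygrad's API):

```python
import numpy as np

a = np.array([[-1.0, 2.0], [3.0, -4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])

# Unary op: elementwise, one tensor in, one tensor of the same shape out.
relu = np.maximum(a, 0)

# Composing unary ops still yields a unary op (e.g. exp followed by log).
composed = np.log(np.exp(a))

# Binary ops: pointwise, two equal-size tensors in, one tensor out.
added = a + b
multiplied = a * b
divided = a / b
compared = a < b

# Reduce op: collapses a tensor along an axis, so the output is smaller.
summed = a.sum(axis=0)
```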