
Lex Fridman Podcast

#452 – Dario Amodei: Anthropic CEO on Claude, AGI & the Future of AI & Humanity

16692.368 - 16713.498

So if we think about the car detector, the more it fires, the more we tend to interpret that as meaning the model is more and more confident that a car is present. Or, if it's some combination of neurons that represents a car, the more that combination fires, the more we think the model thinks there's a car present. But this doesn't have to be the case, right?
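To make the intuition concrete, here is a minimal sketch of the standard reading described above: a "car feature" is modeled as a direction in the model's activation space, and the strength with which the feature fires is the projection of a hidden state onto that direction. All names and numbers here (`car_direction`, `feature_activation`, the dimension) are hypothetical illustrations, not Anthropic's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 8
# Hypothetical "car feature": a unit-length direction in activation space.
car_direction = rng.normal(size=d_model)
car_direction /= np.linalg.norm(car_direction)

def feature_activation(hidden_state: np.ndarray) -> float:
    """How strongly the hypothetical 'car feature' fires on this hidden state."""
    return float(hidden_state @ car_direction)

# Two synthetic hidden states: one weakly, one strongly aligned with the feature.
weak = 0.5 * car_direction + 0.1 * rng.normal(size=d_model)
strong = 3.0 * car_direction + 0.1 * rng.normal(size=d_model)

# Under the usual reading, a larger activation is taken to mean the model is
# more confident a car is present -- the point of the passage is that this
# monotone "activation = confidence" reading is an assumption, not a guarantee.
assert feature_activation(strong) > feature_activation(weak)
```

The caveat in the transcript is exactly that this monotone interpretation is a convention: nothing forces the model to encode confidence as activation magnitude along a single direction.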
