
Decoder with Nilay Patel

Anthropic’s Mike Krieger wants to build AI products that are worth the hype

4451.245 - 4469.876 Nilay Patel

It seems like those folks might be more inclined to sue you if you send some business haywire because the software is hallucinating. Is this something you can solve? I've had a lot of people tell me that LLMs are always hallucinating and we're just controlling the hallucinations. And I should stop asking people if they can stop hallucinating because the question doesn't make any sense.
