Decoder with Nilay Patel
Anthropic’s Mike Krieger wants to build AI products that are worth the hype
Nilay Patel
It seems like those folks might be more inclined to sue you if you send some business haywire because the software is hallucinating. Is this something you can solve? I've had a lot of people tell me that LLMs are always hallucinating, that we're just controlling the hallucinations, and that I should stop asking people if they can stop the hallucinations because the question doesn't make any sense.