
daenney | 1 year ago

You can’t fix it. LLMs making things up is a consequence of what they are.

The blog post admits as much, because the “fix it” section is actually titled “How To Mitigate AI Hallucination?”

Mitigation makes something bad less severe. But that’s not fixing it.
