60654 | 1 year ago
Hallucination is a big problem because LLMs aren't good at recognizing when they're starting to make mistakes.
But there's a lot of work in this area, and more generally on different ways of bolting neural and symbolic systems together.
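One common flavor of that combination is using a symbolic checker as a gate on neural output: the model proposes an answer, and a deterministic verifier re-derives it before it's trusted. A minimal toy sketch (the `untrusted_answer` function is a hypothetical stand-in for an LLM call, not a real API):

```python
import re

def untrusted_answer(question: str) -> str:
    # Hypothetical stand-in for an LLM call; here it confidently
    # returns a wrong sum to show the checker catching it.
    return "17 + 25 = 41"

def symbolic_check(answer: str) -> bool:
    # Symbolic side: parse the claimed equation, re-derive the
    # arithmetic exactly, and compare against the claim.
    m = re.fullmatch(r"(\d+) \+ (\d+) = (\d+)", answer)
    if not m:
        return False  # unparseable claims are rejected outright
    a, b, claimed = map(int, m.groups())
    return a + b == claimed

answer = untrusted_answer("What is 17 + 25?")
print(answer, "->", "verified" if symbolic_check(answer) else "rejected")
```

The neural half never gets the final say; anything the symbolic half can't reproduce is rejected rather than passed along, which is the basic shape of most neuro-symbolic hallucination guards.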