throwaway4233 | 1 year ago
So the gaps are the only areas where the LLM can hallucinate, and if your search query concerns information that is readily available on the internet, hallucinations will be rare or absent.
Edit: I have used RAG in a project I am working on, and it's quite hard to ascertain whether the LLM actually used the information provided in the RAG documents or just made up information on its own, since even without RAG we were getting similar responses about 7 times out of 10.
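One crude way to probe the "did it use the documents?" question is to measure how much of the answer's vocabulary actually appears in the retrieved passages. This is only a rough sketch of that idea (the function name and threshold are my own, not from any library), but a low overlap score is at least a signal that the model may have answered from its parametric knowledge rather than the provided context:

```python
import re


def grounding_score(answer: str, documents: list[str]) -> float:
    """Fraction of answer tokens that also appear in the retrieved documents.

    A crude grounding proxy: a high score means the answer *could* have been
    drawn from the documents; a low score suggests the model answered from
    its own training data instead of the RAG context.
    """
    def tokenize(text: str) -> set[str]:
        return set(re.findall(r"[a-z0-9]+", text.lower()))

    answer_tokens = tokenize(answer)
    if not answer_tokens:
        return 0.0
    doc_tokens: set[str] = set()
    for doc in documents:
        doc_tokens |= tokenize(doc)
    return len(answer_tokens & doc_tokens) / len(answer_tokens)


docs = ["The Eiffel Tower was completed in 1889 and is 330 metres tall."]
print(grounding_score("The Eiffel Tower was completed in 1889.", docs))  # high overlap
print(grounding_score("It opened in 1925 after a fire.", docs))          # low overlap
```

Token overlap can't distinguish paraphrase from fabrication, so in practice you would pair it with spot checks or an entailment model, but it is cheap enough to run on every response.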