The problem with LLMs isn't hallucination, it's context-specific confidence (signalfire.com)