
pstric | 1 year ago

Adam, did you expect that behavior?


selcuka | 1 year ago

LLM hallucination is not something the author can solve with a simple hack.

That being said, the LLM can inspect the prompt and give feedback (e.g. "not enough context; provide some pointers to help with the research").
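
As a rough illustration of that idea, here is a minimal sketch of a pre-flight check that asks the model to judge whether a prompt contains enough context before answering it. It assumes the OpenAI Python client; the model name and the exact instruction wording are placeholders, not anything the original comment specifies.

    # Sketch: ask the model to assess the prompt before answering it.
    # Model name and system instruction are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def check_prompt(prompt: str) -> str:
        """Return the model's assessment of whether the prompt is answerable."""
        review = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; substitute your own
            messages=[
                {
                    "role": "system",
                    "content": (
                        "Before answering, judge the user's prompt. "
                        "If it lacks enough context to answer reliably, reply "
                        "'NOT ENOUGH CONTEXT' followed by pointers on what to "
                        "provide. Otherwise reply 'OK'."
                    ),
                },
                {"role": "user", "content": prompt},
            ],
        )
        return review.choices[0].message.content

    print(check_prompt("Adam, did you expect that behavior?"))

This doesn't prevent hallucination, but it gives the caller a chance to gather more context before committing to a real answer.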