pstric | 1 year ago

Adam, did you expect that behavior?

selcuka | 1 year ago

LLM hallucination is not something the author can solve with a simple hack. That being said, the LLM can inspect the prompt and give feedback (such as "not enough context, provide some pointers to help research").
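A minimal sketch of that self-inspection idea, assuming the OpenAI Python SDK; the system instruction, the model name, and the "NOT ENOUGH CONTEXT" sentinel are illustrative choices, not anything the commenter specified:

    # Sketch: ask the model to flag insufficient context instead of guessing.
    from openai import OpenAI

    client = OpenAI()

    SYSTEM = (
        "Before answering, judge whether the prompt gives you enough "
        "context. If it does not, reply only with "
        "'NOT ENOUGH CONTEXT: <what you need>' instead of guessing."
    )

    def ask(prompt: str) -> str:
        # Standard chat-completions call; model choice is an assumption,
        # any chat-capable model should work the same way here.
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": SYSTEM},
                {"role": "user", "content": prompt},
            ],
        )
        return response.choices[0].message.content

    # The caller can check for the sentinel and go gather more context
    # (e.g. pointers for research) before retrying the question.
    answer = ask("Why does the build fail?")
    if answer.startswith("NOT ENOUGH CONTEXT"):
        print("Model asked for more context:", answer)

This doesn't stop hallucination, per the comment's first point; it just gives the model a sanctioned way to push back instead of inventing an answer.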