top | item 36910464

jarofghosts | 2 years ago

Hallucinating is roughly how they work; we just label it as such when the output is something obviously weird.

thewataccount | 2 years ago

This is something I'm not sure people understand.

LLMs only make a "best guess" for each next token. That's it. When it's wrong we call it a "hallucination", but really the entire thing was a "hallucination" to begin with.
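That "best guess per token" mechanic can be sketched with a toy example. This is not a real LLM; the vocabulary and logit scores below are made up for illustration. The point is that greedy decoding and sampling both just pick from the same next-token distribution, so a "hallucinated" token comes from the exact same step as a correct one.

```python
import math
import random

# Hypothetical vocabulary and model scores (logits) for the next token
# after some context like "The capital of France is". Made-up numbers.
vocab = ["Paris", "London", "Rome", "banana"]
logits = [4.0, 2.0, 1.5, -1.0]

def softmax(xs):
    """Turn raw scores into a probability distribution over tokens."""
    m = max(xs)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)

# Greedy decoding: always take the single most probable token.
greedy = vocab[probs.index(max(probs))]

# Sampling: every token with nonzero probability can come out,
# including "banana". When that happens we call it a hallucination,
# but mechanically it's the same best-guess step as picking "Paris".
random.seed(0)
sampled = random.choices(vocab, weights=probs, k=1)[0]

print(greedy, sampled, [round(p, 3) for p in probs])
```

Note that "banana" still gets a small but nonzero probability; nothing in the mechanism distinguishes a right answer from a wrong one except the scores the model happened to assign.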

This is also analogous to humans, who likewise "hallucinate" incorrect answers, and who usually do so less when prompted to "think through this step by step before giving your answer", etc.