rexf | 2 years ago

"Hallucinations" is too charitable of a term for making stuff up. It personifies "AI" which doesn't need help selling its "intelligence".

_nalply | 2 years ago

You could use "confabulate" instead, couldn't you? "Making stuff up" is very human and would personify "AI" even more.

dragonwriter | 2 years ago

Confabulations are also very human, but "confabulation" is a much better metaphor than "hallucination" for the actual errors LLMs make.

dragonwriter | 2 years ago

Hallucinations are false current sense perceptions. LLMs don’t have senses and don’t hallucinate at all; the LLM errors described as “hallucinations” are closer, if one needs an anthropomorphizing metaphor, to confabulations.

Which kind of makes sense: LLMs have almost no memory, just an instinct to respond, some instinctual responses (the result of “training”, which is itself a bad metaphor; only “in-context learning” is analogous to training/learning in humans, while what is called “training” is guided evolution of frozen instincts), and whatever is in their context window. And lack of memory plus a prompt to respond is a major context in which confabulations happen in humans (these are specifically called “provoked confabulations”).