(no title)
oaktowner | 1 year ago
Calling it "hallucination" implies that there are (other) moments when it is understanding the world correctly -- and that itself is not true. At those moments, it is a word generator that is generating words that DO make sense.
At no point is this a consciousness, and anthropomorphizing it gives the impression that it is one.
JohnFen | 1 year ago
krapp | 1 year ago
There really is no correct word to describe what's happening, because LLMs are effectively philosophical zombies. We have no metaphors for an entity that can appear to hold a coherent conversation, do useful work, and respond to commands, yet not think. All we have are metaphors from human behavior, which presume a connection between language and intellect, because that's all we know. Unfortunately we also have nearly a century of pop culture telling us "AI" is like Data from Star Trek: perfectly logical, superintelligent, and always correct.
And "hallucination" is good enough. It gets the point across, that these things can't be trusted. "Confabulation" would be better, but fewer people know it, and it's more important to communicate the untrustworthy nature of LLMs to the masses than it is to be technically precise.