(no title)
jurgenburgen | 15 days ago
I think calling the incorrect output of an LLM a “hallucination” is too kind to the companies creating these models, even if it’s technically accurate. “Being lied to” would be a more accurate description of how the end user feels.
webXL | 15 days ago
Lying is deliberately deceiving, but yeah, to a reader, who in effect is a trusting customer paying with part of their attention diverted to advertising, broadcasting a hallucination is essentially the same thing.