dawatchusay | 1 year ago

Is confabulation different from hallucination? If not, I suppose this is a more accurate term for the phenomenon, except that its exact definition isn't common sense without looking it up, whereas "hallucination" is more widely understood.

curtis3389 | 1 year ago

When speaking about LLMs, confabulation and hallucination refer to the same thing. The term "confabulation" is just the most accurate description of what's happening, whereas the term "hallucination" refers to something LLMs are fundamentally incapable of.

amenhotep | 1 year ago

Some people seem to get very angry about calling it "hallucination", because it's a computer, computers can't hallucinate! Stop anthropomorphising it!!

So I suppose if you want to stay on the right side of those people - or you are one - you call it confabulation instead.

williamcotton | 1 year ago

There’s also the position that a definition of confabulate…

To fill in gaps in one's memory with fabrications that one believes to be facts.

…is much more accurate.

Since we’re talking about a technical process it helps to be more precise in our use of language.

CGamesPlay | 1 year ago

From the paper <https://www.nature.com/articles/s41586-024-07421-0>:

> Here we develop new methods grounded in statistics, proposing entropy-based uncertainty estimators for LLMs to detect a subset of hallucinations—confabulations—which are arbitrary and incorrect generations.
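The paper's core idea can be sketched in a few lines: sample several answers to the same question, cluster them by meaning, and compute the entropy of the cluster distribution. High entropy (arbitrary, varying answers) flags a likely confabulation. This is a minimal illustration, not the paper's method: the real approach clusters by bidirectional entailment using an NLI model and weights clusters by token probabilities, whereas here normalized exact match stands in for semantic clustering.

```python
import math
from collections import Counter

def semantic_entropy(answers):
    """Entropy over meaning-clusters of sampled answers.

    Naive stand-in: cluster by normalized exact match rather than
    the paper's bidirectional-entailment clustering.
    """
    clusters = Counter(a.strip().lower() for a in answers)
    n = len(answers)
    return -sum((c / n) * math.log(c / n) for c in clusters.values())

# Consistent samples -> entropy 0: the model "commits" to one answer.
low = semantic_entropy(["Paris", "paris", "Paris", "Paris"])

# Arbitrary, disagreeing samples -> high entropy: a confabulation signal.
high = semantic_entropy(["Paris", "Lyon", "Marseille", "Nice"])
```

The key design point is that entropy is computed over meanings, not surface strings, so paraphrases of the same correct answer do not inflate the uncertainty estimate.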

todd8 | 1 year ago

Humans too can be committed to beliefs that are not true. I have a friend who believes in and regularly consults her "clairvoyant". I wonder if our AI assistants in the future will be vulnerable to suspicions or popular fantasies about the world, people, or even other AIs they interact with.

techostritch | 1 year ago

Isn’t commitment to beliefs that aren’t true part of the value of intelligence? Right now, multiple billion-dollar companies are being built on different theories of the future of AI. They can’t all be true.