hallqv | 2 years ago

You must have a very peculiar definition of a hallucination.

Why would an LLM reciting a fact correctly be hallucinating?

resonious|2 years ago

Real hallucination is when you sense something that isn't physically there. LLMs don't sense anything, so this "hallucination" term is questionable from the get-go.

notahacker|2 years ago

I think the point is that the LLM arrives at an obvious and undeniable fact, a misconception common in human discourse, an assertion of something unknowable, a statement of what appears to be opinion, a "creative" response to a brief (both impressive and unimpressive), a reasonable "guess", and a random answer only loosely linked to the prompt in essentially the same way. Humans generally arrive at such responses in different ways, and are often conscious of when they're certain, reasonably confident, guessing, needing an answer to come out a particular way to fit their wider goals, or bullshitting.

So if it's "hallucinating" a probable continuation that asserts something humans [incidentally] understand to be completely wrong or absent from the source material, it's going through exactly the same process when it arrives at a continuation that [incidentally] is understood to contain only accurate statements or valid summarizations.

hallqv|2 years ago

If I had a penny for each time a human has confidently concluded something entirely incorrect… I've inadvertently done it countless times myself, and so has every person I know.

smeagull|2 years ago

Suppose I make a massive book of predictions. Some of which turn out to be correct.

Am I now capable of predicting the future?

Suppose I wrote the book to be as banal (i.e. highly probable) as possible.

Am I predicting the future now? And, how impressive is it?

M4v3R|2 years ago

> Suppose I make a massive book of predictions. Some of which turn out to be correct. Am I now capable of predicting the future?

If you write a book of random predictions without any insight, the vast majority of them will be false, so even if a few of them turn out right, it is not impressive, and no one would say you're capable of predicting the future.

In comparison, the OP states that GPT-4's predictions are 97% correct. And yes, I would say that is pretty impressive. If 97% of what I said about the future were correct, I would be considered a wizard and would probably be a billionaire.

hallqv|2 years ago

What you are talking about I would call guessing :)

The fact of the matter is that SOTA LLMs are highly accurate predictors across many topics, certainly above any living human in terms of total AUC of correct predictions on fact-based questions. Some humans are better on certain topics, but no one can match the total AUC, since LLMs have such breadth.

TerrifiedMouse|2 years ago

Maybe hallucination isn't the right word. "Statistical guess" is a better term IMHO.
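For what it's worth, the "statistical guess" view can be sketched in a few lines: the model assigns scores (logits) to candidate next tokens, and decoding either samples from the resulting probability distribution or greedily takes the most probable token. The candidate words and logit values below are invented for illustration and are not taken from any real model.

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution
    # (subtracting the max for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores after a prompt like
# "The capital of France is". The model produces a distribution
# like this whether or not the top candidate happens to be factual.
candidates = ["Paris", "Lyon", "Berlin", "purple"]
logits = [9.0, 3.0, 1.5, -2.0]

probs = softmax(logits)

# Greedy decoding takes the highest-probability token; sampling would
# occasionally emit a lower-probability one. Either way, the mechanism
# is the same for "correct" and "hallucinated" outputs alike.
greedy = candidates[probs.index(max(probs))]
```

Under this toy distribution the greedy choice is "Paris", and a factually wrong continuation would be produced by exactly the same sampling machinery, just from a less fortunate distribution.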