top | item 40708770


gdcbe|1 year ago

Why do people talk about hallucinations? Pretty deceptive word if you ask me.

I'm not an expert, but isn't that behaviour inherent to how it works? It's a bit of a misnomer and gives people the wrong idea of what is going on here.



mariopt|1 year ago

Hallucinations reduce the success rate of AI workflows, which must be taken seriously. Imagine a workflow with 8 steps where each step/agent has a 95% success rate: the success rate of the whole workflow is only (1 - 0.05)^8 ≈ 0.66, i.e. 66%. Not bad, but not enough to replace humans yet (unless 66% makes you profitable).

The hallucinations/errors compound and can misguide decisions if you rely too much on AI.
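The compounding above can be sketched in a few lines of Python (a minimal illustration assuming independent, identically reliable steps; the function name is mine, not from the comment):

```python
def workflow_success_rate(step_success: float, steps: int) -> float:
    """Probability that all `steps` independent steps succeed,
    given each step succeeds with probability `step_success`."""
    return step_success ** steps

# 8 steps at 95% reliability each:
rate = workflow_success_rate(0.95, 8)
print(f"{rate:.2f}")  # -> 0.66
```

Note how quickly this decays: at 16 steps the same 95%-reliable agent chain drops to about 44%.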

realusername|1 year ago

Not enough to replace humans in most critical tasks, but enough to replace Google, that's for sure. My own success rate at finding information on Google these days is around 50% per query at best.

jldugger|1 year ago

Because "fabrication" seems worse, if more accurate.

firejake308|1 year ago

I prefer "confabulation," which describes the analogous human behavior where you have no idea what the objective truth actually is, so you just make up something that sounds right.

ein0p|1 year ago

Fabrication implies malicious intent or at least intentional deception. LLMs don’t have any “intent”.

nicce|1 year ago

In reality, it should sound worse so people don’t trust it so much.

But those who sell AI products don’t want that.

kristiandupont|1 year ago

It's obviously an analogy, but it seems pretty fitting to me? What would you call it?

TillE|1 year ago

Making errors, generating nonsense, being wrong. It's a catchy term but it's not accurate in any meaningful way.

phaedryx|1 year ago

What would you call it when AI doesn't have the answer so it makes stuff up (sometimes in a dangerous way)?

SAI_Peregrinus|1 year ago

Confabulating if you want a non-"vulgar" word. Bullshitting if you don't care.

jamil7|1 year ago

Bullshitting

elevaet|1 year ago

That's called bullshit.

wumbo|1 year ago

Isn’t hallucinating inherent to biological brains too?

It’s normal in small degrees even for mentally healthy individuals.

__loam|1 year ago

Stop anthropomorphizing the token generator please.

x3n0ph3n3|1 year ago

That's the accepted word to describe it making up bullshit instead of regurgitating existing information.