Hallucinations reduce the success rate of AI workflows, and that must be taken seriously.
Imagine a workflow with 8 steps where each step/agent has a 95% success rate: the overall success rate is only 0.95^8 ≈ 0.66, i.e. about 66%. Not bad, but not enough to replace humans yet (unless 66% makes you profitable).
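The compounding math above can be sketched in a few lines (a minimal illustration; the function name and numbers are just the example from the comment, not from any real product):

```python
def workflow_success_rate(per_step_rate: float, steps: int) -> float:
    """Probability that every step in a sequential workflow succeeds,
    assuming steps fail independently."""
    return per_step_rate ** steps

# 8 steps at 95% each compound down to roughly 66%.
rate = workflow_success_rate(0.95, 8)
print(f"{rate:.2%}")  # ~66.34%
```

The point is that per-step reliability multiplies, so even small error rates dominate once a chain gets long.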
The hallucinations/errors compound and can misguide decisions if you rely too much on AI.
Not enough to replace humans in most critical tasks, but certainly enough to replace Google. My own success rate at finding information on Google these days is around 50% per query, at best.
I prefer "confabulation," which describes the analogous human behavior: you have no idea what the objective truth actually is, so you just make up something that sounds right.
nicce|1 year ago
But those who sell AI products don’t want that.
wumbo|1 year ago
It’s normal in small degrees even for mentally healthy individuals.