gwerbret|2 months ago
> At the heart of the problem is the tendency for AI language models to confabulate, which means they may confidently generate a false output that is stated as being factual.
"Confabulate" is precisely the correct term; I don't know how we ended up settling on "hallucinate".
tim333|2 months ago
> production of fabricated, distorted, or misinterpreted memories
doesn't fit. LLMs don't produce distorted memories; they guess random stuff and then forget it shortly after.
I think 'guess' may be more accurate.
kace91|2 months ago
Uh, TIL. This is wildly different from the Spanish meaning: confabular means to plot something bad (as in a conspiracy).
Which is a weird evolution in both languages, as the Latin root seems to mean simply “talking together”.