mithametacs | 1 year ago
They’re a little bit better about deciphering errors.
They’ll still bullshit* you and send you on wild goose chases.
*hallucinate if you prefer
matheusmoreira | 1 year ago
And confidently at that. It can't seem to find the backbone to say no to me either.
If I say something like "wait, X doesn't seem to make sense, isn't it actually Y and Z?" it will agree and reformulate the answer as if Y and Z were correct, just to placate me. I usually use the LLM to learn new things, so I don't actually know whether Y and Z apply.