
mithametacs | 1 year ago

They’re still bad at generating any significant body of code.

They’re a little better at deciphering errors.

They’ll still bullshit* you and send you on wild goose chases.

*hallucinate if you prefer


matheusmoreira | 1 year ago

> They’ll still bullshit you and send you on wild goose chases

And confidently at that. It can't seem to find the backbone to say no to me either.

If I say something like "wait, X doesn't seem to make sense, isn't it actually Y and Z?", it will agree and reformulate the answer as if Y and Z were correct, just to placate me. I usually use the LLM to learn new things, so I don't actually know whether Y and Z apply.