top | item 37381480

simias | 2 years ago

Perhaps, but I wouldn't mind if the models just answered "I'm sorry, but I don't have an answer to your question at this time". In fact I think that would be a great answer, and it would increase the amount of trust I have in ChatGPT.

Instead the model decides to make stuff up and pretend that it knows. That's vastly worse.

It reminds me of the early days of DuckDuckGo, when if you searched for something obscure with no matches online it would still fuzzy match some garbage like a binary blob in a Chinese PDF while Google helpfully would just tell you that it couldn't find anything.

Cannabat | 2 years ago

> Instead the model decides to make stuff up and pretend that it knows. That's vastly worse.

Does the model know it doesn't know, though? Does "know" even make sense as a concept here? I don't know if it can really introspect like that, but of course it would be so much better if it could attach some sort of confidence score to each answer.
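One common proxy for such a per-answer confidence score is the geometric mean of the probabilities the model assigned to the tokens it actually sampled: if several tokens were low-probability guesses, the score drops. A minimal sketch (the function name and the example log-probabilities are illustrative, not from any particular API):

```python
import math

def sequence_confidence(token_logprobs):
    """Geometric-mean probability of the sampled tokens.

    Averages the per-token log-probabilities and exponentiates,
    yielding a 0..1 score that penalizes low-confidence tokens.
    """
    if not token_logprobs:
        return 0.0
    return math.exp(sum(token_logprobs) / len(token_logprobs))

# A confident answer: every token sampled with high probability.
confident = [math.log(0.9)] * 5
# A shaky answer: a couple of tokens the model was unsure about.
shaky = [math.log(0.9), math.log(0.2), math.log(0.3), math.log(0.9)]

print(sequence_confidence(confident))  # high, close to 0.9
print(sequence_confidence(shaky))      # noticeably lower
```

This only measures how confidently the model sampled, not whether the answer is factually correct, which is arguably the commenter's point: fluent text can score high while still being made up.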