(no title)
simias | 2 years ago
Instead the model decides to make stuff up and pretend that it knows. That's vastly worse.
It reminds me of the early days of DuckDuckGo: if you searched for something obscure with no matches online, it would still fuzzy-match some garbage like a binary blob in a Chinese PDF, while Google would helpfully just tell you it couldn't find anything.
Cannabat | 2 years ago
Does the model know it doesn't know, though? Does "know" even make sense as a concept here? I don't know if it can really introspect like that, but of course it would be so much better if it could attach some sort of confidence score to each answer.
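To be clear, there's no real introspection available today, but you can get a very rough proxy from the per-token log-probabilities that many APIs will return for the generated text. A minimal sketch (names and example numbers are mine, just for illustration): take the geometric mean of the per-token probabilities and treat that as a crude "how sure was the model about what it just said" signal.

    import math
    from typing import List

    def answer_confidence(token_logprobs: List[float]) -> float:
        """Crude confidence proxy: geometric mean of per-token probabilities.

        token_logprobs: natural-log probabilities the model assigned to each
        token it actually generated (many APIs can return these).
        Returns a value in (0, 1]; higher means the model was, on average,
        more certain about each token it emitted.
        """
        if not token_logprobs:
            return 0.0
        avg_logprob = sum(token_logprobs) / len(token_logprobs)
        return math.exp(avg_logprob)

    # Hypothetical examples: a confidently generated answer vs. a shakier one
    print(answer_confidence([-0.05, -0.10, -0.02]))  # ~0.94
    print(answer_confidence([-1.2, -2.3, -0.9]))     # ~0.23

The catch is that this measures fluency, not truth: a model can be highly confident in every token of a completely made-up answer, which is exactly the problem being described above.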