top | item 45255801

afspear | 5 months ago

The article says: "Consider the implications if ChatGPT started saying 'I don't know' to even 30% of queries – a conservative estimate based on the paper's analysis of factual uncertainty in training data. Users accustomed to receiving confident answers to virtually any question would likely abandon such systems rapidly."

Maybe. But not me. I would trust it more, and rely on it even more. I can work with someone who says "I don't know" but is super smart. And I'll bet more people would do the same. Over time, the system might enjoy the rewards of communal trust over and above what it currently enjoys.

However, in the long run this could lead to a more dystopian version of what might happen now: we might all give it blind trust precisely because everyone trusts it. Given a decade or so of that, and then the system going wrong... yikes.

For now we have to grapple with the standing advice that "ChatGPT can make mistakes. Check important info." And we do. Because we have to, or at least some of us do. And that is a good thing.
