VSerge | 5 months ago

Wouldn't you know whether a teacher is reliable or not? If reliable, they probably have this reputation in part because they can say when they don't know something. And if you found out a given teacher isn't reliable, you'd be careful about what they say next - or you would just ask someone else.

The problem here is a child believing this system is reliable when it is not. For now, the lack of reliability is obvious, as ChatGPT hallucinates on a very regular basis. However, this will become much harder to notice if/when ChatGPT becomes almost reliable while still stating wrong things with complete confidence. Should such models become able to say reliably when they don't know something, that would be a big step toward addressing this specific objection, but it still wouldn't solve the other problems I mentioned.
