gabelschlager | 3 years ago
It can't tell it's wrong, it just reacts in a plausible way to you telling it it's wrong (acknowledging it and giving another explanation). Since ChatGPT is an extension of GPT-3, those things won't really be solved, since being somewhat of a knowledge base is a nice side effect, not the main goal of the model.
MuffinFlavored | 3 years ago
So you are saying the order of events is:
1. you ask it a question
2. it gives you an answer (in this case, it's wrong)
3. you ask it "are you sure? i think that's wrong"
4. it answers again, "you are right, I think the answer I just gave you was wrong, here is what I think is the right answer this time" (and again, it's wrong)
and repeat forever?
yetanotherloser | 3 years ago