gabelschlager | 3 years ago

It's a language model trained to return the most likely answer given a specific prompt. The goal of the answer is to simulate a conversation and to sound realistic. ChatGPT is not grounded in reality: it does not know where its "knowledge" comes from, nor does it know whether the things it is saying are correct or wrong.

It can't tell it's wrong; it just reacts in a plausible way to you telling it it's wrong (acknowledging it and giving another explanation). Since ChatGPT is an extension of GPT-3, those things won't really be solved: being somewhat of a knowledge base is a nice side effect, not the main goal of the model.
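
A minimal sketch of what "trained to return the most likely answer" means (the vocabulary and probabilities below are made up for illustration; a real model scores tens of thousands of tokens with a neural net):

    import random

    # Toy next-token distribution. The model only knows which continuation
    # is *likely* given the prompt, not which one is *true*.
    def next_token_probs(prompt):
        # Hypothetical lookup; a real model computes this with a neural net.
        return {"Paris": 0.7, "Lyon": 0.2, "Berlin": 0.1}

    def generate(prompt):
        probs = next_token_probs(prompt)
        tokens = list(probs.keys())
        weights = list(probs.values())
        # Sample in proportion to probability. Note there is no
        # truth-checking step anywhere in this process.
        return random.choices(tokens, weights=weights)[0]

    print("The capital of France is", generate("The capital of France is"))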

MuffinFlavored | 3 years ago

> It can't tell it's wrong

So you are saying the order of events is:

1. you ask it a question

2. it gives you an answer (in this case, it's wrong)

3. you ask it "are you sure? i think that's wrong"

4. it answers again "you are right, i think the answer i just gave you was wrong, here is what i think is the right answer this time" (and again, it's wrong)

and repeat forever?
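
In sketch form (the function and canned replies below are made up to illustrate the loop, not a real API), that sequence looks like:

    import itertools

    # Hypothetical stand-in for the model: it emits the next
    # plausible-sounding reply, with no notion of correctness.
    canned_replies = itertools.cycle([
        "The answer is X.",
        "You are right, my previous answer was wrong. It is actually Y.",
        "You are right, my previous answer was wrong. It is actually Z.",
    ])

    def sample_continuation(transcript):
        # A real model samples likely text given the transcript so far;
        # neither version ever checks the answer against reality.
        return next(canned_replies)

    transcript = "User: <question>\n"
    for _ in range(3):  # "repeat forever", bounded here for the demo
        transcript += "Assistant: " + sample_continuation(transcript) + "\n"
        transcript += "User: are you sure? i think that's wrong\n"

    print(transcript)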

yetanotherloser | 3 years ago

Thank you, this is a really clear way of explaining it, and a careful reading would be helpful to the many people who think this is something it isn't.