three14 | 3 years ago

Maybe it knows the answer, but since it was trained on the internet, it's trolling you.

dx034 | 3 years ago

Is there any way to know if the model is "holding back" knowledge? Could it have knowledge that it doesn't reveal to any prompt, and if so, is there any other way to find out? Or can we always assume it will reveal all of its knowledge at some point?