top | item 43226763

jwjohnson314 | 1 year ago

I think of hallucinating as a phenomenon where the model makes up something that appears correct but isn’t. Citations to papers that don’t exist, for example. Regurgitating training data (which may or may not be correct) is a different issue.
