item 46732360


wpietri | 1 month ago

I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified. Humans are the ones reading the text and constructing meaning.
