top | item 40729320

delish | 1 year ago

We need a word other than "hallucinate" or "bullshit", because the LLM executes the same functionality whether its answer turns out correct or incorrect. It doesn't _know_ the correct answer in either case.
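The point can be made concrete with a toy sketch (a hypothetical minimal sampler, not any real model's code): the only thing the model does is turn logits into a distribution and sample. Correctness is judged entirely outside that code path, by the observer.

```python
import math
import random

def sample_next_token(logits):
    # The model's whole job: softmax the logits and sample.
    # Nothing in this function checks truth or falsity.
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

# Hypothetical vocabulary and logits for "The capital of France is"
vocab = ["Paris", "Lyon", "London"]
logits = [3.0, 1.0, 0.5]

token = sample_next_token(logits)
# "Correct" is an external judgment; the sampling above ran
# identically whichever token came out.
correct = vocab[token] == "Paris"
```

The same `sample_next_token` call produces both the "hallucinated" and the "factual" output; only the observer's comparison differs.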
