Zachzhao | 27 days ago

You're right - hallucinations aren't limited to citations. We see a few failure modes:

Fabricated citations: Case doesn't exist at all

Wrong citation: Case exists but doesn't say what the model claims

Misattributed holdings: Real case, real holding, but applied incorrectly to the legal question

In our internal testing, proper context engineering significantly reduces hallucination across all three modes: once the model is grounded in the relevant source documents, hallucination rates drop substantially.
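To make "grounding" concrete, here's a rough sketch of that step (illustrative only, not our production pipeline; the retriever, prompt wording, and model name are placeholders, assuming the OpenAI Python client):

    # Illustrative sketch: constrain the model to cite only the case
    # documents actually placed in its context window.
    from openai import OpenAI

    client = OpenAI()

    def answer_with_grounding(question, retrieve_cases):
        # retrieve_cases is a hypothetical retriever that returns
        # [(case_name, case_text), ...] relevant to the question.
        cases = retrieve_cases(question, top_k=5)
        sources = "\n\n".join(f"[{name}]\n{text}" for name, text in cases)
        messages = [
            {"role": "system", "content": (
                "Answer using ONLY the cases provided below, citing them "
                "by bracketed name. If the sources do not support an "
                "answer, say so instead of guessing.\n\n" + sources)},
            {"role": "user", "content": question},
        ]
        resp = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=messages,
            temperature=0,
        )
        return resp.choices[0].message.content

Because the prompt restricts the model to the retrieved sources, a fabricated citation has nothing in context to attach to, which is one reason grounding helps with the first failure mode in particular.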
