item 46726285


dankwizard | 1 month ago

I was sick of my AI hallucinating, so I added in the system prompts "Do not hallucinate". Just a quick glimpse into my prompt engineering mind


BobbyLLM | 1 month ago

No. Because that works about as well as telling a fat kid not to eat cake.

Prompts shape style, not epistemics.

The fix is to move the problem out of “please behave” and into hard constraints.

Nice drive-by tho.

PS: You understand this is enforced outside the model, right? Or are you here just to try and dunk on someone?
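To make "enforced outside the model" concrete: a minimal sketch of a post-hoc grounding check, where a wrapper refuses any answer whose sentences don't overlap the retrieved context. The function name and the 0.7 threshold are made up for illustration; real systems use entailment models or citation verification, not word overlap.

```python
import re

# Crude sketch of a hard constraint enforced outside the model:
# the model can say whatever it wants, but the wrapper only emits
# answers whose sentences are grounded in the source context.
def grounded(answer: str, context: str) -> bool:
    """Accept the answer only if each sentence mostly overlaps the context."""
    ctx_words = set(re.findall(r"[a-z0-9]+", context.lower()))
    sentences = [s for s in answer.split(".") if s.strip()]
    for s in sentences:
        words = set(re.findall(r"[a-z0-9]+", s.lower()))
        # hypothetical threshold: 70% of the sentence's words must
        # appear in the source context, or the answer is rejected
        if len(words & ctx_words) < 0.7 * len(words):
            return False
    return True

context = "The Eiffel Tower is in Paris and was completed in 1889."
print(grounded("The Eiffel Tower is in Paris.", context))                    # True
print(grounded("The Eiffel Tower is in Berlin, built by aliens.", context))  # False
```

The point is that the check runs after generation, in ordinary code, so no amount of prompt wording can bypass it.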