top | item 47080405

Yizahi | 10 days ago

Asking LLM programs to "not do the thing" often results in them tripping up and generating output that includes that "thing", since those are exactly the tokens that enter the input. I always try to rephrase the query so that all my instructions take only "positive" forms - "do only this", "do it only in that way", "do it only for the parameters requested", etc. I can't say whether it helps much, but it is possible.
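As a sketch of what that rephrasing looks like in practice (the prompt strings below are invented examples, not from any real system), the same constraint can be stated negatively or positively:

```python
# Hypothetical example of the rephrasing described above: the prompt
# strings are illustrative, not taken from any real application.

# Negative form: the forbidden concepts ("author's name", "bullet points")
# appear verbatim in the input, so those tokens are now in context.
negative_prompt = (
    "Summarize the article. Do not mention the author's name, "
    "and do not use bullet points."
)

# Positive form: state only what the model should do; the unwanted
# behaviors are replaced by affirmative constraints, so the prompt
# never introduces the tokens it wants to avoid.
positive_prompt = (
    "Summarize the article in flowing prose paragraphs, "
    "referring to the writer only as 'the author'."
)
```

The positive version encodes the same intent, but every instruction describes the desired output rather than the undesired one.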


kolinko | 10 days ago

Which is how it works with people as well.