top | item 43794165


defyonce | 10 months ago

Just tell them something nonsensical. They can't take the hint and simply continue with the nonsense; they get stuck in a local minimum. All of them: video, image, and text models alike. I haven't seen an LLM that can take a hint and grasp the hidden meaning in the absurdity of the follow-up.

There is an infinitely larger set of prompts that will break a model than prompts that won't break it.

You just have to search outside of the most probable space.
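The "outside of most probable space" idea can be sketched with a toy scorer: rank candidate prompts by how likely they look under a simple character-bigram model, and note that random strings score far below ordinary text. This is only an illustration; a real probe would score candidates against the target model's own token likelihoods, and the corpus and function names here are made up for the example.

```python
import math
import random

# Tiny "typical text" corpus standing in for the model's training distribution.
CORPUS = (
    "the quick brown fox jumps over the lazy dog "
    "language models predict the next token from the context "
    "most prompts people type look like ordinary text "
)

def train_bigrams(text):
    """Count character-bigram occurrences in the corpus."""
    counts = {}
    for a, b in zip(text, text[1:]):
        counts.setdefault(a, {}).setdefault(b, 0)
        counts[a][b] += 1
    return counts

def avg_logprob(text, counts, alpha=1.0, vocab=27):
    """Laplace-smoothed average log-probability per bigram (26 letters + space)."""
    total, n = 0.0, 0
    for a, b in zip(text, text[1:]):
        row = counts.get(a, {})
        total += math.log((row.get(b, 0) + alpha) / (sum(row.values()) + alpha * vocab))
        n += 1
    return total / max(n, 1)

def random_prompt(length, rng):
    """Sample a uniformly random string: a point far outside the probable space."""
    letters = "abcdefghijklmnopqrstuvwxyz "
    return "".join(rng.choice(letters) for _ in range(length))

counts = train_bigrams(CORPUS)
rng = random.Random(0)
natural = "the model predicts the next token"
gibberish = random_prompt(len(natural), rng)

# Ordinary text sits in high-probability territory; random strings do not.
print(avg_logprob(natural, counts), avg_logprob(gibberish, counts))
```

The gap between the two scores is the point: the space of low-likelihood inputs is enormous compared with the sliver of "ordinary" prompts, which is why random search there finds breaking inputs so easily.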
