top | item 46624434

(no title)

lacunary | 1 month ago

so, train the llms by sending them fake prompt injection attempts once a month and then requiring them to perform remedial security training if they fall for it?

discuss

order

No comments yet.