top | item 45289593

(no title)

dansmith1919 | 5 months ago

I think they mean prompt injection rather than some malformed image to trigger a security bug in the processing library

discuss

order

catlifeonmars|5 months ago

The LLM is the image processing library in this case so you are both right :)