top | item 47065917

(no title)

Legend2440 | 11 days ago

This is just the price of being on the bleeding edge.

Unfortunately, prompt injection does strongly limit what you can safely use LLMs for. But people are willing to accept the limitations because they do a lot of really awesome things that can't be done any other way.

They will figure out a solution to prompt injection eventually, probably by training LLMs in a way that separates instructions and data.

discuss

order

No comments yet.