top | item 46536372 (no title) kahnclusions | 1 month ago I’m not convinced LLMs can ever be secured, prompt injection isn’t going away since it’s a fundamental part of how an LLM works. Tokens in, tokens out. discuss order hn newest No comments yet.
No comments yet.