(no title)
luke-stanley | 4 months ago
Thanks for the archive link and the very useful term BTW! I also got 503 when trying to visit.
luke-stanley | 4 months ago
Thanks for the archive link and the very useful term BTW! I also got 503 when trying to visit.
simonw|4 months ago
The first AI lab to solve unrelated instruction following is going to have SUCH a huge impact.
hshdhdhehd|4 months ago
MattPalmer1086|4 months ago
A fundamental vulnerability to prompt injection means pretty much any output can be dangerous, and they have to expose it to largely untrusted input to be useful at all.
Even limiting output to ASCII text only is probably not entirely safe.
The right way at this point would be to not use AI.
luke-stanley|4 months ago