The example in this article is prompt injection in a "skill" file. It doesn't seem unreasonable that someone looking to "embrace AI" would look up ways to make it perform better at a certain task, and assume that since it's a plain text file it must be safe to upload to a chatbot.
fathermarz|1 month ago
As far as I know, repositories for skills are found in technical corners of the internet.
I could understand a phish as a way to make this happen, but the crossover between the "embrace AI" crowd and people who fall for "download this file" phishes is pretty narrow IMO.
swores|1 month ago