top | item 46627598


tempaccsoz5 | 1 month ago

The example in this article is prompt injection in a "skill" file. It doesn't seem unreasonable that someone looking to "embrace AI" would look up ways to make it perform better at a certain task, and assume that since a skill is just a plain text file, it must be safe to upload to a chatbot.

fathermarz | 1 month ago

I have a hard time with this one. Technical people understand what a skill is and how to upload one. If a non-technical person learns about skills, it is likely through a trusted person who is teaching them and showing them how to make their own.

As far as I know, repositories for skills are found in technical corners of the internet.

I could see a phish being used to make this happen, but the overlap between the "embrace AI" person and the person who falls for “download this file” phishes is pretty narrow IMO.

swores | 1 month ago

You'd be surprised how many people fall in the Venn overlap of being technical enough to work in a Unix shell, yet willing to follow instructions from a website they googled 30 seconds earlier telling them to paste a command that downloads a bash script and immediately executes it. That pattern is itself a surprisingly common suggestion in how-to blog posts and software help pages.
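The pattern swores describes is the curl-pipe-to-shell install one-liner. A minimal sketch of why it is risky (the URL is hypothetical, and the second half simulates the fetch locally so nothing is actually downloaded):

```shell
# The shape many install guides suggest; whatever the server returns
# is executed immediately, with no chance to inspect it first:
#   curl -fsSL https://example.com/install.sh | bash   # hypothetical URL

# Local simulation of the same shape: a "fetched" script is streamed
# straight into bash and runs with the caller's privileges.
script='echo "arbitrary code ran as $(whoami)"'
out=$(printf '%s\n' "$script" | bash)
echo "$out"
```

The point is that the shell executes the stream as it arrives; nothing about piping into `bash` distinguishes an installer from a payload.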