top | item 46049288

(no title)

dizzy3gg | 3 months ago

Why is the being downvoted?

discuss

order

jermaustin1|3 months ago

Because the article shows it isn't Gemini that is the issue, it is the tool calling. When Gemini can't get to a file (because it is blocked by .gitignore), it then uses cat to read the contents.

I've watched this with GPT-OSS as well. If the tool blocks something, it will try other ways until it gets it.

The LLM "hacks" you.

lazide|3 months ago

And… that isn’t the LLM’s fault/responsibility?

NitpickLawyer|3 months ago

Because it misses the point. The problem is not the model being in a cloud. The problem is that as soon as "untrusted inputs" (i.e. web content) touch your LLM context, you are vulnerable to data exfil. Running the model locally has nothing to do with avoiding this. Nor does "running code in a sandbox", as long as that sandbox can hit http / dns / whatever.

The main problem is that LLMs share both "control" and "data" channels, and you can't (so far) disambiguate between the two. There are mitigations, but nothing is 100% safe.

mkagenius|3 months ago

Sorry, I didn't elaborate. But "completely local" meant not doing any network calls unless specifically approved. When llm calls are completely local you just need to monitor a few explicit network calls to be sure.