top | item 46142934

samarthr1 | 2 months ago

I remember reading another comment a while ago arguing that you can only trust an LLM with sensitive info if you can guarantee that the output will only be viewed by people who already had access to that info, or who cannot control any of the inputs to the LLM.
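That rule can be sketched as a simple guard: for every person who will see the output, require that they either already had access to the sensitive data or have no way to influence the prompt. A minimal sketch (all names and the set-based access model are hypothetical, not from any real library):

```python
def safe_to_use_llm(output_viewers: set[str],
                    had_prior_access: set[str],
                    input_controllers: set[str]) -> bool:
    """Hypothetical check for the rule above: each viewer of the LLM's
    output must either already have had access to the sensitive info,
    or be unable to control any input to the LLM."""
    return all(viewer in had_prior_access or viewer not in input_controllers
               for viewer in output_viewers)

# alice already saw the data, so showing her the output is fine even
# though she also controls the prompt:
safe_to_use_llm({"alice"}, {"alice"}, {"alice"})   # True

# bob never had access AND can steer the prompt -> injection risk:
safe_to_use_llm({"bob"}, set(), {"bob"})           # False
```

The point of the second case is prompt injection: someone who both controls an input and sees the output can potentially steer the model into exfiltrating the sensitive data to them.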

undefeated | 2 months ago

Uhm... duh?

> or cannot control any of the inputs to the llm

Seeing as LLMs are non-deterministic, I think even this is not enough of a restriction: a model can leak or misuse sensitive data without any attacker-controlled input at all.