drewbug01 | 3 months ago
Even if the OP initially asked for a “professional” application, this is hardly a “gotcha” situation - our tools should do what we ask!
I’m sure we could come up with some realistic exceptions, but let’s not waste our words on them: this is a pretty benign thing and I cannot believe we are normalizing the use of tools which do not obey our whims.
Eisenstein | 3 months ago
If it were possible for a gun to refuse to shoot an innocent person then it should do that.
It just so happens that LLMs aren't great at making good decisions right now, but that doesn't mean a tool that *were* capable of making good decisions shouldn't be allowed to.
pksebben | 3 months ago
If you define the behavior of the system in an immutable fashion, it ought to serve as a guardrail to prevent anyone (yourself included) from fucking it up.
I want Claude to tell me to go fly a kite if I ask it to do something antithetical to the initially stated mission. Mixing concerns is how you end up spending time and effort trying to figure out why 2+2 seems to also equal 2 + "" + true + 1
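
(For anyone who hasn't hit this one: the quip above is real JavaScript implicit-coercion behavior, easily checked in Node.)

```javascript
// Plain arithmetic behaves as expected.
console.log(2 + 2);             // → 4

// Adding "" coerces the running value to a string, so every
// subsequent + becomes string concatenation, not addition.
console.log(2 + "" + true + 1); // → "2true1"
```

The point being made: once types (or concerns) start silently mixing, each individual step still "works," but the overall result is nothing like what the original expression promised.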