
anonymous_sorry | 4 months ago

I mean, for one thing, a commercial LLM exists as a product designed to make a profit. It can be improved, otherwise modified, restricted or legally terminated.

And "lying" to it is not morally equivalent to lying to a human.


lxgr|4 months ago

> And "lying" to it is not morally equivalent to lying to a human.

I never claimed as much.

This is probably a problem of definitions: To you, "lying" seems to require the entity being lied to being a moral subject.

I'd argue that it's enough for it to have some theory of mind (i.e. be capable of modeling "who knows/believes what" with at least some fidelity), and for the liar to intentionally obscure their true mental state from it.

commakozzi|4 months ago

I agree with you, and I would add that morals are not objective but rather subjective, which you alluded to by identifying a moral subject. Therefore, if you believe that lying is immoral, it does not matter whether you're lying to another person, to yourself, or to an inanimate object.

anonymous_sorry|4 months ago

So for me, it's not about being reductionist, but about not anthropomorphizing or using words which may suggest an inappropriate ethical or moral dimension to interactions with a piece of software.

HappMacDonald|4 months ago

Not from the perspective of "harm to those lied to", no. But from the perspective of "what the liar can expect as a consequence".

I can lie to a McDonalds cashier about what food I want, or I can lie to a kiosk... but in either circumstance I'll wind up being served the food that I asked for and didn't want, won't I?