MrNeon | 2 years ago

Is a farmer exploiting somebody's need for food?


suoduandao3 | 2 years ago

Maybe, depending on what he's growing? Some foodstuffs have better nutritional content than others; intimacy hawkers surely vary in the same way.

I wonder, though: would an AI vendor sell better or worse intimacy? ChatGPT apparently has a better bedside manner than something like 80% of actual physicians. Granted, giving comfort isn't supposed to be part of their job, but why would a human OnlyFans model with other customers be better than an AI adapted to only one customer?

almatabata | 2 years ago

The original comment said:

"So how many more rounds of this cycle do we need before we leverage the letter of the law to say that maybe companies shouldn't be allowed to blatantly exploit people's vulnerability and isolation to make money?"

To which another answered: "Does that include p0rn?"

I thought the question was good but deserved a bit more exposition. If you believe that creating an AI boyfriend/girlfriend to make money is unethical, then in my opinion you should ask yourself why it is not unethical for an OnlyFans model to sell companionship.

Regarding your point, "Is a farmer exploiting somebody's need for food?": I would say there is a key difference between the two scenarios. In the case of farmers, the food they grow is the healthiest way to avoid starving. In contrast, you could argue that an AI boyfriend/girlfriend is not the healthiest cure for loneliness. Wouldn't interacting with a real person lead to better character development, because you would have to work on your own imperfections and learn to accept other people's shortcomings?

ToucanLoucan | 2 years ago

> If you believe that creating an AI boyfriend/girlfriend to make money is unethical, then in my opinion you should ask yourself why it is not unethical for an OnlyFans model to sell companionship.

On some level there's something inherently icky to me, with an ethical coloring to it, about creating something that emulates intelligence, even poorly, and then "assigning" it a romantic interest in a person. I can't quite adequately explain it, but for me it's something around consent. At what point does simulating consciousness begin approaching it? The machine doesn't and can't consent to intimate interactions, but its sole reason to exist and continue existing, in whatever sense you'd like to say it exists at all in the way something intelligent does, is to facilitate those interactions. It's something about artificial life, even flagrantly fake life, being created solely to serve the purposes of another that just... rubs me the wrong way.

By contrast, a creator or what have you who's serving in some sex-worker-or-adjacent role is consenting. That consent is muddled by the financial aspect, and the argument can be made that such consent is inherently less valid, because as long as you need money to live, the offer of money is inherently coercive. I don't know that I agree with that; I'm just saying it is an argument that can be made. Nevertheless, it is a fully realized being that is participating, to whatever degree you want to say they are, voluntarily, and that participation and consent can be revoked if the client becomes too... abusive or combative, or strays into uncomfortable subject matter, which makes it distinct from the AI.