top | item 36042255

blurrypepe | 2 years ago

Your son wants to learn stuff and the computer helps him with that. People who want an AI girlfriend don't care about the knowledge and usefulness of ChatGPT; they just want to use it to feel less lonely. Not at all the same use case. Although I agree with you, it's kind of sad that a lot of people will probably use chatbots to make up for the lack of human contact in their life instead of using them to learn stuff and/or to fix their code/excel/etc.

exitb | 2 years ago

The weird thing is that, in my opinion, it can be quite human-like. When needed it can be understanding, compassionate, supportive, etc. Those are qualities that can provide a certain amount of companionship to people who lack it. One just has to understand that it's not a person - it doesn't experience anything outside of the chat.

You can ask it to come up with an idea for a dinner and a movie for the evening and then discuss them once they're finished. You know, things people do with actual girlfriends. I'm sure it will be more fun than a robotic selfie and a made-up story.

morpheuskafka | 2 years ago

So the question is, if you truly understand that it doesn’t come from a “mind” with its own experience or feelings, then would you actually feel supported by something it says?

For me, I think the answer would be no, and I think the more believable it gets, the more desperate people will be to believe that it is more than just a text generator.