top | item 43875173

telchior | 10 months ago

A lot of human empathy isn't real either. To take the most extreme example, narcissists use love bombing to build attachment. Salespeople use "relationship building" to make money. AI actually seems better than these -- it isn't building up to a rug pull (at least, not one we know of yet).

And it's getting worse year after year as our society gets more isolated. Look at trends in pig butchering scams, for instance: a lot of the victims are people so incredibly lonely and unhappy that they fall for the world's most obvious scam. AI is one of the few things that actually looks like it could help, so I think realistically it doesn't matter that it's not real empathy. At the same time, Sam Altman looks like the kind of guy who could be equally effective as a startup CEO or running a butchering op in Myanmar, so I hope like hell the market fragments more.

doright | 10 months ago

This is a good point: you can't be dependent on a chatbot in the same way you're dependent on someone you share a lease with. If people take up chatbots en masse, maybe it says more about how they perceive the risks of virtual or physical human interaction versus AI. Some of the people I've met in the past make the most sycophantic AIs look tame by comparison. When you come back to a chatbot after that, you're reminded that this is all just a bunch of text.

I treat AIs dispassionately, like a secretary I can give infinite amounts of work to without worrying about them throwing their hands up. That mindset is not conducive to developing any feelings. With humans you need empathy to avoid burdening them with excessive demands. If it comes down solely to getting work done (and not building friendships or professional relationships, etc.), then that need to restrain your demands is a limitation of human biology that AIs circumvent for specific workloads.