Exactly. Stop fooling people into thinking there’s a human typing on the other side of the screen. LLMs should be incredibly useful productivity tools, not emotional support.
I think therapists in training, or people providing crisis intervention support, can train/practice using LLMs acting as patients going through various kinds of issues.
But people who need help should probably talk to real people.
I don't know why you're being downvoted. Denmark's health system is pretty good except for adult mental health care. SOTA LLMs are definitely approaching a stage where they could help.
The point the OP is making is that LLMs are not reliably able to provide safe and effective emotional support, as recent cases have shown. We're in uncharted territory, and before LLMs become emotional companions for people, we should better understand the risks and tradeoffs.
neilwilson|3 months ago
Then make more friends.