Cypher | 7 months ago

The chatbot saved our lives. Without someone to talk to and help us understand our abusive relationship, we'd still be trapped and on the verge of suicide.

ffsm8 | 7 months ago

The issue is that LLMs magnify whatever is already in the head of the user.

I obviously cannot speak to your specific situation, but on average there are going to be more people who convince themselves they're in an abusive relationship than people who actually are.

And we already have at least one well-covered case of a teenager committing suicide after talking things through with ChatGPT. Likely countless more, but it's ultimately hard for everyone involved to publicize such things.

padolsey | 7 months ago

Entirely anecdotally, of course, but I find that therapists often over-bias toward formal diagnoses. This makes sense, but it can lead the patient to form a kind of self-obsessive, over-diagnostic meta-mindset where everything is a function of trauma and fundamental neurological ailments rather than normal reactions to hard situations. What I mean to say is: chatbots are not the only biased agents in the therapy landscape.