top | item 41929639

rational_indian | 1 year ago

Does not make sense. I think the chatbot is a red herring.

akomtu | 1 year ago

I bet he had some problems in real life, a new school he couldn't fit into or something similar, and the AI gave him the illusion of a better world he could escape into.

What caught my attention is how he did it: by suicide, to join his imaginary friends. This exact method appears among people who genuinely believe in demons. If you read the relevant stories, you'll notice the same pattern: a victim becomes obsessed with an imaginary friend, a demon, who quickly turns very controlling and finally demands suicide so the victim can move into its world. Perhaps the AI was trained on those stories?

rational_indian | 1 year ago

Yeah, that implies the chatbot at some point said he could unite with it after death. There's no evidence of that in the article.