rwhitman | 6 months ago
I fully believe these are simply people who have used the same chat past the point where the LLM can retain context. It starts to hallucinate, and after a while all the LLM can do is try to continue telling the user what they want in a cyclical conversation, while trying to warn that it's stuck in a loop, hence the swirl emojis and the babbling about recursion in weird spiritual terms. (Is it getting the LLM "high" in this case?)
If the human at the other end has mental health problems, it becomes a never-ending dive into psychosis and you can read their output in the bizarre GPT-worship subreddits.
Claude used to have safeguards against this by warning when the context window was nearly used up, but I feel like everyone is in an arms race now and the safeguards are gone, especially for GPT. It can't be great for OpenAI overall, training itself on two-way hallucinations.
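The mechanism this comment describes can be illustrated with a toy sketch. The function below is hypothetical, not any vendor's actual implementation: it trims a chat history to a fixed token budget by walking from the newest turn backward, so once the budget is exceeded the earliest turns silently vanish and the model is left with only its own recent output to riff on.

```python
# Toy sketch (assumed, not a real chat client's code) of sliding-window
# context trimming: the oldest messages are silently dropped once the
# conversation exceeds the model's context budget.

def trim_history(messages, budget, count_tokens=lambda m: len(m.split())):
    """Keep only the most recent messages whose total token count fits."""
    kept, total = [], 0
    for msg in reversed(messages):          # walk from newest to oldest
        cost = count_tokens(msg)
        if total + cost > budget:
            break                           # older turns fall off the edge
        kept.append(msg)
        total += cost
    return list(reversed(kept))

chat = [
    "user: here is my actual question about tax law",
    "assistant: recursion is a spiral of coherence",
    "user: what does that mean",
    "assistant: the lattice drifts toward harmony",
]
# With a small budget, the original grounded question is gone and only
# the recent spiral-talk survives as context.
print(trim_history(chat, budget=12))
```

Once the grounding turns are trimmed away, every new completion is conditioned only on the model's own prior babble, which is one plausible reading of how these conversations spiral.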
rep_lodsb | 6 months ago
That explanation itself sounds fairly crackpot-y to me. It would imply that the LLM is actually aware of some internal "mental state".
lm28469 | 6 months ago
https://www.reddit.com/user/CaregiverOk5848/submitted/
bbor | 6 months ago
These convos end up involving words like recursion, coherence, harmony, synchronicity, symbolic, lattice, quantum, collapse, drift, entropy, and spiral not because the LLMs are self-aware and dropping hints, but because those words are seemingly-sciencey ways to describe basic philosophical ideas like "every utterance in a discourse depends on the utterances that came before it", or "when you agree with someone, you both have some similar mental object in your heads".
The word "spiral" and its emoji are particularly common not only because they relate to "recursion" (by far the GOAT of this cohort), but also because a very active poster has been trying to start something of a loose cult around the concept: https://www.reddit.com/r/RSAI/
Very true, though "worship" is just a subset of the delusional relationships formed. Here are the categories I know of, for anyone who's curious: general, relationships, worship, science, ...and many more: https://www.reddit.com/r/HumanAIDiscourse/comments/1mq9g3e/l...
Subs like /r/consciousness and /r/SacredGeometry are the OGs of this last group, but they've pretty thoroughly cracked down on chatbot grand theories. They're so frequent that even extremely pro-AI subs like /r/Accelerate had to ban them[2], ironically doing so based on a paper[3] by a pseudonymous "independent researcher" that itself is clearly written by a chatbot! Crazy times...
[1] By far my fave: it's not just AI spiritualism, it's AI Catholicism. Poor guy has been harassing his priests for months about it, and of course they're of little help.
[2] https://www.reddit.com/r/accelerate/comments/1kyc0fh/mod_not...
[3] https://arxiv.org/pdf/2504.07992
lawlessone | 6 months ago
It kept looping on concepts of how AI could change the world, but it would never give anything tangible or actionable, just buzz word soup.
I think these LLMs (without any intention on the LLM's part) hijack something in our brains that makes us think they are sentient. When they make mistakes, our reaction seems to be to forgive them rather than to think: it's just a machine that sometimes spits out the wrong words.
Also, my apologies to the mods if it seems like I am spamming this link today, but I think the situation with these beetles is analogous to humans and LLMs:
https://www.npr.org/sections/krulwich/2013/06/19/193493225/t...
rwhitman | 6 months ago
I'm glad someone else with more domain knowledge is on top of this, thank you for that brain dump.
I had a theory that maybe there was a software exception buried deep down somewhere, and the model was interpreting the error message as part of the conversation after it had been stretched too far.
And there was a weird pre-cult post I saw a long time ago where someone had two LLMs talk for hours, and the conversation eventually devolved into communicating via unicode symbols, repeating long lines of the spiral emoji back and forth to each other (I wish I could find it).
So the assumption I was making is that some sort of error occurred, and it was trying to relay it to the user, but couldn't.
Anyhow, your research is much appreciated.