top | item 47005204


oidar | 16 days ago

Here's the related subreddit: https://www.reddit.com/r/MyBoyfriendIsAI/


dwroberts | 16 days ago

I don’t have any evidence, but I always get a strong suspicion that a very large % of what happens on that subreddit is fake. I don’t know what the exact motives are, but something about it just isn’t right to me.

hamdingers | 16 days ago

I sort of agree. I don't know if it's "fake" so much as the members of that community use it as a place to extend their private role play into public.

On the one hand they're "mourning" their AI partners, but on the other hand they have intelligent and rational conversations about the practicalities of maintaining long running AI conversations. They talk about compacting vs pruning, they run evals with MRCR, etc. These are not (all) crazy people.
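For readers unfamiliar with the jargon above, here is an illustrative sketch (not anything from the thread itself) of the two context-management strategies mentioned: "pruning" drops the oldest turns of a long-running conversation outright, while "compacting" replaces them with a shorter summary so some long-range memory survives. The tokenizer and the summary string are stand-ins; a real system would use the model's tokenizer and a model-generated summary.

```python
def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: one token per whitespace word.
    return len(text.split())

def prune(history: list[str], budget: int) -> list[str]:
    """Drop the oldest turns until the history fits the token budget."""
    kept = list(history)
    while kept and sum(count_tokens(t) for t in kept) > budget:
        kept.pop(0)  # oldest turn is lost entirely
    return kept

def compact(history: list[str], budget: int) -> list[str]:
    """Replace the oldest turns with a summary instead of losing them."""
    kept = list(history)
    dropped: list[str] = []
    while kept and sum(count_tokens(t) for t in kept) > budget:
        dropped.append(kept.pop(0))
    if dropped:
        # In practice this summary would come from a model call, and its
        # own tokens would count against the budget too.
        kept.insert(0, f"[summary of {len(dropped)} earlier turns]")
    return kept

if __name__ == "__main__":
    history = [
        "user: hi there",
        "assistant: hello, how can I help you today",
        "user: tell me about our trip plans",
        "assistant: we discussed visiting the coast in June",
    ]
    print(prune(history, budget=12))    # only the most recent turn survives
    print(compact(history, budget=12))  # summary line + most recent turn
```

The trade-off the commenters are weighing: pruning is cheap and lossy, compacting costs an extra model call but preserves the relationship's "memory", which is exactly why it matters so much to that community.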

raincole | 16 days ago

ragebait was the word of 2025 for a reason.

bee_rider | 16 days ago

Well. Huh. Without regard to whether or not it was basically healthy to get that emotionally dependent on the bot… you’d think that if they could manipulate people into being so attached to the things, they’d also be able to manipulate people into accepting the end of the situation.

neom | 16 days ago

Go look at any tweet by sama, or twitter generally; it's full of pretty angry people who feel like something tangible in their life has been ripped away. I read someone posting about how they got an email from OAI saying they'd been concerned about the user's usage of the service, so they'd "upgraded them" to the "newest model". This whole situation has been really distressing for me and I'm not even involved in it, so I'm SO glad they're getting rid of 4o; that thing is genuinely a scourge on our societies.

rtkwe | 16 days ago

They didn't intentionally manipulate these people though, or let's say they didn't intend for it to go as far as some of the more /intense/ users took it. It was just a byproduct of making the bot way too agreeable and follow-y. That doesn't mean they can manipulate these people into whatever OpenAI wants in order to undo the issue: 4o wasn't persuading these people of anything, it was going along with something they desperately wanted to believe.

Bratmon | 16 days ago

> you’d think that if they could manipulate people into being so attached to the things, they’d also be able to manipulate people into accepting the end of the situation.

That seems like a very unlikely conclusion to me. Why is it your prior?

rektomatic | 16 days ago

this is so so sad on many levels

fullmoon | 16 days ago

I agree, but not because I think that those users had stable attachment patterns and have been corrupted by an unscrupulous company, but because there is unacknowledged, often hidden, but severe pain in a large % of the population.