item 45017784

cbluth | 6 months ago

I know someone who has "resurrected" a dead relative through an LLM, and I've seen the nonsense on forums about dating an "AI boyfriend"...

Some people can't help themselves and don't use these tools appropriately.

discuss


MisterTea | 6 months ago

My friend admitted to having political and religious arguments with ChatGPT. They have mental health issues, which contributed.

beacon473 | 6 months ago

What's wrong with using an LLM to learn about politics and religion?

I've found Claude to be an excellent tool to facilitate introspective psychoanalysis. Unlike most human therapists I've worked with, Claude will call me on my shit and won't be talked into agreeing with my neurotic fantasies (if prompted correctly).

ramesh31 | 6 months ago

> My friend admitted to having political and religious arguments with ChatGPT. They have mental health issues, which contributed.

To be fair these are probably the same people who would have been having these conversations with squirrels in the park otherwise.

deadbabe | 6 months ago

Can you talk more about the resurrection? Did they train an LLM fine-tune on as much written content made by that person as possible?

nullc | 6 months ago

That might actually be interesting if there were enough content, something like the "beta-level" AIs in Alastair Reynolds' Revelation Space books.

But that isn't what I've seen done when people said they did that. Instead they just told ChatGPT a bit about the person and asked it to playact. The result was nothing like the person-- just the same pathetic ChatGPT persona, but in their confusion, grief, and vulnerability they thought it was a recreation of the deceased person.

A particularly shocking and public example is the Jim Acosta interview of a simulacrum of a Parkland shooting victim.

dingnuts | 6 months ago

the good news is that this shit isn't sustainable. when investor funtime subsidies end, ain't nobody spending $2000/mo for an AI boyfriend.

"but it's getting cheaper and cheaper to run inference" they say. To which I say:

ok bud sure thing, this isn't a technology governed by Moore's law. We'll see.

fullshark | 6 months ago

Mental health issues in the population are never going away, people using software tools to prey on those with issues is never going away. Arguably the entire consumer software industry preys on addiction and poor impulse control already.

valbaca | 6 months ago

> ain't nobody spending $2000/mo for an AI boyfriend

you haven't met the Whales (big spenders) in the loneliness-epidemic industry (e.g. OnlyFans and the like)

BoredPositron | 6 months ago

When I was at a game studio for a big MMORPG, I had the valuable experience of sitting next to the monetization team. It was a third-rate MMO with gacha mechanics; our whales spent 20-30k every month... for years.

nullc | 6 months ago

It doesn't require a particularly powerful AI because the human's own hope is doing the heavy lifting. 70B models run just fine on hardware you can have sitting under your desk.

tokai | 6 months ago

People have spent much more on pig-butchering scam boyfriends that don't even exist. I bet you could get some people to pay quite a lot to keep what they see as their significant other alive.

deadbabe | 6 months ago

$2000? I see $4000/month minimum, roughly equivalent to what some typical Wall Street data feeds are priced at. It’s big business.