top | item 45106916

jonathanlb | 6 months ago

Given that this is in response to a ChatGPT user who killed his mother and then himself, I'm not sure that positioning your product as being more secure than ChatGPT is wise, because your marketing here suggests either:

1. Profound tone-deafness about appropriate contexts for privacy messaging

2. Intentional targeting of users who want to avoid safety interventions

3. A fundamental misunderstanding of your ethical obligations as an AI provider

None of these interpretations reflect well on AgentSea's judgment or values.

kbelder | 6 months ago

I disagree. The fact that crimes committed by a mentally ill person are going to be used as justification for surveillance of the wider population of users is a strong ethical reason to advocate for more security.

VonGuard | 6 months ago

Yeah, it'd be terrible if all our emails, DNS queries, purchase histories, messages, Facebook posts, Google searches, in-store purchases, and driving and GPS info were being tracked, cataloged, and sold to anyone who wants it! Why, people would never stand for such surveillance!

Anyone with half a brain complaining about hypothetical future privacy violations on some random platform just makes me spit milk out my nose. What privacy?! Privacy no longer exists, and worrying that your chat logs are gonna get sent to the authorities seems to me like worrying that the cops are gonna give you a parking ticket after your car blew up because you let the mechanic put a bomb in the engine.

sleazebreeze | 6 months ago

Or maybe I just want to be able to talk to an LLM without worrying about whether it's going to report me to the authorities.

lurking_swe | 6 months ago

that’s a good point, privacy is important.

To play devil's advocate for a second: what if someone who's mentally ill uses a local LLM for therapy and doesn't get the help they need, even if that help would be against their will? And they commit suicide or kill someone because the LLM said it's the right thing to do…

Is being dead better, or is having complete privacy better? Or does it depend?

I use local LLMs too, but it's disingenuous to act like they solve the _real_ problem here: mentally ill people trying to use an LLM for therapy. It can end catastrophically.

LamerBeeI | 6 months ago

I too think there should be no rules or attempts to de-risk any situation; just let us die.

exe34 | 6 months ago

Are you in America? Do you also support banning guns?