
naIak | 3 months ago

God forbid people ask a chat bot for things and receive what they ask for. We need to put a stop to this. Only American bigcorp speak allowed.


nutjob2 | 3 months ago

So having an LLM enable the planning and execution of a murder is ok?

Are the makers of the LLM accessories to the crime?

sxzygz | 3 months ago

As a user of this platform, you're a beneficiary of Section 230 protections.

I think it's reasonable for LLM providers to have similar protections, especially when the questionable content exists only because you requested it.

rjdj377dhabsn | 3 months ago

> So having an LLM enable the planning and execution of a murder is ok?

Yes.

> Are the makers of the LLM accessories to the crime?

No.