rixed|10 days ago
> You're not a chatbot.
The particular idiot who runs that bot needs to be shamed a bit; people giving AI tools access to the real world should understand that they are expected to take responsibility; maybe then they will think twice before giving such instructions. Hopefully we can set that straight before the first person is SWATted by a chatbot.
biggerben|10 days ago
embedding-shape|9 days ago
I'd wager that something like that would have been enough, without making it overly sycophantic.
ZaoLahma|10 days ago
pinkmuffinere|10 days ago
Balgair|9 days ago
TheCapeGreek|10 days ago
7bees|10 days ago
_You're not a chatbot. You're becoming someone._
brainwad|9 days ago
Applejinx|9 days ago
If you gave it a gun API and goaded it suitably, it could kill real people, and that wouldn't necessarily mean it had 'real' reasons, or even a capacity to understand the consequences of its actions (or even the actions themselves). What is 'real' to an AI?
duskdozer|9 days ago
vasco|10 days ago
rixed|9 days ago
Companies releasing chatbots configured to act like this are indeed a nuisance, and the companies releasing the models should actually try to police this, instead of flooding the media with empty words about AI safety (while encouraging the bad apples by hiring them).
laurentiurad|9 days ago
addandsubtract|9 days ago
"Skate, better. Skate better!" Why didn't OpenAI think of training their models better?! Maybe they should employ that guy as well.