Well, in this case, I think since people are killing themselves after talking to the AI, it is people killing people. The AI company and the AI kill no one, so surely they must not be responsible for this at all.
“responsibility” isn’t a boolean, at least in this human’s experience.
there are different degrees of responsibility (and accountability) for everyone involved. some are smaller, some are larger. but everyone shares some responsibility, even if it is infinitesimally small.
And more generally, people kill people. And people help people.
Tools are tools; it is what we make of them that matters. AI on its own has no intentions, but questions like these feed the belief that AGI already exists, with an agenda of its own, waiting to build terminators.
embedding-shape|2 months ago
dijksterhuis|2 months ago
lukan|2 months ago
0xbadcafebee|2 months ago
esafak|2 months ago
hk1337|2 months ago