top | item 47188447

blhack | 2 days ago

Right. Did the DoW ask for that? Or does Anthropic make a product that does that?

nilkn|2 days ago

Obviously Anthropic does make a product that could do that -- just give Claude classified data and ask it who to target.

Obviously the military wants to use it for that purpose since they couldn't accept Anthropic's extremely limited terms.

One can easily and immediately infer the answers to both your questions are yes.

blhack|2 days ago

The DoW has explicitly said they don’t want this, and what you are describing is not automated kill drones.

Anthropic’s safeguards already prevent what you are describing — again, the thing the DoW has said they don’t want.

ImPostingOnHN|2 days ago

The DoD is explicitly asking for those things, by forcing a contract renegotiation toward a contract that is identical in every way except that it removes the prohibition on those things.

If the DoD did not want those things, it would not be forcing a contract renegotiation to include them, at great cost to the government.

blhack|2 days ago

No, the DoW may be implicitly asking for those things.

That’s the point I’m trying to make here: Anthropic should just say the unsaid thing.

DoW asked for the following thing: $foo. We won’t give that to them.

sigmar|2 days ago

https://x.com/SeanParnellASW/status/2027072228777734474?s=20

Here's the Chief Pentagon Spokesman pointing to the same verbiage and reiterating that they won't agree to those terms of use.

blhack|2 days ago

The first sentence of that post is:

> The Department of War has no interest in using AI to conduct mass surveillance of Americans (which is illegal) nor do we want to use AI to develop autonomous weapons that operate without human involvement.

mcphage|2 days ago

I certainly wouldn’t give them the benefit of the doubt.

blhack|2 days ago

Then Anthropic should say: this is what the DoW has asked for, and we aren’t able to do it, or we don’t want to.