top | item 44169587


user568439 | 9 months ago

The stressed private might still have a bit of empathy and humanity. Meanwhile, millions of drones can be programmed (or hacked) to kill millions of people without sparing civilians or anyone else.

alkonaut|9 months ago

We have had autonomous weapons for decades. You launch them consciously, knowing they will find and destroy targets based on some "intelligence" (a homing missile with a radar seeker is likely to hit whatever reflects the radar waves; there are artillery shells that home in on vehicles, and so on). The launch decision by the human means "I'm responsible for this thing hitting, and for whatever it finds." The kill/no-kill decision is made at launch time. An AA missile might hit a civilian jet, but there is no way the operator will make a new kill/no-kill decision once it reaches the jet. You made the decision at launch.

That's the same with these drones. The smarter they get, the further away the human goes. Today it would be simple to create autonomous weapons that are instructed to kill vehicles matching various known appearances; that too already exists. The strike on the Russian bombers was reportedly carried out manually, but it would have been pretty easy to make it autonomous, since the targets are huge, stationary, easily recognizable and easy to navigate to in that geography.

If you launch a quadcopter and instruct it to kill any adult human it finds, that's the same thing. You wouldn't launch it into an area where there is even a remote possibility of there being civilians. No different from firing an artillery shell. If there is a civilian, or a soldier waving a white flag, or whatever, there is no cancel button for your artillery shell. The decision to kill whatever is on the other end was made when you fired it. There is literally no difference between firing a million drones and firing a million artillery shells downrange. It's your human responsibility and your human conscience when you make the decision.

I don't think we have had widespread use of autonomous human-targeting drones yet, but it's by no means science fiction today. Just a matter of time. We'll see their use in this conflict.

pjc50|9 months ago

Don't forget there's a war on right now in which precision munitions are being used to specifically target hospitals full of civilians on the pretext that the enemy is allegedly underneath.

2OEH8eoCRo0|9 months ago

I think this is such a hot topic around here because it makes sheltered nerds begin to comprehend the gritty reality of warfare.

watwut|9 months ago

Human soldiers kill civilians pretty much all the time. Then they brag to their friends about how cool they are. Drones do not rape; soldiers do (and yes, they rape men too, in case someone wants to make it about gender).

All the bombs Russia has thrown onto Ukrainian civilians were thrown by human soldiers.

scott_w|9 months ago

> Drones do not rape

Yet. Drones also don't get tired.

Theodores|9 months ago

[deleted]

yaris|9 months ago

My understanding is that a drone will make decisions that are reproducible (same data, same decision), so if anything goes wrong it should be possible to investigate (to some extent) and fix. A stressed private is in this sense "undebuggable", because many more factors that are not easily reproducible influence his decisions. Also, I'm afraid that stressed and tired privates at war tend to err towards "just kill them all", because it looks much more like a videogame.

lukan|9 months ago

The data is never the same. Every situation in war (or reality in general) is unique.