top | item 42941899


mkolodny | 1 year ago

A vague “stuff is happening behind closed doors” isn’t enough of a reason to build AI weapons. If you shared a specific weapon that could only be countered with AI weapons, that might make me feel differently. But right now I can’t imagine a reason we’d need or want robots to decide who to kill.

When people talk about AI being dangerous, or possibly bringing about the end of the world, I usually disagree. But AI weapons are obviously dangerous, and could easily get out of control. Their whole point is that they are out of control.

The issue isn’t that AI weapons are “evil”. It’s that value alignment isn’t a solved problem, and AI weapons could kill people we wouldn’t want them to kill.


nicr_22 | 1 year ago

Have a look at what explosive drones are doing in the fight for Ukraine.

Now tell me: how do you counter a thousand small, EMP-hardened autonomous drones intent on delivering an explosive payload to a single target without AI of some kind?

scottyah | 1 year ago

How about 30k drones coming off a shipping vessel in the Port of Los Angeles that start shooting at random people? Inserting a human into the loop (somehow rapidly waking up, moving, and logging in hundreds of people to make the kill/no-kill decision per target) would mean accepting far more casualties. What if some of the 30k drones were manned? The latest technology shortens the timeframes of battles so drastically that humans just can't keep up.

I guess a lot is missing in the semantics: is the AI specifically for targeting, or is a drone that uses AI to adapt to changes in wind speed considered an AI weapon?

At the end of the day though, the biggest use of AI in defense will always be information gathering and processing.

bamboozled | 1 year ago

> How about 30k drones come from a shipping vessel in the port of Los Angeles that start shooting at random people?

It's going to happen.

siltcakes | 1 year ago

I agree. I don't think there's really a case for the US developing any offensive weapons. Geographically, economically and politically, we are not under any sort of credible threat. Maybe AI based missile defense or something, but we already have a completely unjustified arsenal of offensive weapons and a history of using them amorally.

scottyah | 1 year ago

Without going too far into it: if we laid down all offensive weapons, the cartels in Mexico would be inside US borders and killing people within a day.

catlikesshrimp | 1 year ago

> Geographically, economically and politically, we are not under any sort of credible threat.

The US is already declining politically and economically. And its area of influence has been weakening since, what, the '90s?

It would be bad strategy to do nothing until you feel hopelessly threatened.

computerthings | 1 year ago

> AI weapons are obviously dangerous, and could easily get out of control.

The real danger is when they can't. When they, without hesitation or remorse, kill one person or millions with maximum efficiency, or "just" exist with that capability, to threaten people with such a fate. Unlike nuclear weapons, in a stalemate between superpowers they can also be turned inwards.

Using AI for defensive weapons is one thing, and maybe some of those would have to shoot explosives at other things to defend; but just going with "eh, we need ALL possible offensive capability to defend against ANY possible offensive capability" is not credible to me.

The threat scenario is supposed to be masses of enemy automated weapons, not huddled masses; so why isn't the objective to develop weapons that are really good at fighting automated weapons but literally can't/won't kill humans, because that would remain something only human soldiers do? Quite the elephant on the couch IMO.