mkolodny | 1 year ago
When people talk about AI being dangerous, or possibly bringing about the end of the world, I usually disagree. But AI weapons are obviously dangerous, and could easily get out of control. Their whole point is that they are out of control.
The issue isn’t that AI weapons are “evil”. It’s that value alignment isn’t a solved problem, and AI weapons could kill people we wouldn’t want them to kill.
nicr_22 | 1 year ago
Now tell me how you counter a thousand small, EMP-hardened autonomous drones intent on delivering an explosive payload to one target without AI of some kind?
scottyah | 1 year ago
I guess a lot hinges on semantics: is the AI specifically for targeting, or is a drone that uses AI to adapt to changes in wind speed also considered an AI weapon?
At the end of the day though, the biggest use of AI in defense will always be information gathering and processing.
bamboozled | 1 year ago
It's going to happen.
catlikesshrimp | 1 year ago
The US is already declining politically and economically, and its sphere of influence has been weakening since, what, the '90s?
It would be bad strategy to do nothing until you feel hopelessly threatened.
computerthings|1 year ago
The real danger is when they can't. When they, without hesitation or remorse, kill one or millions of people with maximum efficiency, or "just" exist with that capability, to threaten them with such a fate. Unlike nuclear weapons, in case of a stalemate between superpowers they can also be turned inwards.
Using AI for defensive weapons is one thing, and maybe some of those would have to shoot explosives at other things to defend; but just going with "eh, we need ALL possible offensive capability to defend against ANY possible offensive capability" is not credible to me.
The threat scenario is supposed to be masses of enemy automated weapons, not huddled masses; so why isn't the objective to develop weapons that are really good at fighting automated weapons, but literally can't/won't kill humans, because that would remain something only human soldiers do? Quite the elephant on the couch IMO.