(no title)
beloch | 4 days ago
The IDF used tools like Lavender and "Where's Daddy?" to analyze communications, identify suspected members of Hamas, and learn when they were home so they could be killed in airstrikes.
This has long been possible for humans to do, but it's a laborious process. In the past, only people high up in a chain of command received such bespoke treatment. AI tools let the IDF extend the same treatment to raw recruits whose entire training might have amounted to a pep talk and a pistol.
The result was widespread destruction and the indiscriminate killing of civilians. The IDF didn't spend much time scrutinizing AI recommendations and was willing to act on false positives. Every strike took out civilians by design: the whole purpose of "Where's Daddy?" was to determine when targets were in their family homes. Just taking a pizza order from a Hamas member years before the war might have been enough to get an entire family and their neighbours killed.
If humans hate another group of humans enough and an AI says "Kill", they'll kill, without thought or remorse. We don't merely need to worry about murderous robots on battlefields; we also need to worry about humans implementing the recommendations of AI without thinking for themselves.