AI-guided military aircraft and vehicles have muddied the waters even further when it comes to responsibility for civilian deaths in war. Whereas before the fear was that a shift towards “video game” warfare would result in a detachment from killing, now the fear is that automated AI killing machines show no remorse on the battlefield, no human judgement, no crisis of conscience, and no ability to disobey orders that are clearly against accepted rules of warfare.
Who, then, accepts responsibility when the deaths of one hundred civilians are blamed on a glitch in the machine, or on incorrect parameters? Is there even a way to tell the difference between an error and an AI that is doing exactly what it was designed to do?
From an Amnesty International report, The Automation of War, 2018.
At five tons gross weight, the Reaper is four times heavier than the Predator … It can fly twice as fast and twice as high as the Predator. Most significantly, it carries many more weapons. […] “It’s not a recon squadron,” Col. Joe Guasella, operations chief for the Central Command’s air component, said of the Reapers. “It’s an attack squadron, with a lot more kinetic ability.”
“Kinetic,” Pentagon argot for destructive power, is what the Air Force had in mind when it christened its newest robot plane with a name associated with death.
“The name Reaper captures the lethal nature of this new weapon system,” Gen. T. Michael Moseley, Air Force chief of staff, said in announcing the name last September. […] The Reaper is expected to be flown, as the Predator is, by a two-member team of pilot and sensor operator who work at computer control stations and video screens that display what the UAV “sees.” Teams at Balad, housed in a hangar beside the runways, perform the takeoffs and landings, and similar teams at Nevada’s Creech Air Force Base, linked to the aircraft via satellite, take over for the long hours of overflying the Iraqi landscape.
Robot Air Attack Squadron Bound for Iraq, Associated Press, 15 July 2007.