Killer drones — or “slaughterbots” — are already conducting airstrikes without any humans involved in the decision making, according to a recent UN report. To be clear, that's not just the piloting but the decision making itself: computers are deciding whom to strike.
And that should have us really worried, a group of researchers argue in a guest post for IEEE Spectrum. “In so many words, the red line of autonomous targeting of humans has now been crossed,” the team writes.
The researchers argue that the use of lethal autonomous weapon systems should cease immediately, and that nations around the world should sign a treaty to ensure these killer robots are never used again.
At a minimum, they argue, their use offers a legitimate basis for sanctions, arrest, and indictment.