Military and Killer Robots

25 Nov 2024

The advent of autonomous weapons is often described as the third revolution in warfare, after gunpowder and nuclear weapons.
The spread and use of gunpowder and nuclear weapons have radically changed how conflicts are fought and experienced by combatants and civilians alike.

Technological advances now allow weapons systems to identify and attack targets autonomously using sensor processing. This reduces human control over what happens and why, bringing us closer to machines deciding whom to kill and what to destroy.

Autonomous weapons lack the human judgment needed to assess the proportionality of an attack, distinguish civilians from combatants, and adhere to other basic principles of the laws of war.
History shows that the use of such weapons will not remain limited to specific circumstances. It is also unclear who, if anyone, could be held liable for unlawful acts caused by an autonomous weapon—the programmer, the manufacturer, the commander, or the machine itself—creating a serious accountability gap.

Some types of autonomous weapons will process data and operate at tremendous speeds. These systems—complex, unpredictable, and incredibly fast—have the potential to send armed conflicts spiraling out of control, fueling regional and global instability. Killer robots inherently lack the ability to empathize, grasp nuance, or understand context.

That’s why Stop Killer Robots works with military veterans, technologists, scientists, roboticists, and civil society organizations around the world to ensure meaningful human control over the use of force. We call for new international law because laws that ban and regulate weapons draw the boundary for governments, militaries, and corporations between what is acceptable and what is not.