Responsible ‘Killer Robots’?
Military operations rely increasingly on complex, intelligent combat systems, and international humanitarian law places limits on when such systems may be used. The researchers clarify what ‘meaningful human control’ means and provide general guidance for designing human responsibility into combat systems.
The objection that nobody can be held responsible for the actions of these systems is not justified; it is rather the risks involved that deserve more attention. Given that many weapon systems already incorporate autonomous elements, the researchers recommend an “e-partnership approach” to the design of all such systems, to ensure compliance with the law and to address issues of responsibility. In this approach, proactively shaping the partnership between humans and machines is made central to the design process.
What does such an e-partnership approach entail? In their policy paper “Robo-Wars: The Regulation of Robotic Weapons” the researchers offer manufacturers and the military the following suggestions:
- “Prioritising human oversight of and control over remote controlled and autonomous weapons;
- Ensuring operators are able to override the robot at any stage of its deployment;
- Putting in place adequate mechanisms to hold individuals responsible for the deployment of robotic weapons;
- Designing machine autonomy to enhance human decision-making, not substitute for it.”
To states, the researchers recommend ensuring that existing legal and ethical frameworks, such as humanitarian and human rights law, are taken into account already in the design phase of new military technologies. Enhanced compliance should be made an explicit goal during the design process, which requires periodic reviews of these technologies.
Responsibility gap?
One perceived problem with intelligent combat or robotic weapon systems is a responsibility gap. The fear is that these systems are so complex that nobody, not even the manufacturer or operator, can predict which targeting decisions will be taken and executed. This raises the question of who can be held responsible or liable for what happens on the battlefield.
“The problem of a responsibility gap is less severe than commonly assumed”, the researchers conclude. “The increased use of machine autonomy does not undermine a commitment to moral responsibility in the armed services.” The research team found that robotic weapon systems are rarely fully autonomous or fully operator-controlled.
Risks
More problematic than the issue of responsibility are the risks involved. States should invest in more, and more systematic, analysis of the technological, political and strategic risks of robotic weapon systems. For instance, it is not known how weaker states or non-state actors will respond when confronted with such superior weapon systems. They “may find it easier, and more effective,” the researchers note in their policy paper, “to detonate a primitive IED [Improvised Explosive Device] in an attack on a conventional military target (such as a convoy), or even against a civilian target, rather than pick a fight with an autonomous robotic weapon.” This would mean deliberately abandoning the distinction between combatants and civilians, which is undesirable from both an ethical and a political perspective.
Humanitarian Law
The project also yielded various insights into the legal aspects of autonomous weapon systems. Under international humanitarian law (IHL), belligerent parties are not allowed to intentionally target civilians (non-combatants). It would be very difficult to programme machines in a way that fulfils this criterion: a machine, once deployed, will find it hard to distinguish a child with a toy gun from an armed soldier. However, the researchers conclude, there are areas where civilians are hardly present, such as the high seas, deserts and outer space. There, the use of autonomous weaponry against non-human targets could potentially comply with international law.
What are robotic weapon systems?
Modern warfare is impossible without advanced, information-based decision support systems. Operational environments involve a range of such systems that shape the choices available to military personnel.
When linked to precision munitions and delivery systems, these complex information systems create unprecedented capabilities to control the delivery of military force. One example is Remotely Piloted Aircraft Systems (RPAS), popularly known as ‘drones.’
The project specifically looked at a future generation of Automated, Intelligent Combat and Decision Support Systems for Command and Control that are being developed by the Netherlands Navy, the Netherlands Defence Academy and the CAMS Force Vision team of the Dutch Ministry of Defence.
autonomous weapons, human control, robotic weapon systems, military technology, killer robots, humanitarian law, regulation, risks, combat systems, human combatants
Official project title: