Center for Strategic Assessment and Forecasts

Autonomous non-profit organization

Autonomous weapons systems: questions and answers
Publication date: 08-07-2018
Technical progress in armaments means that decisions on the use of force on the battlefield may increasingly be taken by machines operating without human intervention. Here we consider the possible consequences of such a profound change in the conduct of war, and caution against the use of such weapons if respect for international humanitarian law cannot be guaranteed.

How could autonomous weapons systems, operating independently, distinguish between combatants and civilians? Would they be able to cancel an attack that might cause disproportionate harm to civilians? And who would bear responsibility for violations of international humanitarian law?

Because many of these questions have no answer, the ICRC has called upon States to properly assess the potential humanitarian consequences, and the challenges to international humanitarian law, that the use of these new military technologies may entail. In March 2014, the ICRC convened an international expert meeting to intensify discussion of these issues.

What are autonomous weapons?

Autonomous weapons systems (also known as "lethal autonomous weapons" or "killer robots") search for, identify and attack targets on their own, without human intervention. Today, few weapons systems exercise autonomy in such critical functions as the detection and engagement of targets. For example, some defensive weapons systems have autonomous modes for intercepting incoming guided and unguided missiles, artillery shells and enemy aircraft at close range.

These systems are generally stationary and operate autonomously only for short periods, in strictly defined circumstances (for example, where relatively few civilians and civilian objects are present) and against a limited range of target types (mainly munitions or vehicles). In the future, however, autonomous weapons systems might operate outside such tightly fixed spatial and temporal boundaries, encountering a variety of rapidly changing circumstances and perhaps even selecting human beings as targets.

Are unmanned aerial vehicles (UAVs) autonomous weapons?

Autonomous weapons systems open fire without human intervention, in contrast to the unmanned aerial systems in use today (also called drones or remotely piloted aircraft). UAVs may have other autonomous functions (such as autopilot and navigation), but they require human operators to select targets and to activate, aim and fire the weapons mounted on them.

There have been calls for a moratorium or ban on the development, production and use of autonomous weapons systems. Does the ICRC support these calls?

The ICRC has not yet joined these calls. However, the ICRC urges States to consider the fundamental legal and ethical questions raised by the use of autonomous weapons systems before they are further developed or deployed in armed conflict, as international humanitarian law requires. The ICRC is concerned about the potential humanitarian consequences of autonomous weapons systems, and about whether they can be used at all without violating international humanitarian law.

What does international humanitarian law say about autonomous weapons systems?

There are no specific rules on autonomous weapons systems. However, the law provides that States must determine whether the use of any new weapon, means or method of warfare that they develop or acquire would be prohibited by international law in some or all circumstances. This requirement is contained in Additional Protocol I to the Geneva Conventions.

In other words, the long-established norms of international humanitarian law governing the conduct of hostilities, in particular the principles of distinction, proportionality and precautions in attack, apply to all new weapons and technological innovations in warfare, including autonomous weapons systems. Legal review of new weapons is therefore of critical importance in the development of new military technologies.

The most difficult task for any country developing or acquiring an autonomous weapons system is to ensure that it can operate in accordance with all of these principles. For example, it is not clear how such a weapon could distinguish between a civilian and a combatant, as the principle of distinction requires. Moreover, such a weapon might even have to distinguish between combatants who are actively engaged in hostilities and those who are hors de combat or surrendering, and between civilians taking a direct part in hostilities and armed civilians, such as law enforcement officers or hunters, who are protected by law and may not be attacked.

Autonomous weapons systems would also need to comply with the principle of proportionality, under which the incidental civilian casualties expected from an attack on a military objective must not be excessive in relation to the concrete and direct military advantage anticipated. Finally, an autonomous weapons system would need to operate in such a way that the precautions in attack required to minimize civilian casualties can be taken.

Judging by existing technologies and those foreseeable in the near future, it is unlikely that a machine with such decision-making capabilities can be built. There are therefore serious doubts today about the ability of autonomous weapons to respect international humanitarian law, except in some very narrow circumstances and the simplest of environments.

What could be the consequences of using autonomous weapons systems in armed conflict?

Some proponents of autonomous weapons systems argue that they could be programmed to operate more cautiously and accurately than humans, and could therefore be used to reduce incidental civilian casualties. Their opponents counter that autonomous weapons systems will never possess the human judgment necessary for the lawful use of force, and that their use would be likely to aggravate the humanitarian consequences.

These weapons systems force us to ponder serious moral questions, and their widespread deployment would radically change existing views on the conduct of hostilities. The central question for us all is whether the principles of humanity and the dictates of public conscience can permit machines to make decisions about who should live and who should die.

Who would be held liable if the use of an autonomous weapons system violated international humanitarian law?

An autonomous weapons system is a machine and cannot itself be held accountable for violations of international humanitarian law. This raises a question that goes beyond the responsibility of those who deploy the system: who would bear legal responsibility if the operation of an autonomous weapons system resulted in a war crime – the engineer, the programmer, the manufacturer or the commander who activated the weapon? If responsibility cannot be established as international humanitarian law requires, can the deployment of such a system be considered lawful or ethically defensible?

What should be the main issues in future discussions among States?

Increasing autonomy carries the risk that decisions will be made by machines instead of people, gradually weakening human control over the use of force. Even if it is accepted that ultimate control will remain with humans, a more detailed discussion is needed of what constitutes sufficient, effective or appropriate human control over the use of force.

The ICRC has encouraged States to assess the autonomy in the "critical functions" of existing and emerging weapons systems, and to share this information in order to build a deeper understanding of the problem. Future discussions will need to address the central question: at what point, and in what circumstances, is effective human control over the use of force lost?

Because many of these questions remain unanswered, the ICRC urges States not to use autonomous weapons systems if respect for international humanitarian law cannot be ensured.
