The Truth About Killer Robots, a 2018 documentary made by Third Party Films and directed by Maxim Pozdorovkin, lays out the many ways automation could affect us in the long term, from labor to psychology to sexual encounters. In the context of international law, the issues under discussion relate to the use of unmanned technology during armed conflicts.
To date, twenty-six countries have called for an explicit ban that would require some form of human control over the use of force. However, prospects for an A.I. weapons ban are low, as several influential countries, including the United States, are unwilling to accept limits while the technology is still in development.

The following questions were raised by UN experts in a 2013 report:

…is it morally acceptable to delegate decisions about the use of lethal force to such [autonomous] systems?
If their use results in a war crime or serious human rights violation, who would be legally responsible?
If responsibility cannot be determined as required by international law, is it legal or ethical to deploy such systems?

What are killer robots?

That depends on whom you ask. Manufacturers of this technology would define a killer robot as a robot that can decide to use lethal force without human intervention. Human Rights Watch, however, broadens the definition to include any robot that can choose to use any type of force against a human, even if that force is not lethal. What is agreed is that all lethal autonomous weapons systems (LAWS) are already regulated by existing International Humanitarian Law (IHL). LAWS that cannot comply with IHL principles such as distinction (between civilians and combatants) and proportionality (an attack must not be excessive in relation to the concrete and direct military advantage anticipated) are already illegal.

The phrase “meaningful human control” has caused some debate among diplomats. A great deal of the discussion in the LAWS debate is about humans and the term “loops”, which can be explained as follows:

- Human “in the loop”: the robot makes decisions according to human-programmed rules, a human hits a confirm button, and the robot strikes.

- Human “on the loop”: the robot decides according to human-programmed rules, a human has time to hit an abort button, and if the abort button is not hit, the robot strikes.

- Human “off the loop”: the robot makes decisions according to human-programmed rules, the robot strikes, and a human reads a report a few seconds or minutes later.

Finally, there is “robot beyond the loop”, the configuration that raises the greatest concern. In this case, the robot decides according to rules it learns or creates itself, the robot strikes, and the robot may or may not let humans know.
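The distinctions above come down to where human authorization sits in the engagement control flow. A purely illustrative sketch (hypothetical names, not modeled on any real system) might look like this:

```python
from enum import Enum

class Mode(Enum):
    IN_THE_LOOP = 1   # human must confirm before any strike
    ON_THE_LOOP = 2   # human can abort within a time window
    OFF_THE_LOOP = 3  # human is only notified after the fact

def engage(target_approved_by_rules: bool, mode: Mode,
           human_confirms: bool = False, human_aborts: bool = False) -> bool:
    """Hypothetical decision flow for the three 'loop' configurations."""
    if not target_approved_by_rules:   # human-programmed rules reject the target
        return False
    if mode is Mode.IN_THE_LOOP:
        return human_confirms          # strike only on explicit confirmation
    if mode is Mode.ON_THE_LOOP:
        return not human_aborts        # strike unless a human aborts in time
    return True                        # off the loop: strike, then report
```

In the “robot beyond the loop” case, even `target_approved_by_rules` would be replaced by criteria the system learned or created itself, which is precisely why that configuration draws the greatest concern.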

What is the Convention on Conventional Weapons (CCW)?

Formally titled the “Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects,” the CCW has 120 high contracting parties (states parties), including all five permanent members of the UN Security Council.

The CCW was adopted in 1980 and contains five separate protocols covering various weapons and explosive remnants of war: non-detectable fragments, mines and booby traps, incendiary weapons, blinding lasers, and explosive remnants of war. The CCW also provided a useful incubator for efforts to address the humanitarian consequences of antipersonnel landmines in the 1990s and cluster munitions in the 2000s. A new Protocol VI banning “off the loop” LAWS might be one option.

The Case for and Against Killer Robots

There are already plenty of examples of how technology has changed warfare. For the military, war robots can have many advantages: They do not need food or pay, they do not get tired or need to sleep, they follow orders automatically, and they do not feel fear, anger, pain or remorse. Furthermore, no one would mourn if robot soldiers were destroyed on the battlefield. The most recent and controversial example of how new technologies have changed war is the rise of drone warfare. But even these aircraft have pilots who fly them by remote control, and it is humans who make the decisions about which targets to pick and when to fire a missile.

Arguing against a ban on LAWS, some contend that robots should be regarded as the next generation of “smart” bombs. They are potentially more accurate, more precise, and completely focused on the strictures of International Humanitarian Law (IHL), and thus, in theory, preferable even to human war fighters, who may panic, seek revenge or simply make human mistakes.

Meanwhile, a report by Human Rights Watch released before the last CCW meeting has argued that fully autonomous weapons would make it difficult to attribute legal responsibility for deaths caused by such systems. As the report notes: “[a] variety of legal obstacles make it likely that humans associated with the use or production of these weapons – notably operators and commanders, programmers and manufacturers – would escape liability for the suffering caused by fully autonomous weapons.”

The Campaign to Stop Killer Robots (CSKR), an international coalition formed by a number of non-governmental organizations (NGOs) in October 2012 to work for a preemptive ban on fully autonomous weapons, argues for a ban on LAWS similar to the ban on blinding lasers in Protocol IV of the CCW and the ban on anti-personnel landmines in the Ottawa Treaty. It argues that killer robots must be stopped before they proliferate, and that tasking robots with human destruction is fundamentally immoral. The biggest concern is the potential next generation of robotic weapons: “robots beyond the loop,” those that make their own decisions about whom to target and kill without human control.

A ban on autonomous weapons is unlikely, as defense contractors, identifying a new source of revenue, are eager to build the next generation of machinery. Last year, Boeing acquired Aurora Flight Sciences, a maker of autonomous aircraft. The company also reorganized its defense business to include a division focused on drones and other unmanned weaponry. Other defense contractors, such as Lockheed Martin, BAE Systems and Raytheon, are making similar shifts.

What are drones?

Unmanned aerial systems (UAS), also known as drones, are aircraft either controlled by “pilots” from the ground or flying autonomously along a pre-programmed mission. In addition to the term “drone,” these craft may also be referred to as “unmanned aircraft,” “remotely piloted aircraft,” or “unmanned aerial vehicles.” There are dozens of different types of drones; the most commonly used fall into two categories: those used for reconnaissance and surveillance purposes and those armed with missiles and bombs.

A report released today by Amnesty International titled “‘Will I Be Next?’ U.S. Drone Strikes in Pakistan” contains information on 45 drone strikes it says were carried out by the United States in North Waziristan, Pakistan, between January 2012 and September 2013. In some of the attacks, it says, the victims were not members of militant groups such as al Qaeda or the Taliban, but ordinary civilians.

The Amnesty International report was made public the day before Pakistani Prime Minister Nawaz Sharif was due to meet U.S. President Barack Obama in Washington. It calls for measures to bring the drone program in line with international law, including conducting impartial investigations into the cases documented, bringing those responsible for human rights violations to justice, and offering compensation to civilian victims’ families.

Most of us are familiar with UAS from their use in places such as Afghanistan, Pakistan and Yemen. The main characteristics of UAS are that they carry no pilot onboard, function under “pilot” control from the ground or elsewhere, and use pre-programmed flight coordinates. The use of UAS has many advantages for the military, such as low acquisition, maintenance and flight costs, longer flight times, and lower risk to pilots.

UAS began to show their usefulness at the beginning of the Cold War as a reconnaissance tool. Over time, they have evolved into three categories of use: attack weapons, operational or strike tools, and surveillance or reconnaissance systems. A UAS is generally controlled via a laptop computer, a kit mounted on a vehicle, or a larger fixed facility. The current military inventory of unmanned aerial vehicles exceeds 6,000, spread across all branches of the military, with significant increases planned for the future.

In addition to the Amnesty International report, a report issued in conjunction with an investigation by Human Rights Watch details missile attacks in Yemen that the group believes could contravene the laws of armed conflict, international human rights law and Barack Obama’s own guidelines on drones. Human rights groups have accused the US officials responsible for the secret CIA drone campaign against suspected terrorists in Pakistan of committing war crimes.

The criticism leveled against the US for its use of UAS in Pakistan is based on allegations that drone attacks have killed innocent civilians. Amnesty International has highlighted the case of a grandmother who was killed while picking vegetables, among other incidents that may have violated international laws designed to protect civilians.

According to an internal Pakistani report leaked earlier this year, at least 10 civilian deaths were confirmed as a result of CIA drone strikes in 2009. The New America Foundation estimates that up to 207 civilians were killed from 2006 to 2009, along with up to 198 people whom reliable media reports could not identify as either civilians or militants.

The United Nations Convention on Certain Conventional Weapons (CCW or CCWC), concluded at Geneva on October 10, 1980, and entered into force in December 1983, prohibits or restricts the use of certain conventional weapons considered excessively injurious or indiscriminate in their effects. The aim of the Convention and its Protocols is to protect military personnel and, particularly, civilians and civilian objects from injury or attack by fragments that cannot readily be detected in the human body by X-rays, by landmines and booby traps, and by incendiary and blinding laser weapons.

To the extent that drone attacks are not sufficiently accurate to prevent civilian deaths, some argue that they contravene the Geneva Conventions. Additionally, because the applicability of international humanitarian law is sometimes unclear, human rights groups argue that America’s battle with al-Qaeda does not meet the intensity required under the laws of war to amount to an armed conflict.

Do we need the UN to step in and provide a definition of armed conflict for purposes of the use of drones by the US military (or CIA)?

Is the preemptive use of drones to strike at terrorists justified as part of a “new” kind of almost continuous war, in which the enemy may strike at any time and without warning, and thereby permitted under Chapter VII of the UN Charter?

Is the US use of drones in contravention of international human rights law, which only permits using deadly force when strictly and directly necessary to save human life?