
They Are the LAWS: Killer Robots and Trump

By Lucas Musetti

Let’s talk intelligent, killer robots. The United Nations already is. One of the technical terms the UN and partner organizations are using is lethal autonomous weapons systems, or LAWS. And for the time being, let’s set aside Isaac Asimov’s Three Laws of Robotics, because the robots we are looking at are specifically meant to be lethal weapons systems, not assistants who serve humans except when doing so would cause harm. Before you dismiss this as science fiction alarmism, remember that people credit H.G. Wells with influencing the creation of the atomic bomb and Aldous Huxley with predicting mood-altering drugs. So we’re going to take this seriously, because you’d better believe that at a time when the United Nations is trying to find its position, Trump will end up playing a role.

So here are some basic facts about LAWS:

  1. There is no universal definition of what counts as a lethal autonomous weapon system. [1] This will be one of the first issues the United Nations needs to address. For example, a landmine does not need a human’s command to detonate, but not everyone considers it an autonomous weapon system. For the most part, the United Nations will likely focus on robotic weaponry, meaning systems that can carry out a complex series of functions; an autonomous weapons system, for instance, would need to detect and deliberately select a target, which a landmine cannot do. Human Rights Watch identifies three types of autonomous robotic weapons: human-in-the-loop systems, where robots select targets but only deliver force on a human command; human-on-the-loop systems, where robots select targets and deliver force under human supervision that can override them; and human-out-of-the-loop systems, where robots select targets and deliver force without any human oversight. [2] We do not know for certain which of these types the United Nations will make the primary focus of an international agreement or treaty. Vague terms such as “meaningful human control” and the difficult distinction between autonomous and automated are also sticking points for delegations when it comes to deciding which weapon systems the United Nations should even be discussing. The phrase “meaningful human control” comes from a UN report on peaceful assemblies, which recommended that “Autonomous weapons systems that require no meaningful human control should be prohibited.” [3] Some delegations believe that too concrete a definition will let developers find loopholes to continue building similar systems, while others believe that leaving the definition too vague could impede progress on unmanned devices meant for purposes like exploration. [1]

  2. The effort to ban LAWS is preemptive. Most of the weapons systems the United Nations wants to address do not yet exist, which adds to the difficulty of defining them. In many previous weapons bans, such as those covering landmines or nuclear weapons, the devices had been around for decades before the international community developed a treaty banning them. This would not be the first preemptive ban, though: there were efforts to restrict air warfare before the first military plane was ever commissioned, and certainly before Giulio Gavotti dropped grenades that injured no one in the first aerial bombardment from a plane. [4]

  3. The United Nations is not the only body concerned about killer robots. Other international organizations have released reports and statements urging action to ban these systems. Additionally, prominent figures in technology and science like Elon Musk and Stephen Hawking have urged the UN to ban killer robots before it is too late. [5] Musk and Hawking are particularly nervous that an arms race over artificial intelligence may cause World War Three.

  4. Trump’s people aren’t all on board with a ban. Steven Groves served as Nikki Haley’s Chief of Staff before recently signing on as a Deputy White House Counsel, and he also worked as a senior fellow at the Heritage Foundation. That adds up to a highly impressive resume and suggests he is capable of putting serious thought into his positions. In 2015, he wrote a piece for the Heritage Foundation urging America to continue working on LAWS and to oppose any ban on them. [6] The crux of his argument can be seen in this excerpt:

“The U.S. is a leader in the development of LAWS, and should continue to be so. That's the only way U.S. armed forces can retain a tactical and strategic advantage over its enemies in future conflicts. The U.S. delegation should block any effort to ban LAWS or regulate them out of existence.”

Groves also notes the lack of any one concise definition for LAWS, the fact that other countries may continue working on LAWS even if the United States signs onto a ban, and the need for the United States military to remain competitive, all of which ties strongly to the realist perspective in international relations theory. Groves’ proximity to Haley and Trump should be a strong indicator that this sentiment carries weight in the White House.

Quite frankly, Groves’ piece is well written and accounts for the possibility that LAWS could be programmed to conform to the international laws of warfare. However, there is still a distinct possibility that, with the incorporation of artificial intelligence or through cyber-attacks on our nation’s systems, the robots would lose that conformity and begin breaking international law on their own initiative. Programming can fail, and there is no guarantee that LAWS produced by American companies would not end up harming American soldiers or civilians. In an even more likely scenario, there is no guarantee that LAWS deployed by America could distinguish civilians from the irregular and guerrilla fighters who are becoming the new norm among combatants, since the two may very well be wearing the same clothing.

Many of the concerns Groves cites are the same ones that proponents of nuclear weapons use to justify their arguments. As the United Nations continues to work toward a ban, it will likely run into the same issues that have plagued the new nuclear weapons ban from earlier this year. One of the biggest issues is likely to be that the President of the United States will oppose any ban the United Nations tries to implement. Of course, the United Nations cannot compel any state, and certainly not the United States, to adopt, sign, or ratify any ban it may produce, and it is just as likely that the international community will create a treaty outside the United Nations, as it did in Ottawa with landmines. Still, Trump may choose to attack a ban as an affront to national sovereignty, or try to paint the United Nations as panicked fear-mongers and himself as a man of vision leading the way to a technological future. I beg you not to buy into this. The United Nations is working to prevent the proliferation of a technology that has the capacity to wreak serious havoc and bring tremendous violence to a world already struggling to reduce the intensity of armed conflicts. That is what is at stake here.

The United Nations is meeting in Geneva as I write this piece. The Convention on Certain Conventional Weapons (CCW) does not yet cover killer robots in its text, and the Campaign to Stop Killer Robots, coordinated by Human Rights Watch, has criticized the CCW for “dragging its feet” at the international level. [7] This does not mean there has been no progress at the national or corporate levels. If you are interested in learning more about the Campaign to Stop Killer Robots, visit its website, StopKillerRobots.org, and its social media accounts for updates. The Campaign is made up of 64 organizations in 28 countries, including Nonviolence International.

I genuinely believe this campaign is important. It is of the highest importance that military technology constantly improve in order to make our soldiers safer, but LAWS increase the possibility that civilians could die or that a malfunction of some kind could put our soldiers at risk too. I feel that human-out-of-the-loop systems, at the very least, should be banned, and I am very interested to see how this progresses. I hope to have an update for this piece when we finally hear Trump’s decision.

Update: The meeting of the Convention on Certain Conventional Weapons has concluded without much progress on these issues. The Non-Aligned Movement countries generally held together in opposing LAWS and were joined by others, but little progress was made on defining LAWS or addressing the question of meaningful human control. Additionally, countries that are currently developing LAWS or their precursors continued to push back on these measures. As of now there is no official statement from President Trump on the issue.

Note: This article first appeared on AEFocus.org

References

  1. Eveleth, Rose. "So What Exactly is a Killer Robot?" The Atlantic. August 20, 2014. https://www.theatlantic.com/technology/archive/2014/08/calling-autonomous-weapons-killer-robots-is-genius/378799/.

  2. "Losing Humanity." Human Rights Watch. November 19, 2012. https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots.

  3. Joint Report of the Special Rapporteur on the rights to freedom of peaceful assembly and of association and the Special Rapporteur on extrajudicial, summary or arbitrary executions on the proper management of assemblies. Report A/HRC/31/66. United Nations Human Rights Council. p. 15.

  4. Koshy, Yohann. "Death from Above: How Aerial Bombing Changed the World." Vice. February 02, 2017. https://www.vice.com/en_uk/article/ypnnwg/death-from-above-how-aerial-bombing-changed-the-world-1.

  5. Riley, Charles. "Elon Musk and Stephen Hawking warn over 'killer robots'." CNNMoney. July 28, 2015. http://money.cnn.com/2015/07/28/technology/ai-weapons-robots-musk-hawking/index.html.

  6. Groves, Steven. "Will the Obama administration agree to ban 'killer robots'?" The Heritage Foundation. April 20, 2015. http://www.heritage.org/arms-control/commentary/will-the-obama-administration-agree-ban-killer-robots.

  7. "Statement to the Convention on Conventional Weapons Group of Governmental Experts General Exchange of Views in Geneva." Human Rights Watch. November 15, 2017. https://www.hrw.org/news/2017/11/15/statement-convention-conventional-weapons-group-governmental-experts-general.
