mercatornet

When is it ethical for police and the military to use killer robots?

by Heather Zeiger | August 15, 2016

Police responding to the Dallas shooting / The Associated Press  

I live north of downtown Dallas, where a Black Lives Matter protest turned deadly on July 7. A gunman, Micah Johnson, killed five police officers and wounded nine others, along with two civilians. Dallas police eventually cornered him in a building at El Centro College, where they spent several hours attempting to negotiate. He responded with gunfire, and the police decided to use a remote-controlled robot equipped with a C-4 explosive to take him out.

It was the first time that a police robot had killed a suspect in the United States.

Dallas Police Chief David Brown has been instrumental in implementing a de-escalation program that has seen complaints against his officers fall from 150 in 2009 to fewer than 13 in 2013. Dallas is considered a model for other cities. Even President Obama commended the police for the way they handled this terrible situation.

Most analysts agreed that this was a case where lethal force was necessary. However, some are concerned with the implications of using robots to kill.

For example, should the police have persisted with negotiations for longer before resorting to lethal force? Because there are no standard procedures for using remote-controlled robots or drones in a police setting, many are concerned that remotely-controlled technology might lead police to use lethal force too quickly.

This topic is already on the national agenda. The Obama administration has come under scrutiny for its use of drones, or unmanned aerial vehicles (UAVs), in Pakistan. Let’s consider the ethics of using robots to kill in the police setting and how this compares to the controversial use of drones in the military setting.

Rules of engagement

The rules of engagement for police differ from those of the military because the two have different missions. Seth Stoughton, associate professor at the University of South Carolina and a former police officer, distinguishes their roles in The Atlantic:

“The military has many missions, but at its core is about dominating and eliminating an enemy…Policing has a different mission: protecting the populace. That core mission, as difficult as it is to explain sometimes, includes protecting some people who do some bad things. It includes not using lethal force when it’s possible to not.”

While law enforcement may not have procedures for using remote-controlled robots yet, they do have them for when lethal force is appropriate. Part of the impetus behind the Black Lives Matter protests is that there have been cases when these procedures have not been followed.

According to Chief Brown, they needed to use lethal force because the suspect was heavily armed and was still a threat. Attempts at negotiations had not worked. While the method was unprecedented, the decision to use lethal force was not.

War is different. Western nations respect, or at least give lip service to, just war theory. Its key principles include having an appropriate goal that is worth the cost of war and attainable. War should also be a last resort, after other options have been tried. There should be as few civilian deaths as possible, and the military should be used in an appropriate (i.e., not wasteful or trivial) manner.

American drones have killed hundreds of terrorists. Proponents claim that they keep military personnel safe. Critics claim that drones, particularly in Pakistan, violate the principles of just war theory. They pose some difficult questions. Are the militants really an imminent threat to the US? Does the lack of transparency violate the principle of legitimate authority? Does targeted killing serve goals other than self-defense? Does the use of drones violate the last-resort principle?

In both the police and military settings, a person is killed without standing trial. Few people disagreed with the use of lethal force on Micah Johnson. The rules of engagement were clear: when a suspect fires on police and poses a real threat, they may use any means necessary to stop him, including lethal force. According to reports, the suspect was clearly deranged and violent.

The military situation is less clear. The use of lethal force across borders may be appropriate, and lethal force may be warranted if the target poses an imminent threat to the United States. However, critics point out that the standards for who should be targeted by drones are not transparent. Additionally, the military must rely on remote intelligence and must act from a distance. This may reduce the chances of peace talks and makes it difficult to assess whether other options remain.

Risk-free killing

In the Dallas case, the use of the bomb-equipped robot kept officers out of grave danger. The gunman's expertise, gained from his military experience in Afghanistan, together with the failed negotiations, showed that police had run out of options. While we cannot know what would have happened otherwise, the remote-controlled robot likely led to fewer casualties.

The situation is less clear in military operations. Critics say that risk-free, low-cost killing could stifle interest in finding more creative solutions to conflicts. A study published by the US Army War College Strategic Studies Institute indicated that soldiers were more inclined to resort to force if they could deploy drones compared to airstrikes or inserting infantry.

But the most pressing ethical concern for military operations is lack of transparency. Other issues stem from this. The US and other countries are using drones without the accountability traditionally seen when two countries declare war.

Technology is never morally neutral. In every war generals have used the latest weapons to keep their soldiers as far as possible from danger, ranging from the long bow to rifles to bombers. But a “cost-benefit ratio” is not the only consideration, especially for modern democracies. We need to ensure that there are effective standards and principles, including transparency and accountability.

Heather Zeiger is a freelance science writer with advanced degrees in chemistry and bioethics. She writes on the intersection of science, culture, and technology.
