
UN Debates the Future of Killer Robots

Lethal Autonomous Weapons Systems (LAWS) — their more technical term — would be able to track and engage targets on their own.
Photo via United States Navy

The arms race is going sci-fi, and humans are playing catch-up.

United Nations diplomats convened today in Geneva for a week-long meeting of experts on killer robots.

Lethal Autonomous Weapons Systems (LAWS) — their more technical term — would be able to track and engage targets on their own, without human intervention. They don’t yet exist, but several countries are close.

Though armed robots conjure images of Terminator-like humanoids, they would likely first come from the sky, the evolved offspring of drones used by the CIA.


“We should be focusing on the likely trajectory of technological development, not images from popular culture,” said Stephen Townley, legal adviser to the US Department of State and part of the American team in Geneva.


Flying Killers

Drones as we know them — the ones buzzing over tribal lands in Pakistan and attacking convoys (and weddings) in Yemen — are not considered autonomous, since operators control them with a joystick from thousands of miles away.

However, US defense contractor Northrop Grumman has developed the X-47B, a drone capable of flying itself, and, with a few tweaks, it could eventually fire a weapon on its own. The UK and Israel are also working on autonomous armed drones.

South Korea already has robot sentries: Samsung-built surveillance robots armed with 5.56 mm machine guns and grenade launchers that watch over the demilitarized zone separating North from South.

For now, the machines — like drones — are overseen by soldiers, but they retain the capacity to autonomously target and fire using infrared detectors.

Human rights groups hope to use the meeting as a step toward eventually banning such weapons under the 1980 Convention on Certain Conventional Weapons (CCW), the international agreement that today restricts the likes of white phosphorus, napalm, booby traps, and mines.


There are currently 117 parties to the convention, including the US, Russia, China, and Israel, all of which possess the capacity to develop autonomous technologies.

Today, Pakistani officials spoke forcefully against autonomous weapons. The country has suffered more than 2,000 deaths from the US drone war.

In a statement, they argued that LAWS would lower the threshold for going to war, "resulting in armed conflict no longer being a measure of last resort — consequently, the resort to the use of force in war may become a more frequent phenomenon."

"LAWS are by nature unethical, because there is no longer a human in the loop and the power to make life and death decisions are delegated to machines which inherently lack compassion, morality and intuition," they added.


On Monday, Human Rights Watch — which helped coin the term “killer robots” — released a report warning of their potential use away from the battlefield, in “RoboCop”-style law enforcement settings.

A Killer Defense

There’s recent precedent for the effort around robots.

In 1995, member states agreed to adopt restrictions on the use of “blinding laser weapons” — lasers that can permanently blind scores of enemy troops.

However, that additional protocol to the CCW hasn’t stopped the US Navy from building an elaborate laser weapon system capable of shooting planes and missiles out of the sky with invisible high-intensity beams — just as long as the pilot dies before going blind.


Therein lies the awkward moral quandary built into the regulation of warfare, one now complicated by technology.

“The basic idea is that the rules of war laid out in the likes of the Geneva Convention were based on humans being able to make decisions whether or not they fire a gun or shoot a missile or drop a bomb,” Thomas Nash, head of Article 36, a weapon-harm prevention group, told VICE News from Geneva.

“With the development of technology that could take away that capacity to make decisions — it’s important to draw a line to ensure humans make the decision,” he added. “The moral reason [for a ban] is killing is something humans should decide.”

Others think robots are better suited to that role.

Should Robots Decide What to Kill?

At a debate in Geneva, Georgia Tech robotics professor Ronald Arkin argued autonomous weapons “could outperform” human soldiers and suggested their development could “constitute a humanitarian effort.”

Proponents say robots can be pre-programmed (or, perhaps, google the Geneva Convention on the fly) to abide by international law.

“I am convinced that they can perform more ethically than human soldiers are capable of,” Arkin wrote in his 2007 report, “Governing Lethal Behavior: Embedding Ethics in a Hybrid Deliberative/Reactive Robot Architecture.” He insists robots could sacrifice themselves to protect civilians in a way human fighters would not.


But why would a country program its hardware to step in front of bullets if it means losing a war?


Taken to their logical, techno-futurist conclusion, the rules of war break down into the absurd.

But Nash understands that, despite the dissonance involved in guiding armies toward appropriate ways to kill, maim, and destroy, playing catch-up with military science is even more dangerous.

“We see conflict as a failure of the international community,” he said. “At the same time we recognize there needs to be some rules around war and those rules somewhat underline our basic humanity.”

Humanity's system of evaluating and punishing choices of right and wrong dates back to the earliest religious texts.

But nowhere in the Ten Commandments does God discuss robotics, let alone a post-singularity world of autonomous weapons.

And the Pope is just trying to keep up.

Today, a Vatican representative told diplomats: “We are most troubled by emerging technologies of autonomous weapon systems which may move beyond surveillance or intelligence-gathering capabilities into actually engaging human targets.”

The Geneva meeting will be followed by a larger, annual convening in November.