Rise of the Killer Robots: Hint--This is Not a Movie
The Fiscal Times
April 2, 2013

Last month, Boston Dynamics posted a video update of its AlphaDog robot, developed to carry heavy military equipment for soldiers. The company had already released video showing AlphaDog traversing rough terrain and gaining significant speed.

But the March video shows something different: the robot now has a mechanical arm attached to the front. The arm picks up a cinder block, which likely weighs about 30 pounds, as the robot's feet move rapidly up and down. It crouches low, swings the arm to its left and then hurls the cinder block some 20 feet over its right shoulder, much as a hammer thrower releases a hammer.

AlphaDog’s ability to hurl cinder blocks is a significant step that comes at a time when the use of robots in warfare is quickly evolving. In throwing the cinder block, AlphaDog performs an aggressive act as opposed to a passive one, like carrying equipment. It will also soon be able to process voice commands.

Given these advances, it’s easy to imagine AlphaDog as a military Sherpa—lugging heavy equipment to soldiers in need. It’s just as easy to imagine the robot charging into a battlefield and throwing explosives over the enemy’s defenses.

AlphaDog is only the beginning: in the future, robots will play a massive role in how the United States wages war. Their use is expected to cut down on casualties and related long-term medical costs. The Pentagon has even hired a scientist who is attempting to program robots to obey the Geneva Conventions.

But Noel Sharkey, an ethicist at the University of Sheffield in the United Kingdom, warns that the rise of robot soldiers removes the human moral element from warfare, much like the “Terminator.” He and a number of other scientists and advocacy groups are expected to launch the “Stop the Killer Robots” initiative in the UK this month.

“There are a lot of people very excited about this technology… this is going to be big, big money. But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage,” Sharkey said recently. “Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot.”

ALREADY IN USE WITH MORE COMING SOON
Robot soldiers have already arrived: beyond drones, the United States is using 2,000 robots in Afghanistan right now. They do everything from sniff bombs to inspect suspicious vehicles, although according to reports they don’t do either very well.

These robots are just the beginning: the secretive Defense Advanced Research Projects Agency (DARPA) has already invested millions in robot fighting technology. In 2013, it spent $7 million on the Avatar Program, which is exploring the possibility of uploading a soldier’s brain to a surrogate robot. It has also committed $11 million to a program that aims to create robots capable of acting autonomously (DARPA refused to comment for this story).

DARPA has also invested $14 million in the Autonomous Robotic Manipulation program, which aims to create “autonomous (unmanned) mobile platforms to manipulate objects without human control or intervention,” according to the agency’s 2013 unclassified budget request. This program includes work on a robot that can treat wounded soldiers on the battlefield, and then extract them to combat hospitals. It’s also invested $14 million in its Biometric Computing program, which aims to teach robots how to recognize and react to objects.

DARPA’s programs might seem like science fiction, but life is truly imitating art: Boston Dynamics has created a robot that looks remarkably like the ones in the Terminator movies. According to the Army, the robot will only be used to test suits designed to protect against chemical weapons. But as video released by the company shows, the Protection Ensemble Test Mannequin, or PETMAN, can run, kneel and do push-ups. It’s not hard to imagine the robot equipped with a weapon.

The Pentagon has also launched the Future Soldier 2030 initiative, aimed at integrating robotic technology with traditional soldiers. This includes robotic exoskeletons that make soldiers stronger and faster, as well as integrated optic interfaces that allow soldiers to control robots with their eyes.

Drones are only the start of the Pentagon’s airborne robotic arsenal: the Navy has paid Northrop Grumman $813 million to develop the X-47B, an unmanned plane that can take off from and land on an aircraft carrier. It has both surveillance and strike capabilities and can maneuver at speeds that would be dangerous for a human pilot. It also acts autonomously, receiving mission instructions from an operator and then executing them without further oversight.

SCIENTISTS SOUND ALARM 
Moore’s Law holds that the number of transistors on a chip, and with it computing power, doubles roughly every two years. If that pace continues, the era of robot soldiers is a lot closer than many people realize.

This has alarmed scientists and ethicists. Writing in the Wall Street Journal recently, Jonathan Moreno warned not to allow autonomous robots to wage war, and urged the creation of treaties banning the practice.

“Given the obvious dangers to human society, fully autonomous offensive lethal weapons should never be permitted,” Moreno wrote. “And though the technical possibilities and operational practicalities may take decades to emerge, there is no excuse for not starting to develop new international conventions, which themselves require many years to craft and negotiate before they may be ratified by sovereign states.”

To date, the most coordinated effort comes from Sharkey and his Stop the Killer Robots campaign. Despite the hokey name, it has powerful backers, including Jody Williams, the political activist, Nobel Peace Prize winner and founder of the International Campaign to Ban Landmines. The two are expected to launch the anti-robot initiative at the House of Commons this month.

“Killer robots loom over our future if we do not take action to ban them now," Williams told the Guardian. "The six Nobel peace laureates involved in the Nobel Women's Initiative fully support the call for an international treaty to ban fully autonomous weaponised robots.”

An editor-at-large for The Fiscal Times, David Francis has reported from all over the world on issues that range from defense to border security to transatlantic relations.