Top US general warns against rogue killer robots


The second-highest-ranking general in the U.S. military on Tuesday warned lawmakers against equipping the armed forces with autonomous weapons systems that humans could lose control of and advocated for keeping the “ethical rules of war” in place.

In a Senate Armed Services Committee hearing on Tuesday, Gen. Paul Selva responded to a question from Sen. Gary Peters (D-Mich.) about a Defense Department directive that requires a human operator to be involved in the decision-making process when it comes to taking lives with autonomous weapons systems.

Selva warned lawmakers that the military should keep “the ethical rules of war in place lest we unleash on humanity a set of robots that we don’t know how to control.”

“I don’t think it’s reasonable for us to put robots in charge of whether or not we take a human life,” Selva told the committee.

Peters mentioned that the directive expires later this year and told Selva that America’s enemies would not hesitate to use such technology.

“Our adversaries often do not consider the same moral and ethical issues that we consider each and every day,” Peters told Selva.

Selva responded that America does and should always “take our values to war.”

“There will be a raucous debate in the department about whether or not we take humans out of the decision to take lethal action,” he told Peters, but added that he was in favor of “keeping that restriction.”

Selva added that just because the U.S. won’t pursue that kind of technology doesn’t mean it won’t research how to defend against it.

It “doesn’t mean that we don’t have to address the development of those kinds of technologies and potentially find their vulnerabilities and exploit those vulnerabilities,” Selva told the committee.

Last year, Tesla founder Elon Musk and famed astrophysicist Stephen Hawking warned in an open letter against weapons developers using artificial intelligence.

“Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control,” their letter reads. 
