The word "lethality" has attached itself to a stream of press releases coming out of the Pentagon.
A former secretary of Defense created a whole task force to examine ways to boost the lethality of our armed forces. The Army's new AI program, the so-called Advanced Targeting and Lethality Automated System, or ATLAS, would give ground combat vehicles the ability "to acquire, identify, and engage targets at least 3X faster than the current manual process."
Few would question the need for our armed forces to maintain their competitive edge. A key component of that may be boosting the lethality of certain platforms, whether to protect our forces or deter an adversary from going to war in the first place, especially as the country shifts from fighting terrorists in Iraq and Afghanistan toward addressing larger threats like Russia or China.
Yet the increased focus on lethality is making some civilians who work in Silicon Valley uncomfortable. The fear is that today's non-lethal digital gizmo might have lethal consequences down the road, akin to how Dow Chemical's development of napalm was misused in Vietnam.
One can imagine the horror of a software engineer in San Jose who wakes up to hear that her algorithm or robot targeted the wrong suspect, leading to a civilian casualty. Others worry about whether a human is “in the loop” if an armed robot can shoot a suspect.
Consider Microsoft employees' recent open letter calling on the company to cancel a Pentagon contract for its Integrated Visual Augmentation System, or IVAS. The platform is meant to save lives by increasing the "lethality, mobility and situational awareness" of infantrymen. Likewise, last year Google employees protested the use of their technologies as part of Project Maven, a platform for analyzing drone footage (the contract was not extended). In both cases, employees circulated petitions of dissent against partnering with the Pentagon.
The latest spat has even turned tech leaders against their own employees. And while CEOs such as Amazon's Jeff Bezos claim to have the best interests of the country at heart, there are also handsome profits to be made. The expected budget of Project Maven was as high as $250 million. Google did not even bid on an AI project with the Pentagon valued at nearly $10 billion.
We have arrived at an important inflection point in this country's history of military innovation. The U.S. military has always prided itself on holding a monopoly on the organized management of violence. That is why the transfer of lethal military kit to police departments has made some military leaders uneasy. Yet this specialization is eroding, given the changing character of warfare, as expertise in new ways of warfare is shifting from soldiers to civilians.
On future battlefields, the deciding factor will not be whose payload is bigger or whose weapons platform is superior. It will hinge on which side adapts and innovates faster, often utilizing non-lethal tools.
Lethality is one part of the strategy put forth by former Secretary of Defense James Mattis to boost the U.S. military's capability and readiness. It includes everything from developing long-range hypersonic missiles to ramping up basic training to improving the nutritional quality of MREs. For some, the attention paid to lethality has a "been-there-done-that" quality to it. After all, this is not the first time the Pentagon has glommed onto a stock phrase to reframe its mission or a new threat environment. "Transformation" was the buzzword of the Rumsfeld era. A "Revolution in Military Affairs," or RMA, ginned up similar debates in the 1990s.
The need for greater lethality is a legitimate concern, especially when one considers that the burden of sacrifice falls disproportionately on Army infantry and Marines. The military brass is right to want to win its wars while losing as few American lives as possible. Yet, to some in Silicon Valley, lethality has a less innocuous ring to it. It does not suggest avoiding civilian harm but rather just the opposite. These concerns are amplified by high civilian casualties in war zones like Yemen, Syria, and Afghanistan. They are valid concerns, and holding them does not make these tech workers the modern-day equivalent of "Hanoi Jane" or any less patriotic. War is ugly, and some techie at Microsoft should not feel morally compromised by handing over something to a battalion commander that frankly few of us understand.
The U.S. military needs to do a better job at framing its mission to non-military audiences. If all its field manuals and press releases call for greater lethality, it should expect some blowback among those in Silicon Valley who prefer to limit war and reduce civilian harm. By "lethality," what the military actually means is achieving greater efficiency and effectiveness.
Warfare is increasingly less focused on killing bad guys than on de-escalating situations, applying American soft power, and innovating creatively to defeat and deter our enemies.
We need a truce of sorts between Big Army and Big Tech. That does not mean that every military contract with Amazon, Microsoft, or Google should be rubber-stamped, or that tech engineers should blindly fall in line, salute the flag, and not voice concerns.
But it is wishful thinking to believe that a signed petition blocking these programs will somehow limit the likelihood or deadliness of tomorrow's wars. Writing recently in The New York Times, Lucas Kunce, a U.S. Marine, noted that such petitions, while well-intentioned, do not change the calculus or decision-making of leaders going to war.
War in our lifetimes will not resemble a clash of machines. It will involve the taking of human life — in a word, lethality. Going back to the Manhattan Project, the U.S. government has always relied on the private sector to innovate militarily and, yes, to make war more humane. Just because the main fear is no longer low-tech improvised explosive devices does not make warfare any less likely or lethal.
Lionel Beehner, Ph.D., is an International Affairs fellow at the Council on Foreign Relations and co-editor of the forthcoming volume, “Blurred Lines: Civil-Military Relations and Modern War.”