Most reasonable people can see the benefits of using fully autonomous systems, particularly to help prevent injuries or death, as is the case with advanced driver assistance systems increasingly found in automobiles. When it comes to autonomous systems that are designed to take life rather than preserve it, there is significantly more debate.
Currently, the U.S. and other nations do not field any weapons systems that can operate fully autonomously, which is defined in military parlance as selecting, aiming, and firing at a target without a human being "in the loop," or somehow in control of the weapon system. However, a variety of military weapons systems operate semiautonomously, requiring some human control or input to select targets, but relying on pre-programmed algorithms to execute a strike.
This article is thoughtful and well-balanced, but we need to ask, "How do you know whom to target?" From the Guantanamo base prisoners to the targets of autonomous vehicles in Pakistan or Yemen, we depend on intelligence from the ground to choose our victims, and that intelligence is often false. We can't get boots off the ground. We need to embed in and work with the populations we wish to protect. Remote warfare, unless it is untethered completely from ethics, responsibility, and long-term consequences, is a fantasy.
Thanks for reading the article. Your point is valid. I don't think anyone believes that autonomous weapons will eliminate traditional military activity. The importance of getting boots on the ground, as well as winning hearts and minds, likely will continue to be relevant for many years to come.
The following letter was published in the Letters to the Editor section of the March 2017 CACM (http://cacm.acm.org/magazines/2017/3/213824).
"Can We Trust Autonomous Weapons?" as Keith Kirkpatrick asked at the top of his news story (Dec. 2016). Autonomous weapons already exist on the battlefield (we call them land mines and IEDs), and, despite the 1997 Ottawa Mine Ban Treaty, we see no decrease in their use. Moreover, the decision as to whether to use them is unlikely to be left to those who adhere to the ACM Code of Ethics. The Washington Naval Treaty of 1922 was concluded between nation-states entities that could be dealt with in historically recognized ways, including sanctions, demarches, and wars. An international treaty between these same entities regarding autonomous weapons would have no effect on groups like ISIS, Al-Qaida, Hezbollah, the Taliban, or Boko Haram. Let us not be nave . . . They have access to the technology, knowledge, and materials to create autonomous weapons, along with the willingness to use them. When they do, the civilized nations of the world will have to decide whether to respond in kind defensive systems with sub-second response times or permit their armed forces to be out-classed on the battlefield. I suspect the decision will seem obvious to them at the time.
Joseph M. Saur
Virginia Beach, VA