Autonomous Weapons and the Threat to Humanity
Wednesday, November 21, 2012, by Dylan Novak
There has been plenty of hype in recent months over U.S. drone strikes on terrorists in Yemen, Pakistan, and Somalia. While many questions have been raised about the legality of these unmanned combat aerial vehicle (UCAV) strikes, especially on U.S. citizens, there is no doubt that UCAV strikes are an effective weapon of war. Currently, these weapons are controlled remotely by military personnel, which means a human being decides when and at whom a missile is fired. This might not be the case for long, however, as militaries around the world have begun experimenting with autonomous UCAVs and other weapons.
Autonomous weapons are capable of carrying out a mission with little or no input from military personnel. Robotic weapons could bring many benefits to the battlefield, such as cheaper upkeep than a human soldier and, allegedly, no danger to friendly personnel. On the other hand, the thought of autonomous killing machines taps a deep-rooted fear in human beings, evident in popular films like I, Robot and Stealth.
Recently, Human Rights Watch and Harvard Law School's International Human Rights Clinic published a report detailing the threat that autonomous weapons could pose to humanity. The report criticizes autonomous weapons because they give machines the choice of whether or not to take a human life.
The lack of direct human supervision over autonomous weapons raises an interesting legal question of accountability. When a robot decides to initiate a strike on an unlawful target (e.g., a civilian), who is to be held accountable: “the commander, programmer, or manufacturer”? At the very least, it appears that a new body of law would need to be developed to determine whom humanity wants to blame for an artificial intelligence’s decision.
In their report, the human rights advocates call for “an international treaty prohibiting [autonomous] weapons before they show up in national arsenals.” Furthermore, they ask that individual nations refrain from developing these dangerous weapons for the indefinite future. As the report argues, “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”
It seems unlikely that the United States would agree not to produce any autonomous weapons. Even when the United States agreed to restrict the spread of nuclear weapons around the world, it still guaranteed its right to maintain a nuclear arsenal as a recognized nuclear-weapon state.
Interestingly, the report assumes that the nations of the world have not yet created autonomous weapons, as only precursors to these weapons have been officially revealed. However, it is quite possible that such weapons already exist as a classified part of the U.S. arsenal. This possibility only strengthens Human Rights Watch’s call for regulation of autonomous weapons.