UN diplomats are gathering in Geneva this week to discuss the ethical, legal, and technical implications of nations operating so-called lethal autonomous weapons systems in combat. LAWS differ from remotely piloted aircraft because humans would not make the targeting decisions for the autonomous weapons as they do today for RPA, states an issue paper from the Center for a New American Security, released on the eve of the April 13-17 meetings of the Convention on Certain Conventional Weapons. Thanks to rapid advances in computer technology, more than 30 nations already operate “human-supervised” systems, such as automated air and missile defenses, though in most instances these have been used narrowly, note authors Michael Horowitz, Kelley Sayler, and Paul Scharre. Understanding the full range of the technology is important to advancing the debate on LAWS, they write. In certain cases these weapons require careful consideration of accountability and strategic stability, and they raise important “moral and ethical issues” that current international law does not address, the authors state. The Pentagon’s Defense Innovation Initiative includes autonomous systems among its research focus areas.
U.S. munitions have been expended at a high rate during Operation Epic Fury against Iran, prompting concerns that the Pentagon is eating into the weapons stockpiles it needs to deter threats around the world. Yet the newly released $1.5 trillion defense budget request was developed before the war against Iran and…