by Arvind Gupta
Dual-use technologies such as artificial intelligence, machine learning and big data analytics are set to transform the world. With their advent, the prospects of the development of lethal autonomous weapons systems (LAWS) have increased manifold.
Concerns have been expressed that autonomous weapons systems can bypass human control.
The Fifth Review Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons, also known as the CCW convention, decided at its 2016 meetings to set up an open-ended group of governmental experts to closely examine the military, technological, legal and moral issues arising out of LAWS.
The first meeting of the CCW group of governmental experts (GGE) was held in November 2017 in Geneva under the chairmanship of India. Ambassador Amandeep Singh Gill chaired the meeting, in which participants from 91 of the convention’s 125 high contracting parties took part, along with representatives from international organisations, non-governmental organisations, academia, industry and civil society.
Three panels of experts were set up to look into the technological, military, legal, ethical and humanitarian dimensions of LAWS, which have major implications for international security. Ethical and moral concerns are also important when life-and-death decisions are transferred to a machine.
Militaries all over the world are looking at LAWS with great interest.
They feel that LAWS would increase the efficiency of operations and minimise harm to soldiers. The flip side is that autonomous weapons systems could go beyond human control.
They could also fall into the hands of terrorists and other undesirable groups. Proliferation of such weapons would raise many security issues.
Artificial intelligence is the most significant technology in the context of LAWS. However, AI is a dual-use technology with wide applications in both civilian and military domains. On the civilian side, smartphones use AI on a routine basis. Currently, AI is not developed enough to make weapons systems truly autonomous and beyond human control. In the medium to long term, however, AI can be expected to lend greater autonomy to machines.
What about the legality of autonomous weapons systems? As more and more decisions are shifted to machines, questions have been raised under international humanitarian law, which rests on the principles of distinction, proportionality and precaution. The question of accountability also arises as the degree of human involvement diminishes.
LAWS also raise questions of ethics. Can ethics be coded into machines when legal and ethical questions are shifted to the technical domain? Can machines exercise responsibility, creativity and compassion?
All these issues are difficult to resolve, particularly when there is no agreement on the definition of autonomous systems. Restrictions on technologies will be opposed by economic interests and technology generators.
As the history of technology shows, it is very difficult to stop the march of a technology once it has been invented.
Today, technology rests in the hands of large and powerful companies, which are often stronger than governments and have far greater financial and human resources. Equally, there is a concern that over-regulation of technology could deprive human beings of the positive benefits of emerging technologies.
But the challenge posed by autonomous weapons systems is real and cannot be entirely wished away. Such systems are already in use, in Afghanistan, for instance. Drones, though remotely controlled by humans, represent a step towards autonomous systems. The humans controlling them take decisions on the basis of a combination of surveillance, analytics and weapons technologies. A faulty determination by either a surveillance satellite or a computing machine can easily result in inaccurate decisions.
The challenge of lethal autonomous weapons systems has been recognised and, for the first time, discussed in the CCW-GGE. No doubt more discussions will take place. Many countries have put forward papers outlining their views on the matter. At the moment, views are highly divergent and no consensus has been reached.
India needs to take cognizance of emerging technologies like AI, machine learning and big data analytics, and develop capabilities in these areas. At the same time, it should remain engaged with the issues surrounding LAWS. It would be useful if the government set up a task force of officials, industry and academia to deliberate over these issues. A discussion paper should be prepared and shared with the public for comments.
Strategic think tanks should also conduct studies on the military, technical, ethical, legal and political issues surrounding LAWS.
The author is a former deputy national security adviser