Lethal Autonomous Weapon Systems and Artificial Intelligence (AI) technologies, including machine learning and deep learning, have seized the imagination of people across the globe. There appears to be general agreement on the positive benefits which AI can bring to society. At the same time, there is also an underlying fear grounded in the belief that AI systems will one day exceed human intelligence and capabilities.
Concerns about military use –
- The ultimate concern is the use of AI in military applications, specifically those termed Lethal Autonomous Weapon Systems (LAWS). LAWS, according to a commonly accepted definition, are weapon systems that, “once activated, can select and engage targets without further human intervention”.
- There is a view that retaining human control over the use of force is a moral imperative, essential both for promoting compliance with international law and for ensuring accountability.
Types of LAWS in use today –
- Some of the well-known autonomous defensive weaponry already in use today are missile defence systems such as the Iron Dome of Israel and the Phalanx Close-In Weapon System used by the US Navy.
- Fire-and-forget systems, such as the Brimstone missile system of the United Kingdom (UK) and the Harpy Air Defence Suppression System of Israel, also function in an autonomous manner.
- Another oft-quoted example of an existing autonomous weapon system is the SGR-A1, a sentry robot deployed by South Korea in the Demilitarised Zone along its border with its northern adversary.
Concerns about LAWS –
An informal group of experts from a large number of countries has been debating the issue of LAWS for three years now at the United Nations Office for Disarmament Affairs (UNODA) forum known as the Convention on Certain Conventional Weapons (CCW).
Approximately 90 countries, along with many other agencies, participated in a meeting to highlight concerns about Lethal Autonomous Weapon Systems, converging on the following points –
- states must ensure accountability for lethal action by any weapon system used by them in armed conflict;
- acknowledging the dual nature of technologies involved, the Group’s efforts should not hamper civilian research and development in these technologies; and,
- there is a need to keep potential military applications using these technologies under review.
Arguments in favour of the ban –
- LAWS, it is argued, cannot comply with the international humanitarian law principles of distinction and proportionality. While the former principle requires weapon systems to be able to reliably distinguish between combatants and civilians, the latter requires value judgement to be exercised before applying military force. According to this argument, LAWS will never be able to live up to these requirements.
- In addition, there is also the consideration of what is known as the Martens Clause, wherein it is contended that delegating to machines the decision power of ‘life and death’ over humans would be “against the principles of humanity and the dictates of public conscience.”
Arguments against the ban –
- There is an argument that the development and deployment of LAWS would not be illegal, and would in fact lead to the saving of human lives. This is because, without the driving motivation of self-preservation, LAWS may be used in a self-sacrificing manner, saving human lives in the process.
- Moreover, they can be designed without the emotions that normally cloud human judgement during battle, leading to unnecessary loss of lives.
- An argument is also put forth that autonomous weapons would have a wide range of uses in scenarios where civilian presence would be minimal or non-existent, such as tank or naval warfare, and that the question of legality depends on how these weapons are used, and not on their development or existence.
Use in India –
- There are many scenarios where Lethal Autonomous Weapon Systems can be deployed to advantage. Autonomous systems designed to disarm improvised explosive devices (IEDs) are already in use by Indian forces, although these are non-lethal and defensive in nature.
- Future possible applications include AI-enabled drone swarms to boost surveillance capabilities; robot sentries along the borders to check infiltration by terrorists; autonomous armed UAVs for use in conventional as well as sub-conventional scenarios; and so on. In general, shielding one's own soldiers from the lethality of war would yield rich dividends to any military force, especially in conventional conflicts.
Notwithstanding the worldwide concern about the development of Lethal Autonomous Weapon Systems from the legal and ethical points of view, it is increasingly clear that, irrespective of any conventions that may be adopted by the UN, R&D by major countries is likely to proceed unhindered. Given India's security landscape, there is perhaps a need to adopt a radically different approach to facilitate the development of LAWS. As with any transformation, this is no easy task. Only a determined effort, with specialists on board and due impetus from the apex level, is likely to yield the desired results.