Written by: Sabina Kulueva & Agata Bidas, EST Working Group Security & Defense

Edited by: Jessica White & Martina Canesi

Executive summary

This policy brief provides an overview of the European Union’s (EU) stance on autonomous weapons. It outlines the pitfalls of current EU legislation, including an inadequate definition of Lethal Autonomous Weapon Systems (LAWS) and a failure to properly address the four concerns associated with the deployment of such novel armaments, namely security, ethical, humanitarian, and legal concerns. The analysis concludes with a number of recommendations to address the identified shortcomings.

Introduction

In August 2017, 116 founders of AI and robotics companies worldwide signed an open letter to the Group of Governmental Experts established under the United Nations Convention on Certain Conventional Weapons. In their appeal, they urgently called on the international community to regulate lethal autonomous weapon systems (LAWS) (Future of Life Institute, 2017). They further cautioned that such novel weaponry could constitute “the third revolution in warfare”; if developed and subsequently deployed, such armaments would dramatically increase the scope and speed of armed conflicts (Future of Life Institute, 2017).

To ascertain the extent to which the EU has sought to address concerns related to LAWS, we examined the content of thematically relevant official legislation. As a guideline, we referred to Secretary-General António Guterres’s “A New Agenda for Peace”, in which he not only called on the international community to adopt a binding resolution on LAWS, but also pointed out the security, ethical, humanitarian and legal concerns raised by such novel weapons systems (United Nations Executive Office of the Secretary-General, 2023). The online database of EU legislation, EUR-Lex, was searched to retrieve documents published after July 2023, when the agenda appeared, containing the term “autonomous weapon”. Accordingly, this policy brief presents a content analysis of the twenty relevant documents, outlines their shortcomings, and concludes with several recommendations for policymakers to modify the current norms and laws regulating such arms.

Problem Description & Background

The international community’s position on autonomous weapon systems

Weaponry with primitive autonomous capabilities existed as early as the mid-20th century. Initially, such systems were employed for surveillance purposes. Decades later, the Israeli army deployed them during actual combat operations in Lebanon (Williams, 2011, p. 2). In the early 2000s, the United States utilised Predator drones equipped with missiles in its anti-terrorist operations in Afghanistan and Yemen (Risen, 2002). Over time, such weapons became increasingly sophisticated. The Panel of Experts on Libya reported the deployment of the Turkish lethal autonomous weapon STM Kargu-2 against Haftar Affiliated Forces, noting that it was “programmed to attack targets without requiring data connectivity between the operator and the munition” (United Nations, 2021, p. 17). Despite the technological evolution of LAWS, a global debate on the ethical issues associated with the use of such novel weaponry commenced only in 2012, when Human Rights Watch (HRW) published the report “Losing Humanity: The Case against Killer Robots”.

A decade later, UN Secretary-General António Guterres presented his position on LAWS in “A New Agenda for Peace”, in which he corroborates the views outlined in the HRW report. In particular, he calls upon the international community to establish a legally binding international instrument that would “prohibit lethal autonomous weapon systems that function without human control or oversight, and which cannot be used in compliance with international humanitarian law, and to regulate all other types of autonomous weapons systems” (United Nations Executive Office of the Secretary-General, 2023, p. 27). In essence, Guterres’s position is that the measures applied to autonomous weapon systems (AWS) should be contingent on their degree of autonomy, with a call for the prohibition of fully autonomous weapons and the regulation of those that can be controlled by humans. Furthermore, the Secretary-General asserts that this category of weapon systems poses humanitarian, legal, ethical and security problems, and constitutes a direct threat to basic human rights and freedoms (United Nations Executive Office of the Secretary-General, 2023, p. 27).

Absence of a common definition

To address the first facet of Guterres’s contention, namely legally binding legislation, a universally recognised definition of LAWS appears to be the first logical undertaking. As none of the documents adopted by the EU legislative bodies after July 2023 contained a definition of LAWS, we referred to the European Parliament’s Resolution 2018/2752(RSP). According to this document, LAWS are defined as “weapon systems without meaningful human control over the critical functions of selecting and attacking individual targets” (European Union, 2019, p. 87). Furthermore, “non-autonomous systems such as automated, remotely operated and teleoperated systems” are not regarded as LAWS (European Union, 2019, p. 87).

A comparison of the EU definition with the definition set out in Guterres’s earlier communiqué suggests that the initial point of criticism pertains to the categorisation of AWS. While the Secretary-General defines LAWS as a subset of AWS, the European Parliament differentiates between LAWS and non-autonomous systems and provides examples of armaments that fall into each of the two categories. Moreover, the EU has not amended the definition since 2018. Given the rapid pace of technological development, particularly in the fields of robotics and artificial intelligence, the emergence of a new generation of LAWS risks rendering the EU’s legislative perspective obsolete. As a result, the EU’s current definition of LAWS requires an update.

The international community has yet to reach consensus on a unanimous definition of LAWS (United Nations, 2024, p. 5). Taddeo and Blanchard (2022) examine the content of a variety of existing definitions and propose one that encompasses the important variables. By comparing the definitions proposed by states and international organisations, and highlighting their shortcomings, the scholars conclude that four essential characteristics of such weapons, namely “autonomy, adaptability of AWS, human control, and purpose of use”, should be considered when attempting to define LAWS (2022, p. 1). Following this line of argument, one might contend that, while the EU’s definition incorporates the concepts of human control and purpose of deployment, it does not consider significant features such as autonomy and adapting capabilities. Scharre (2018, p. 27) characterises autonomy along three dimensions: “the type of task the machine is performing; the relationship of the human to the machine when performing that task; and the sophistication of the machine’s decision-making when performing the task”. Adaptability as a feature concerns the learning capabilities of weapon systems. As ever more AWS are equipped with AI (Taddeo & Blanchard, 2022, p. 12), the exclusion of such salient characteristics undermines the adequacy of the EU definition. Consequently, the EU definition, which has not been reviewed since 2018, should be updated in light of the Secretary-General’s distinction between LAWS and AWS and of rapid technological advances, and should include the concepts of autonomy and adaptive capabilities.

Lethal autonomous weapons and security concerns

EU legislation has, to varying degrees, addressed the four concerns raised by the Secretary-General in his call. Security concerns, understood here as concerns pertaining to the territorial integrity of the EU, are addressed in documents pertaining to EU defence. Such legislation refers to the war in Ukraine and documents the “return of high-intensity warfare and territorial conflict to Europe” (European Union, 2023c, p. 1). Moreover, it highlights the “increasing fragmentation and polarisation” of the global system, which adversely impacts the “global threat landscape” (European Commission, 2024, p. 5). Notwithstanding this plethora of security-related concerns, the EU is unequivocal in its stance regarding the deployment of LAWS. In particular, the EU states that “weapons and weapon systems specifically designed to defend own platforms, forces and populations against highly dynamic threats such as hostile missiles, munitions and aircraft are not considered lethal autonomous weapon systems” (European Union, 2019, p. 88). Consequently, despite escalating geopolitical tensions, the EU’s commitment to addressing security challenges through conventional defence means demonstrates its cautious approach to new technologies such as lethal autonomous weapon systems, and underlines its preference for traditional, human-controlled military strategies to ensure stability and compliance with international norms.

Furthermore, the EU not only refuses to deploy LAWS, but also opposes financing their production and procurement. In particular, the legislation explicitly prohibits funding for weapon systems in which “selection and engagement decisions” to kill combatants are not made by humans (European Union, 2023b, p. 18; European Union, 2023c, p. 10) and outlaws the acquisition of such armaments (European Commission, 2024, p. 64). Consequently, while the EU recognises the urgent necessity of reinforcing the region’s defence industry, it clearly states that it will maintain the production of conventional systems, thereby demonstrating alignment with the Secretary-General’s call.

Lethal autonomous weapons and ethical concerns

The ethical concerns associated with LAWS primarily raise questions about the permissibility of machines’ participation in decision-making processes that may result in harm or death to human beings. The United Nations (2024, p. 11) emphasises that such systems lack “empathy, compassion and the ability for moral reasoning”. Arguing that “the killing of human beings” and, more specifically, “the delegation of the decision to take a human life by machines” are immoral, it stresses that the use of LAWS would violate human dignity and result in dehumanisation, which in turn could give rise to “unjustified violence and civilian casualties” (United Nations, 2024, p. 11). EU policymakers likewise draw attention to the potential moral repercussions of deploying such weaponry. The AWS Resolution explicitly states that “human involvement and oversight are central to the lethal decision-making process, since it is humans who remain accountable for decisions concerning life and death” (European Union, 2019, p. 87). Moreover, the European Parliament urges the EU to introduce a binding legal framework outlawing LAWS that lack proper human oversight (European Union, 2024, p. 9). Furthermore, in the Progress Report addressing weapons of mass destruction, the EU outlines that “all weapons, also in the area of emerging technologies, comply with international law, and in particular International Humanitarian Law (IHL), taking into account relevant ethical considerations” and emphasises that “human responsibility for decisions over the use of weapons should be maintained and that human accountability must be preserved at all times and across the entire lifecycle of a weapons system” (European Union, 2023a, p. 15). Consequently, the EU recognises the ethical implications of LAWS and unequivocally endorses the Secretary-General’s call for the creation of a legal document prohibiting them.
Nevertheless, we believe that future updated legislation should reference real-world cases in which AWS are already being deployed. For instance, the ongoing war in Gaza has demonstrated the negative real-world implications of AWS usage. The Israeli Defence Forces’ deployment of AI-powered systems such as “The Gospel”, “Lavender” and “Where’s Daddy” to identify Hamas members has resulted in significant destruction and numerous civilian casualties (Serhan, 2024). Even though such use of AI does not amount to the deployment of fully autonomous systems, the cursory manner in which operators confirm system-generated targets demonstrates the absence of effective human oversight (Serhan, 2024). We are convinced that the inclusion of such empirical evidence would strengthen the validity of renewed AWS legislation.

Lethal autonomous weapons and humanitarian concerns

The third concern raised by Guterres is the compliance of LAWS with humanitarian principles, and with the core principles of IHL in particular. The European Parliament, in its AWS Resolution, has likewise expressed concern over LAWS’ inability to meet the standards of IHL, given the possibility of violating the principle of distinction, which requires that combatants be distinguished from civilians and that attacks target only legitimate military objectives (European Union, 2019; Crawford & Pert, 2020). Compliance with this principle is essential, but it is significantly challenged by the limitations of LAWS, which struggle to accurately distinguish between civilians and combatants, particularly in complex environments such as urban warfare or asymmetric conflicts where both groups may be present in close proximity (Heyns, 2013).

Another crucial principle of IHL is the principle of proportionality, according to which “any attack should not cause damage to civilian life and property that is excessive in relation to the anticipated military advantage” (Crawford & Pert, 2020). However, autonomous systems inherently face difficulties in accurately assessing the proportionality of an attack, as they lack the capacity to fully assess the broader context of an operation (Heyns, 2013). While real-world examples of LAWS deployment are still scarce, this limitation is a significant concern highlighted in the literature. For instance, as Heyns (2013) argues, autonomous systems may struggle to weigh military advantage against potential civilian harm in complex and unforeseen situations. The European Parliament, in its AWS Resolution, has recognised that international regulations need to guarantee that LAWS are not deployed in scenarios where they cannot reliably adhere to the principles of proportionality and distinction (European Union, 2019), demonstrating a commitment on paper to integrating humanitarian considerations into the development and deployment of LAWS. However, relying on such non-binding resolutions alone risks overlooking the structural limitations of autonomy in warfare, which strongly suggest a fundamental incompatibility with the principles of IHL.

Lethal autonomous weapons and legal concerns

The final concern, also stressed by the European Parliament (European Union, 2019), is the need for clear legal frameworks addressing the responsibility gaps that arise when autonomous systems are used in warfare. Traditionally, individuals such as combatants, commanders, or political leaders have been held accountable for violations of IHL and international human rights law (IHRL) (Crawford & Pert, 2020). However, when autonomous systems operate with significant independence, it becomes increasingly unclear who should be held responsible for unlawful acts, such as attacks leading to civilian casualties, especially if the use of force was not explicitly authorised by a human operator (Heyns, 2013). This ambiguity risks undermining legal accountability and could result in violations of IHL and IHRL going unpunished. Therefore, to ensure justice for victims alongside the integrity of international law, the European Parliament has, in the AWS Resolution, called for rules guaranteeing meaningful human control and accountability mechanisms as part of any future EU regulation on LAWS.

The assumption that meaningful human control alone can address these fundamental challenges should be approached critically. While human oversight may reduce certain risks, it does not necessarily resolve the broader ethical and legal dilemmas inherent in delegating life-and-death decisions to machines. Furthermore, the introduction of autonomous weapon systems significantly complicates the attribution of responsibility, creating a spectrum of shared agency in which human operators are only partially involved while the machine’s decision-making remains opaque. This risks machines being adopted as scapegoats, obscuring accountability for lethal decisions. It is also unclear what “meaningful” means in practice and whether such control would remain feasible in fast-paced or complex conflict scenarios. Without a clear and enforceable definition, the principle of meaningful human control risks becoming a vague safeguard: its ambiguity allows for varying interpretations, and thus for the deployment of systems with minimal or ineffective human intervention, while providing a false sense of security. In rapid or complex scenarios, this vagueness permits increased autonomy, potentially compromising ethical considerations and violating IHL principles such as proportionality and distinction. Moreover, the lack of a precise definition makes legal enforcement difficult, creating loopholes that allow potentially harmful technologies to be deployed without adequate oversight, ultimately eroding accountability.

Policy Options & Recommendations

In light of the issues associated with the definition of LAWS and the concerns regarding the deployment of such weapons, the following proposals are put forward.

The definition of LAWS lacks clarity and is outdated; we therefore recommend a review of the 2018 version. The EU should not only specify the distinction between AWS and LAWS, but also incorporate the concepts of autonomy and adapting capabilities into the new definition. A compelling example of an alternative definition is put forward by Taddeo and Blanchard (2022, p. 15), who differentiate between AWS and LAWS by regarding the latter as a “specific subset” of the former. Furthermore, the scholars propose a “value-neutral definition” incorporating such features of LAWS as “autonomy, adapting capabilities, and control” (Taddeo & Blanchard, 2022, p. 14). They argue that these characteristics exhibit a gradation, reflecting the flexibility of the definition in light of rapid advancements in AI and robotics (Taddeo & Blanchard, 2022, p. 14). Given that EU legislation repeatedly emphasises the rapidly changing regional and global security landscape, it is imperative that EU institutions compromise and accept a more encompassing definition of LAWS. Adopting such a definition would have a positive impact on the governance of these novel weapons.

Furthermore, EU legislation addressing the ethical implications of LAWS would benefit from real-world cases. Given that, when addressing security concerns, the legislation under scrutiny refers to empirical evidence from the war in Ukraine, we recommend that the same logic be applied to the legal provisions concerning the moral implications of LAWS deployment. One way to achieve this would be to draw upon examples of armed conflicts in which AWS have been employed; the previously mentioned case of the war in Gaza could serve as one such example. This approach would reinforce the argument for prohibiting LAWS deployment.

From a humanitarian perspective, the use of LAWS should not result in indiscriminate effects or harm to civilians and combatants hors de combat. In the AWS Resolution (European Union, 2019), the EU distinguishes between AWS and defensive systems that are excluded from the definition of AWS. However, the question of attack is not raised in the resolution. The EU could therefore introduce strict criteria detailing the circumstances in which LAWS may be used in armed conflict. The main criterion should be that LAWS may only be deployed in areas with no civilian presence, limiting their use to military objectives, which would mitigate one of the main concerns about autonomous systems, namely that they will target civilians. EU legislation on LAWS should ensure that they are used in compliance with IHL, particularly in relation to the protection of non-combatants, medical personnel, and humanitarian workers.

A crucial concern discussed above is the question of accountability. As already emphasised, the AWS Resolution, while mentioning accountability as an important issue, does not provide specific regulations on the matter. To guarantee accountability for LAWS deployment, the new EU regulation should include provisions that specifically address who bears responsibility in case of a violation. In particular, the EU should require that all LAWS deployed in EU member states have an accountability mechanism that documents every choice the system makes in a manner that allows for future review. The ICRC (2021) has proposed several measures that could be implemented in a new EU regulation, for example the logging of target selection data, the justifications for particular actions, and communications between the system and its human operator. This would guarantee that humans remain accountable for the actions of autonomous systems.

To comply with IHL and ensure meaningful human oversight, the EU should introduce regulations that require human control over LAWS at all stages of use, from activation to operation. This is important because the implementation of IHL rules, including distinction, proportionality, and precautions, requires context-specific assessments that can be made only by humans, who are able to monitor changing circumstances during an attack and react accordingly (ICRC, 2019). This could be addressed in a new clause of the updated AWS Resolution mandating that any LAWS used by member states be under human supervision at all times during operation.

As one example of how this could be implemented, the ICRC (2021) emphasises that even the most advanced autonomous systems should not be allowed to operate in fully independent modes, that is, without any human intervention in targeting decisions. To ensure that this policy is implemented in practice, the updated regulations could require LAWS to have, as proposed by the ICRC (2021), a “kill switch” feature enabling human operators to instantly override or disable the weapon at any moment. This would ensure that such weapons fully comply with the IHL principles of humanity and proportionality, and would protect against the abuse or unforeseen consequences of automated decision-making during a conflict.

Conclusion

The rapid advancement and deployment of lethal autonomous weapon systems (LAWS) require urgent legislative reform within the European Union. While the EU has expressed support for the Secretary-General’s call to produce a legally binding document regulating LAWS, it is imperative that the EU adapt its own laws to reflect the realities of modern warfare. Recent examples, such as the increasing use of drones and AI-supported systems for target identification and strike operations in Ukraine and Gaza, highlight the pressing need for clarity, accountability, and robust human oversight. The proliferation of AI-driven targeting tools, and the potential for their misuse by state and non-state actors, creates significant ethical and security risks. Without updated legislation, and in particular a renewal of the existing definition of LAWS, these technologies threaten to outpace existing legal and ethical standards.

In an era of rising geopolitical tensions and blurred lines between combatants and civilians, upholding international humanitarian law is a moral and legal necessity. The challenges posed by algorithmic bias and the attribution of responsibility in autonomous systems are significant. New legislation should be flexible enough to both permit and prohibit such novel weaponry, depending on the level of human control over AWS. Reforming the EU’s AWS Resolution and establishing clear legal definitions are crucial to safeguarding human dignity and minimising civilian harm. As the international community grapples with the governance of autonomous weapons, the EU has the opportunity to lead by example.

Bibliography

BMEIA. (2024). Killer robots on the battlefield: Vienna Conference on Autonomous Weapons Systems from 29 to 30 April 2024. https://www.bmeia.gv.at/en/ministerium/presse/aktuelles/2024/04/killer-robots-on-the-battlefield?

Council of the EU. (2019). Humanitarian assistance and International humanitarian Law: Council adopts conclusions. Press Release. https://www.consilium.europa.eu/en/press/press-releases/2019/11/25/humanitarian-assistance-and-international-humanitarian-law-council-adopts-conclusions/

Crawford, E., & Pert, A. (2020). Types of Armed Conflicts. In International Humanitarian Law (pp. 54–95). Cambridge: Cambridge University Press.

Docherty, B. L., Human Rights Watch, & Harvard Law School International Human Rights Clinic. (2012). Losing Humanity. The Case against Killer Robots. Human Rights Watch. https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots

European Commission. (2024). Commission staff working document: Addendum to the proposal for a Regulation of the European Parliament and of the Council establishing the European Defence Industry Programme and a framework of measures to ensure the timely availability and supply of defence products (‘EDIP’) COM(2024(150). SWD(2024) 515 final. https://eur-lex.europa.eu/legal-content/NL/ALL/?uri=SWD:2024:515:FIN

European Union. (2019). European Parliament resolution of 12 September 2018 on autonomous weapon systems (2018/2752(RSP)). Official Journal of the European Union, C 433, 86-88.

European Union. (2023a). Annual Progress Report on the Implementation of the European Union Strategy against the Proliferation of Weapons of Mass Destruction (2022). Official Journal of the European Union, C, 1-80.

European Union. (2023b). Regulation (EU) 2023/1525 of the European Parliament and of the Council of 20 July 2023 on supporting ammunition production (ASAP). Official Journal of the European Union, L 185, 7-25.

European Union. (2023c). Regulation (EU) 2023/2418 of the European Parliament and of the Council of 18 October 2023 on establishing an instrument for the reinforcement of the European defence industry through common procurement (EDIRPA). Official Journal of the European Union, L, 1-16.

European Union. (2024). European Parliament resolution of 28 February 2024 on human rights and democracy in the world and the European Union’s policy on the matter – annual report 2023 (2023/2118(INI)). Official Journal of the European Union, C, 1-24.

Future of Life Institute. (2017, August 20). An Open Letter to the United Nations Convention on Certain Conventional Weapons. Future of Life Institute. https://futureoflife.org/open-letter/autonomous-weapons-open-letter-2017/

Heyns, C. (2013). Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions. Human Rights Council, A/HRC/23/47. 

International Committee of the Red Cross (ICRC). (2019, March 26). Statement of the International Committee of the Red Cross (ICRC): Agenda item 5(a) – An exploration of the potential challenges posed by emerging technologies in the area of lethal autonomous weapon systems to international humanitarian law. Group of Governmental Experts on Lethal Autonomous Weapons Systems, Convention on Certain Conventional Weapons, Geneva. https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-_Group_of_Governmental_Experts_(2019)/CCW%2BGGE%2BLAWS%2BICRC%2Bstatement%2Bagenda%2Bitem%2B5a%2B26%2B03%2B2019.pdf

International Committee of the Red Cross (ICRC). (2021, August 3). Statement of the International Committee of the Red Cross delivered at the Convention on Certain Conventional Weapons (CCW) before the Group of Governmental Experts on Lethal Autonomous Weapons Systems. International Committee of the Red Cross. https://www.icrc.org/en/document/autonomous-weapons-icrc-recommends-new-rules

Risen, J. (2002, November 8). Threats and Responses: Drone Attack; An American Was among 6 Killed by U.S., Yemenis Say. The New York Times. https://www.nytimes.com/2002/11/08/world/threats-responses-drone-attack-american-was-among-6-killed-us-yemenis-say.html

Scharre, P. (2018). Army of None: Autonomous Weapons and the Future of War. W. W. Norton.

Serhan, Y. (2024, December 18). How Israel Uses AI in Gaza—And What It Might Mean for the Future of Warfare. Time. https://time.com/7202584/gaza-ukraine-ai-warfare/

Taddeo, M., & Blanchard, A. (2022). A Comparative Analysis of the Definitions of Autonomous Weapons Systems. Science and Engineering Ethics, 28(5), 37. https://doi.org/10.1007/s11948-022-00392-3

United Nations Executive Office of the Secretary-General. (2023). A New Agenda for Peace (Policy Brief 9; p. 38). https://www.un.org/sites/un2.un.org/files/our-common-agenda-policy-brief-new-agenda-for-peace-en.pdf

United Nations. (2021, March 8). Letter dated 8 March 2021 from the Panel of Experts on Libya Established pursuant to Resolution 1973 (2011) addressed to the President of the Security Council [Letter]. United Nations. https://digitallibrary.un.org/record/3905159

United Nations. (2024). Lethal autonomous weapons systems. Report of the Secretary-General. A/79/88. United Nations. https://docs-library.unoda.org/General_Assembly_First_Committee_-Seventy-Ninth_session_(2024)/A-79-88-LAWS.pdf

Williams, J. (2011). Borderless Battlefield: The CIA, the U.S. Military, and Drones. International Journal of Intelligence Ethics, 2(1), 2–34.
