
Laws on LAWS: Regulating Lethal Autonomous Weapon Systems

By Sitara Noor

Abstract

The development of lethal autonomous weapon systems (LAWS) has sparked a wide-ranging debate on their use and regulation. This article delves into the contrasting perspectives of optimism, pessimism, and realism surrounding LAWS. Optimists view LAWS as a revolution in military affairs, while pessimists raise concerns about their destabilizing effects and ethical implications. Realists advocate for regulated development and better understanding. This article examines the ongoing debate and the need for a common understanding of LAWS and proposes options for regulation. It explores the technical and legal definitions of LAWS, analyzes arguments for and against regulation, and discusses potential pathways for addressing the challenges these systems pose. The article emphasizes the importance of international cooperation and highlights the risks of unregulated adoption of LAWS. By finding common ground and strengthening norms against unrestrained use, it aims to contribute to the development of laws and regulations that promote global peace and security.

***

 

There is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things.

Niccolò Machiavelli

The Prince

 

For more than a decade, there has been growing concern over the development of lethal autonomous weapon systems (LAWS), weapons capable of selecting and engaging targets without human intervention, also known as killer robots. The invention of LAWS is seen as the “third revolution in warfare,” with their development deemed as significant in impact as gunpowder and nuclear weapons.1 Various states, international nongovernmental organizations (NGOs), and civil society actors, including artificial intelligence (AI) experts, have worked independently and jointly at different levels to halt and regulate the use of these lethal machines.

However, the debate surrounding this new development is fragmented and marked by divergent viewpoints, resulting in a lack of significant regulatory progress. The failure to reach consensus can be attributed, in part, to the inherently secretive and opaque nature of these weapon systems, which hinders transparency among the states developing them.

This article aims to analyze the ongoing debate regarding LAWS and the pressing need to regulate these swiftly advancing weapon systems. To achieve this goal, the article will first address the issue of defining LAWS and examine the primary arguments both in favor of and against controlling and regulating these systems. Finally, an attempt will be made to identify a viable way forward and explore potential pathways that developing and possessing states may pursue.

What Are Lethal Autonomous Weapon Systems?

Autonomy in weapon systems is not an entirely new concept: various weapons, such as land and naval mines, have long incorporated a degree of autonomy in their operation. Naval mines, for instance, have been utilized in warfare since the 16th century, with the earliest known use of floating explosive torpedoes dating back to 1777 during the American Revolutionary War.2 Presently, several countries employ weapon systems with varying levels of autonomy, including the US Phalanx Close-In Weapon System (CIWS), the Israeli HARPY loitering munition and Iron Dome, the Russian Arena, and the German AMAP Active Defence System (ADS).3

While autonomy in weapon systems is a prevalent concept, confusion persists regarding the technical and legal definition of LAWS, primarily because no universally accepted definition exists, only a handful of explanations offered by different organizations to elucidate their stances. The International Committee of the Red Cross (ICRC) defines LAWS as “any weapon system with autonomy in its critical functions—that is, a weapon system that can select (i.e. search for or detect, identify, track, select) and attack (i.e. use force against, neutralize, damage or destroy) targets without human intervention.”4 The 2022 US defense policy on LAWS defines them as “a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system.”5 An earlier definition outlined in US Department of Defense Directive (DODD) 3000.09 describes autonomous weapons as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.”6 DODD 3000.09 further distinguishes among fully autonomous, autonomous, and semi-autonomous weapons. Fully autonomous weapons operate without human involvement; autonomous weapons require human oversight, allowing the operator to monitor and halt the system; and semi-autonomous weapons possess “fire-and-forget” capability, selecting targets on their own while requiring a human command to attack.7

These definitions provide insight into the modern manifestations of LAWS, which have moved from the realm of science fiction to tangible reality. The Aegis Combat System, an early semi-autonomous weapon system, infamously shot down Iran Air Flight 655 in July 1988 after mistaking it for an Iranian F-14 fighter. Although initially attributed to human error in misinterpreting the machine’s signals, the incident remains relevant today, underscoring the potential consequences of machine-driven, rather than human, decision making.

Similarly, military drones have long been employed for remote surveillance and strike operations. Contemporary drones, armed with explosives and equipped with target-identification technology, can detect and engage targets without relying on human controllers. Recent reports indicate that the first documented use of an autonomous drone occurred in Libya in March 2020. According to a UN report documenting the incident, the Turkish-made Kargu-2 drone autonomously hunted down members of the Libyan National Army.8 These weapons were programmed with a “fire, forget, and find” capability, functioning independently without requiring continuous data connectivity between the operator and the munition. While the specifics of the drones’ fully autonomous employment in this operation remain unclear, the Kargu-2 demonstrates the potential for fully autonomous operation through machine learning. Such drones have proven to be force multipliers and game changers, positioning themselves as potential weapons of choice in future operations.

During the 2022 Russian invasion of Ukraine, a notable LAWS incident came to light. Open-source analysts reported the presence of the Russian KUB-BLA loitering munition, produced by the Kalashnikov subsidiary ZALA Aero Group, in the Podil neighborhood of Kyiv in March 2022.9 Although there is no confirmed evidence that the drone was deployed in its fully autonomous mode, capable of selecting and killing targets on its own, it is evident that this technology has made its way onto the battlefield and is here to stay.

Three Approaches to Lethal Autonomous Weapon Systems

The debate surrounding the usability and legality of these weapons has garnered significant attention from the public, industry, the defense community, and governments alike. Several formal forums are devoted to studying the subject in depth. The ICRC has hosted two expert meetings, while the United Nations, under the Convention on Certain Conventional Weapons (CCW), has organized three informal expert meetings to assess the technological, military, ethical, and legal aspects of LAWS. These informal meetings led to the establishment of a formal Group of Governmental Experts (GGE) in 2017. The GGE’s primary objective is to explore ways and means of regulating LAWS in response to growing concerns about their proliferation. Since its inception, the GGE has convened numerous meetings to address these issues and enhance clarity on the topic.

The key challenge in discussing the regulation of LAWS lies in the absence of a shared understanding of concepts and definitions. There exists a notable disparity in comprehending their military applications, technical functionalities, and subsequent legal implications. As a result, current positions on LAWS can be categorized into three approaches: the pessimistic, the optimistic, and the realist viewpoints.

The Pessimists

Pessimists categorize LAWS as weapons of mass destruction (WMD) and perceive their development and potential use in warfare as highly destabilizing and contrary to legal, ethical, and moral values. In 2019, the UN Secretary-General conveyed a message to the GGE in Geneva stating that “machines with the power and discretion to take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law.”10 In a 2013 report, the UN Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions recommended that states “establish national moratoria on aspects of Lethal Autonomous Robotics (LARs)” and called for the establishment of a high-level panel on LARs to articulate a policy for the international community on the issue.11

Furthermore, approximately 30 countries, including Pakistan, Argentina, Austria, Brazil, Morocco, New Zealand, and around 165 global organizations, are spearheading the debate and demanding a preemptive ban on LAWS due to operational risks, legal non-compliance, and accountability issues.12 Opponents, including a significant number of AI scientists, assert that the deployment of LAWS will severely diminish international, national, and personal security. They draw parallels to civil society’s momentum that led to the establishment of a global norm through the Biological Weapons Convention, envisioning a similar impact on LAWS.13

Pessimists argue that if a state need only risk machines rather than human lives, the threshold for engaging in war would be significantly lowered, violating the principle of jus ad bellum. They also highlight legal concerns, contending that existing accountability mechanisms are inadequate to address the use of fully autonomous weapons. Fully autonomous weapons would be unable to adhere to crucial norms such as distinction, proportionality, and military necessity, which safeguard civilians during armed conflict. A killer robot does not fit the role of a “natural person” under international law, and a fully autonomous weapon system cannot be held legally responsible for any crimes committed because it lacks intent. Pessimists deem insufficient the argument that the commander in charge of LAWS should be held responsible for potential miscalculations, as this would lead only to civil liability under indirect responsibility rather than direct criminal responsibility.14

Based on these concerns, the pessimistic approach sees the necessity and feasibility of a treaty banning LAWS, following the successful examples set by the prohibition of antipersonnel landmines in 1997 and cluster munitions in 2008.15

The Optimists

The optimists lend their support to the development of autonomous weapons systems, viewing it as a revolution in military affairs. This debate primarily unfolds in the United States, which became the first country to issue an official policy on LAWS.16 Several other nations, including Australia, France, Germany, India, Israel, Russia, South Korea, Spain, Turkey, and the United Kingdom, also oppose a preemptive ban on LAWS.17

According to the optimists, robots may prove more effective than human soldiers in certain situations.18 They argue that autonomous systems may even exhibit more humane behavior on the battlefield, as they are likely to act conservatively and without the psychological pressures that can lead to emotionally driven decisions. Consequently, they believe that autonomous weapons have the potential to reduce the number of noncombatant casualties and minimize collateral damage during warfare.19

In response to the criticisms put forth by pessimists, scholars advocating for the development of LAWS contend that, like any other weapon system, their ethical implications depend on how and under what circumstances they are used.20 They argue that autonomous weapon systems can be governed by an inbuilt ethical code, thereby falling within the purview of the existing Law of Armed Conflict (LOAC). In fact, they suggest that autonomous systems may even refuse or report unethical orders issued by military personnel. Supporters maintain that autonomous systems possess a more objective viewpoint and can potentially prevent human errors in judgment.21 Regarding the ability to discriminate between civilian and military personnel and targets, proponents assert that LAWS can employ mechanisms such as sophistication, restriction, updates, and human involvement to make such distinctions, thus ensuring compliance with international humanitarian law (IHL).22

Regarding accountability and attribution, especially concerning Article 7 of the Responsibility of States for Internationally Wrongful Acts (2001), which addresses “Excess of authority or contravention of instructions,” proponents argue that even if an autonomous system behaves unexpectedly and beyond the scope of its initial deployment, the actions will still be attributed to the state.23

The Realists

The realists position themselves between the optimists and pessimists, considering arguments from both sides. They acknowledge that the ship has sailed regarding LAWS, as numerous states have already invested in research and development. A complete policy reversal and ban on LAWS are deemed unlikely. Instead, the realists advocate for a better understanding of the evolving situation and the implementation of checks on the process.

This approach finds support in China’s stance, which advocates for prohibiting the use, but not the development, of LAWS. China characterizes these systems as indiscriminate, lethal, and unaccountable and argues that they would inherently violate IHL.24 Speaking at the United Nations GGE, China expressed its “desire to negotiate and conclude” a new protocol to the Convention on Certain Conventional Weapons “to ban the use of fully autonomous lethal weapons systems.”25 However, China’s position has faced criticism for its perceived ambiguity, as the country continues to advance its work on artificial intelligence in military domains and maintains its cyber sovereignty.

France and Germany, which primarily align with the optimist category, also support a practical approach. They have proposed issuing a nonbinding political declaration stating that LAWS are subject to international humanitarian law and that states parties “share the conviction that humans should continue to be able to make the final decisions regarding the use of lethal force and should continue to exert sufficient control over lethal weapon systems they use.”26 This reflects a realist perspective. The CCW’s Group of Governmental Experts has adopted 11 guiding principles emphasizing the necessity of human control and the applicability of international humanitarian law to “all weapons systems, including the potential development and use of lethal autonomous weapons systems.”27

The realists adopt a nuanced view, recognizing that determined actors may still develop autonomous weapons despite attempts to stop them, drawing parallels to chemical weapons.28 However, they emphasize the need to control and regulate the research and development of this new weapon category. They share the pessimists’ concern that the technology is vulnerable to falling into the hands of nonstate actors and therefore call for better regulation of the development and proliferation of LAWS. Their proposed solution is to consider the legal requirements for the use of force from the outset of developing autonomous weapon systems, rather than as an afterthought.

Squaring the Circle: The Way Forward

The rapid progression of autonomous weapons from science fiction to reality raises overarching concerns. The current pattern of alleged LAWS use is likely to give way to acknowledged and more organized use in future conflicts. The biggest challenge, however, lies in regulation: the development and use of LAWS are outpacing discussions of a complete ban or the implementation of effective regulations, despite ongoing efforts for more than a decade.29

During the Sixth Review Conference of the CCW, the Group of Governmental Experts discussed various proposals, including a legally binding instrument, a non-legally binding instrument, clarification of states’ commitments to implementing existing obligations under international law (particularly IHL), regulations developed on the basis of IHL, and the option of no further legal measures.30

The array of options on the table reflects the complexity of the issue and the challenge of achieving consensus. While views diverge on whether LAWS are regulated by existing IHL treaties, there is a common understanding that their use must comply with IHL. Doubts remain, however: the optimists, including possessing and developing states, insist that their LAWS will operate within IHL parameters, but the reality may prove different and more problematic.

Another challenge arises from the divisive nature of global politics, which hinders positive outcomes in addressing this evolving challenge. Major powers are engaged in competitive arms buildups, particularly within the US–China and US–Russia strategic rivalries. The nonproliferation regime is also in disarray, with several previously agreed treaties collapsing. Creating an environment conducive to constructive dialogue among states on banning, controlling, or regulating the development of LAWS is therefore difficult.

Despite these challenges, it is crucial to maintain engagement and momentum toward clear end goals in order to sustain the majority perspective of the pessimists driving the debate on banning LAWS. Some states currently favoring the pessimists’ demand may be assessing other options while observing the evolving situation. In the absence of a global norm against LAWS, a security dilemma is likely to emerge, and indecisive states may feel compelled to develop LAWS to ensure their security. For instance, countries like Pakistan, currently advocating for a ban on LAWS, may reconsider their position and embark on LAWS programs if their rival, India, continues to develop LAWS without consequences.

The following is a list of proposed options that can guide future actions in regulating LAWS:

Developing a Common Understanding. The lack of a common understanding of the definition of LAWS and the specific risks associated with their use in warfare poses a significant obstacle. The optimist states must provide greater transparency about their LAWS development programs and operational strategies to foster a shared understanding. They should address specific technical questions, such as the design of the human-machine interface and how to ensure strong human control, particularly in nuclear environments.

Adopting a Bottom-up Approach. In the absence of significant progress at multilateral forums, sustained social activism at the grassroots level can garner support and exert pressure at the political level. Various studies support the notion that transnational advocacy movements have effectively mobilized citizens to pressure their governments on such causes. The ongoing discussions among the Group of Governmental Experts at the CCW highlight the immense challenge of developing consensus among states parties. States adopting the optimist approach are resisting calls for a complete ban on LAWS, indicating that the current negotiation format is unlikely to result in substantive regulations, let alone a complete ban.

Numerous NGOs, including Human Rights Watch, actively lead campaigns against killer robots, albeit with limited scope. Drawing inspiration from the International Campaign to Abolish Nuclear Weapons (ICAN) and the subsequent Treaty on the Prohibition of Nuclear Weapons (TPNW), campaigners against killer robots should also engage in grassroots-level debates among the general population. This approach will help generate bottom-up pressure, similar to the ICAN campaign, leading to a more comprehensive movement against killer robots.

Strengthening the Norm Against LAWS. Despite criticism of the nuclear ban movement for its limited achievements in nuclear disarmament, the normative value of such movements cannot be disregarded. Emulating this approach, campaigns against LAWS can strengthen existing efforts by NGOs and civil society, making it politically difficult for leaders to employ LAWS without effective human control.

Finding Common Ground. LAWS pessimists, particularly civil society activists, call for preemptive bans on autonomous weapons through “upstream” regulation. Optimists argue for “downstream” regulation, anticipating that morality will evolve alongside technology.31 Both sides should move beyond entrenched opposition and seek common ground to build mutual trust. Optimist states may not completely abandon their LAWS programs, especially without a verification mechanism to ensure their adversaries are not secretly developing LAWS. However, optimists must acknowledge the concerns raised by pessimists, as these weapons, beyond a certain level, may unintentionally escalate conflicts and pose a threat to peace.

Conclusion

As new technologies become more widely available and affordable, the capacity to establish rules and control their adoption diminishes. Regarding LAWS, UN Secretary-General António Guterres acknowledges the rapid development of this technology, stating that it is progressing at “warp speed.”32

Amid the definitional ambiguity and subjective assessments of pessimists and optimists on the role of LAWS in future warfare, realists are gaining ground and advancing the debate. The pressure to regulate these emerging technologies for global peace and security is steadily increasing. However, progress thus far has been unsatisfactory, with no clear pathway or unanimity among leading possessor states on the way forward. Operational aspects and the compatibility of IHL with LAWS raise numerous unanswered questions. While LAWS optimists assert that future employment will adhere to IHL limits, there is no clarity on attribution and responsibility in case of violations. Additionally, the potential unpredictability arising from the use of LAWS in war-like situations and how such situations would be handled within IHL parameters remain unaddressed.

The employment of LAWS and other AI/machine learning systems will transform the nature of war. Increased reliance on machine learning and AI-based systems may lead to policy makers’ “AI overconfidence” if they assume that these systems generate accurate data and superior analysis compared to human sources.33 In such scenarios, even when employing semi-autonomous weapons systems with human involvement in decision-making, the ability or willingness to challenge information provided by AI-based systems or verify it through other means may be significantly diminished. This can result in dangerous situations and a higher risk of unintended escalation, particularly in a nuclear environment.

Furthermore, the lack of meaningful international control over the acquisition and adoption of these technologies incentivizes nonstate actors to utilize LAWS for terrorist purposes. Nonstate organizations such as Hezbollah and ISIS have already demonstrated their ability to operate drones in various attacks.34 Without regulations on the dual-use technology necessary for LAWS, more nonstate actors will venture into this field.

Given the gravity of the situation, urgent action is necessary before these technologies become ubiquitous. While existing export control systems are not fail-safe, adding LAWS-specific export controls would complement the current system and introduce hurdles impeding easy access to technologies that could be used to harm human beings.


Sitara Noor

Ms. Noor is a research fellow in the Project on Managing the Atom at Harvard University’s Belfer Center for Science and International Affairs. She previously held the position of senior researcher at the Centre for Aerospace & Security Studies (CASS) in Islamabad, Pakistan, from 2019 to 2022. Prior to that, she served as a research fellow at the Vienna Centre for Disarmament and Non-Proliferation (VCDNP) in Vienna, Austria, from 2016 to 2017. From 2008 to 2015, she worked as an international relations analyst at the Pakistan Nuclear Regulatory Authority.

In addition to her research roles, Ms. Noor has served as an adjunct faculty member at several prestigious institutions, including the National University of Modern Languages, the National University of Sciences and Technology (NUST), Quaid-i-Azam University, the Foreign Services Academy of Pakistan, and the Information Services Academy of Pakistan.

Ms. Noor has also held various visiting fellowships, including the South Asian Voices visiting fellowship at the Stimson Center in Washington, DC, from 2019 to 2020; visiting fellowships at Sandia National Laboratories in 2019 and 2013; and a fellowship at the James Martin Center for Nonproliferation Studies in Monterey, California, in 2013. Since 2012, she has served as the country coordinator for the University of Gothenburg project “Varieties of Democracy.” Her written work on nuclear and security issues has been featured in prominent national and international platforms such as Al Jazeera, The News, The National Interest, The Diplomat, and South Asian Voices, among others.

Disclaimer: The views and opinions expressed or implied in the Journal of Indo-Pacific Affairs are those of the authors and should not be construed as carrying the official sanction of the Department of Defense, Department of the Air Force, Air Education and Training Command, Air University, or other agencies or departments of the US government or their international equivalents.


Notes

1 Billy Perrigo, “A Global Arms Race for Killer Robots Is Transforming the Battlefield,” Time, 9 April 2018, https://time.com/.

2 Trevor English, “How Do Naval Mines Work?,” Interesting Engineering, 5 October 2019, https://interestingengineering.com/.

3 Gulshan Bibi, “Implications of Lethal Autonomous Weapon Systems (LAWS): Options for Pakistan,” Journal of Current Affairs 2, no. 2 (2018): 18–41, https://www.ipripak.org/.

4 International Humanitarian Law and the Challenges of Contemporary Armed Conflicts (Geneva: International Committee of the Red Cross, 31 October 2015), https://www.icrc.org/.

5 “Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems,” In Focus (Washington, DC: Congressional Research Service, 15 May 2023), https://crsreports.congress.gov/.

6 US Department of Defense Directive Number 3000.09: Autonomy in Weapon Systems (Washington, DC: DOD, 25 January 2023), https://www.esd.whs.mil/.

7 “Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems.” For more definitions of various types of LAWS, also see Bonnie Docherty, Human Rights Watch, Losing Humanity: The Case against Killer Robots (Cambridge, MA: Human Rights Watch, November 2012), https://www.hrw.org/.

8 Steven Spittaels et al., Final Report of the Panel of Experts on Libya Established Pursuant to Security Council Resolution 1973 (2011) (New York: UN Security Council, 2021), https://reliefweb.int/.

9 Stijn Mitzer and Jakub Janovsky, “Attack on Europe: Documenting Russian Equipment Losses during the 2022 Russian Invasion of Ukraine,” Oryx, 24 February 2022, https://www.oryxspioenkop.com/.

10 “Machines Capable of Taking Lives without Human Involvement Are Unacceptable, Secretary-General Tells Experts on Autonomous Weapons Systems” (press release, United Nations, 25 March 2019), https://press.un.org/.

11 Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns (Geneva: Human Rights Council, 9 April 2013), https://www.ohchr.org/.

12 Brian Stauffer, “Stopping Killer Robots: Country Positions on Banning Fully Autonomous Weapons and Retaining Human Control,” Human Rights Watch, 10 August 2020, https://www.hrw.org/.

13 Ian Sample, “Ban on Killer Robots Urgently Needed, Say Scientists,” The Guardian, 12 November 2017, https://www.theguardian.com/.

14 Bonnie Docherty, Mind the Gap: The Lack of Accountability for Killer Robots (Cambridge, MA: Human Rights Watch, 9 April 2015), https://www.hrw.org/.

15 Stauffer, “Stopping Killer Robots.”

16 “Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems.”

17 Stauffer, “Stopping Killer Robots.”

18 Paul Scharre, Autonomous Weapons and Operational Risk (Washington, DC: Ethical Autonomy Project, Center for a New American Security, 2016), https://s3.us-east-1.amazonaws.com/.

19 Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots (New York: Routledge, 2009).

20 Christopher Ford, “Autonomous Weapons and International Law,” South Carolina Law Review 69, no. 2 (2017): 413–79, https://web.archive.org/.

21 Amitai Etzioni and Oren Etzioni, “Pros and Cons of Autonomous Weapons Systems,” Military Review (May–June 2017): 72–81, https://www.armyupress.army.mil/.

22 Christopher Ford, “Autonomous Weapons and International Law.”

23 “Responsibility of States for Internationally Wrongful Acts,” UN Doc. A/56/83, at pt. 1, ch. II, art. 4 (2001), https://documents-dds-ny.un.org/; and Ford, “Autonomous Weapons and International Law.”

24 Liu Zhen, “Time to Set Global Rules for AI Warfare, China Tells UN Weapons Review,” South China Morning Post, 14 December 2021, https://www.scmp.com/.

25 Hayley Evans, “Lethal Autonomous Weapons Systems at the First and Second U.N. GGE Meetings,” Lawfare Blog, 9 April 2018, https://www.lawfareblog.com/.

26 Stauffer, “Stopping Killer Robots.”

27 “Meeting of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Annex III: Guiding Principles Affirmed by the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” United Nations, https://documents-dds-ny.un.org/.

28 Ian Sample, “Thousands of Leading AI Researchers Sign Pledge against Killer Robots,” The Guardian, 18 July 2018, https://www.theguardian.com/.

29 Neil Davison, “A Legal Perspective: Autonomous Weapon Systems under International Humanitarian Law” (paper, Convention on Certain Conventional Weapons Meeting of Experts on Lethal Autonomous Weapons Systems, 11 April 2016), https://doi.org/.

30 “Report of the 2022 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” United Nations Office for Disarmament Affairs (UNODA), 29 July 2022, https://documents.unoda.org/.

31 Kenneth Anderson and Matthew C. Waxman, Law and Ethics for Autonomous Weapon Systems: Why a Ban Won’t Work and How the Laws of War Can, Jean Perkins Task Force on National Security and Law Essay Series (Stanford, CA: Hoover Institution Press, 9 April 2013), https://scholarship.law.columbia.edu/.

32 “‘Warp Speed’ Technology Must Be ‘Force for Good,’ UN Chief Tells Web Leaders,” UN News, 5 November 2018, https://news.un.org/.

33 David Minchin Allison, “The Risk of Destabilizing Technologies: Artificial Intelligence and the Threat to Nuclear Deterrence” (working paper, Yale University, Department of Political Science, 25 September 2020).

34 Jürgen Altmann and Frank Sauer, “Autonomous Weapon Systems and Strategic Stability,” Survival 59, no. 5 (2017): 127, https://doi.org/.

 
