International Responsibility for Autonomous Weapons Crimes
Corresponding Author(s): Khaled Ismael
Journal of Law and Emerging Technologies,
Vol. 2 No. 1 (2022)
Abstract
The technological development of artificial intelligence has brought many advantages across the medical, industrial, and administrative fields. At the same time, it has become a threat to the human race and a source of violations of the rules and provisions of international humanitarian law since this modern technology was introduced into the field of military weapons, raising many complexities and risks in armed conflicts.
In practice, developed countries have recently begun to incorporate modern technologies and artificial intelligence algorithms into the manufacture of military weapons, resulting in the emergence of new types of weapons known as autonomous weapons or lethal robots. International concern has grown over the use of such weapons (killer robots), which can kill, destroy, and carry out military operations on their own without any human guidance, and which may be unable to distinguish between civilians and combatants or to comply with the rules of international humanitarian law.
Accordingly, this research examines the extent of state responsibility for crimes committed by these weapons under the rules of public international law and seeks to determine who bears responsibility for such crimes. The study is motivated by the recent emergence of these weapons, by the absence of any international convention or treaty framework regulating whether a state is responsible for their use, and by the desire to ensure legal protection for those affected by armed conflict, both civilians and combatants, from the dangers of these weapons.