Inteligencia artificial militar: problemas de responsabilidad penal derivados del uso de sistemas autónomos de armas letales
Translated title: Military artificial intelligence: criminal liability issues arising from the use of lethal autonomous weapon systems

      research-article


          Abstract

Abstract: Technological advances have given rise to an exacerbated risk society. Currently, artificial intelligence (AI) raises concern because of its potential risks and malicious use in the military sphere, which has driven the development of lethal autonomous weapon systems (LAWS): weapons that can operate and attack without human intervention. This creates the problem of attributing criminal liability for acts carried out by these autonomous weapons. Divergent positions on this problem have been identified; nevertheless, it is concluded that criminal law can and should address the liability problems posed by LAWS, because it can be adapted to identify the individuals who exercise meaningful control over these technologies. This makes it possible to uphold justice and accountability without distorting the guiding principles of criminal law, which remain centered on the human actions of those who design, manufacture, and operate these technologies. Moreover, criminal law can evolve to confront new risks without losing its essence of protecting legal interests and human dignity in modern armed conflicts.

Most cited references: 50

          The responsibility gap: Ascribing responsibility for the actions of learning automata

            Four Responsibility Gaps with Artificial Intelligence: Why they Matter and How to Address them

The notion of a “responsibility gap” with artificial intelligence (AI) was originally introduced in the philosophical debate to indicate the concern that “learning automata” may make it more difficult or impossible to attribute moral culpability to persons for untoward events. Building on literature in moral and legal philosophy and the ethics of technology, the paper proposes a broader and more comprehensive analysis of the responsibility gap. The responsibility gap, it is argued, is not one problem but a set of at least four interconnected problems (gaps in culpability, moral and public accountability, and active responsibility) caused by different sources, some technical, others organisational, legal, ethical, and societal. Responsibility gaps may also arise with non-learning systems. The paper clarifies which aspects of AI may cause which gap in which form of responsibility, and why each of these gaps matters. It offers a critical review of partial and unsatisfactory attempts to address the responsibility gap: those which present it as a new and intractable problem (“fatalism”), those which dismiss it as a false problem (“deflationism”), and those which reduce it to only one of its dimensions or sources and/or present it as a problem that can be solved simply by introducing new technical and/or legal tools (“solutionism”). The paper also outlines a more comprehensive approach to addressing the responsibility gaps with AI in their entirety, based on the idea of designing socio-technical systems for “meaningful human control”, that is, systems aligned with the relevant human reasons and capacities.

              On the moral responsibility of military robots

                Author and article information

                Journal
                Revista de derecho (Coquimbo)
                RDUCN
Universidad Católica del Norte (Coquimbo, Chile)
ISSN: 0718-9753
                2024
Volume: 31
Article number: 00202
                Affiliations
[1] Universidad Autónoma de Chile, Temuco, Araucanía, Chile. marcos.aravena1@cloud.uautonoma.cl
                Author information
                https://orcid.org/0009-0006-0262-7618
                Article
                S0718-97532024000100202 S0718-9753(24)03100000202
                10.22199/issn.0718-9753-6632
                bfe13704-5902-451c-b666-57a6c47f3f97

                This work is licensed under a Creative Commons Attribution 4.0 International License.

                History
Received: 17 July 2024
Accepted: 11 October 2024
                Page count
                Figures: 0, Tables: 0, Equations: 0, References: 52, Pages: 0
                Product

                SciELO Chile

                Categories
                ESTUDIOS

armas autónomas, responsabilización, tecnología bélica, accountability, military technology, warfare technology, autonomous weapons
