Researcher ORCID Identifier

0000-0002-9472-8101

Graduation Year

2021

Date of Submission

12-2021

Document Type

Campus Only Senior Thesis

Degree Name

Bachelor of Arts

Department

Philosophy

Reader 1

Alex Rajczi

Terms of Use & License Information

Terms of Use for work posted in Scholarship@Claremont.

Rights Information

© 2021 John R Church

Abstract

There is an ongoing debate about whether the use of lethal autonomous weapons systems (LAWS) can be ethical and, if so, under what conditions. This thesis specifically addresses the problem of LAWS accountability. If a LAWS unexpectedly kills a civilian, who is to blame? During the normal operation of a LAWS, who is accountable for the deaths of enemy combatants? I begin by discussing the definition of a LAWS before turning to a brief discussion of machine learning, a technology generally recognized to differ from previous autonomy-enabling technologies in several ways relevant to accountability. I then discuss the possible actors who could be responsible for the events resulting from the use of a LAWS. I divide my arguments into two parts: the first assumes that the LAWS is an agent, and the second assumes that it is not. I find that if a LAWS is not an agent, then accountability with respect to it ought to be handled the same way as for a conventional weapon, whereas if it is an agent, then there is the possibility of a responsibility gap. I conclude by noting that the most important part of the discussion of LAWS accountability ought to be the system's status as an agent, an attribute often glossed over in the present literature.

This thesis is restricted to the Claremont Colleges current faculty, students, and staff.
