This week I read an article called "Autonomous weapons systems, killer robots and human dignity" by Amanda Sharkey. The article examines the argument that autonomous weapon systems (AWS) are an affront to human dignity. Sharkey finds that the dignity argument has merit, but points out that many other types of weapons could be criticized on the same grounds. The article concludes that we should not rely exclusively on the dignity argument when arguing against autonomous weapons systems.
Autonomous weapon systems can select and kill targets by themselves. That alone is terrifying to me: the idea of giving something that is not human the agency to kill at its own discretion. The article brings up an interesting point: machines cannot take accountability for their actions. If one malfunctions and kills a group of civilians, it cannot be punished for it. I believe that accountability is important in war. When a military operation results in accidental deaths, there should be public scrutiny, and the people involved should be held accountable. If an automated weapon killed civilians by accident, the blame could only be pinned on the programmers, or even on the people who ordered the use of the weapons.
Going back to the idea that a machine's decision to kill is an affront to human dignity, another section of this reading stood out to me. It says that "it is an affront to an individual's dignity if the decision to kill them is made by a machine that does not recognise the value of their life". I agree with this statement. Machines are not capable of compassion; they have never lived, and cannot understand the concept of living. For that reason alone, I don't think that machines should be used to kill people autonomously.
Something came up in my class today that relates to this article and that I have been thinking about. I noticed that both in the video we watched in class and in this article, the word "drones" is used when describing autonomous weapons. When people hear the word "drones", there is usually some level of negative connotation; people tend to think of drones as they are described in this article, as dangerous weapons. However, people commonly use the word "drone" to refer to quadcopters as well, and I believe we should move away from that wording when referring to quadcopters. There are many different reasons that people fly quadcopters, and the vast majority do it for fun. There should be more differentiation between the two meanings of the word "drone". The negative perception of people who fly quadcopters stems in part from this conflation. People like me, who enjoy flying quadcopters purely for fun, are now subject to more public scrutiny and increased, unnecessary regulations on flying quadcopters.
Aside from that, the article raises the question: "Are AWS against human dignity in a way that other weapons are not?" I hadn't considered this. At first, the answer seemed simple: they are against human dignity in a way other weapons are not, because autonomous weapons have no compassion or empathy. However, an atomic bomb is incapable of compassion too. The only real difference I can think of is that an atomic bomb cannot decide who to kill and who to spare.