Paper 4
When debating the ethics of autonomous or non-autonomous lethal weapons, a factor that often goes unexamined is the person controlling or "supervising" these machines and the effects this has on both the user and the product. Journalist Azmat Khan points out in his article on the effects of prolonged drone operation that "the six-month study of nearly 1,500 service members in Nevada and California is the first to quantify stress levels in military drone operators." If this is the first such study, then we still know very little about the effects of remotely controlled weaponry on the people who operate it. While reading these articles, I found the most disturbing aspect to be the desensitization that can occur when carrying out a lethal plan from afar, which is the case for many operators. Author Rachel Martin claims in her survey that only about 30% of drone operators experience "burnout." What is more alarming, however, is that the other 70% of these operators are seemingly unaffected by what they are doing. It may seem brash to say, but it is probably true for many people that it would be morally and ethically more traumatizing to kill someone with your own hands than to sit in a room and press a button.
The human element, which most consider a weakness in combat, is what I believe can be an army's biggest strength. Although our humanity, with emotions such as greed, anger, or sadness, is usually what leads us into war, it is also what tends to push us toward a resolution, because with humanity we also carry sympathy, compassion, and sorrow. When a lethal machine lacks all of those things, it becomes more dangerous than any human army. Even when a person supervises its actions, those emotions often become muted. Thus wars will start, but they will not end. This also raises the question: if we can get this technology, who else can? Among developed countries, wars will continue to escalate as each side ups the ante. The use of lethal robots leaves us with a never-ending battle scenario.
Although robotics leads us to believe that we are saving lives in combat, the "loop" is becoming so large that people will inevitably be sacrificed, as has already happened with drones, and many of those people are not soldiers. Arkin argues that the lack of emotion in robots is an advantage because robots feel no fear or hysteria, and I see his point when assessing robots used for non-lethal purposes. However, without intuition and a conscience, when does anyone or anything really have the ability to comprehend right and wrong? And how does the use of remote-controlled robots affect the psyche of the operator and his or her ability to determine what is ethical and what is not?
Soldiers are trained to feel nothing in battle, and this phenomenon is intensified by autonomous weapons, where the soldier is even further removed from the combat he or she is affecting. I think that as a nation we should not take emotion out of war. Emotions are critical to war: it is emotions that allow one side to reach a compromise and the conflict to end. Take this away, and war goes on forever.