Wednesday, January 16, 2013

Comment Paper #4


Some of the points raised in Arkin’s article made me rethink my position on drone warfare. For most of this class, I viewed drone warfare as a frightening possibility for the future of war. However, Arkin piqued my interest when he pointed out that the direct involvement of humans in warfare is itself rife with atrocities. In focusing on the dangers and negative aspects of drone warfare, I had been failing to criticize manned warfare, which in turn may have placed it on a pedestal of sorts. Arkin’s reading reminded me that though drones may lack heart, they also lack destructive human emotions like greed and the desire for revenge. Yet, after continuing to analyze the idea of removing humans further from warfare, I began to see the downfalls of autonomous drone warfare.

For a brief moment, the thought of being able to override selfish human behavior, and the atrocities that result from it in war, was appealing. When it comes down to it, however, these are still new technologies whose risks to noncombatants we cannot accurately predict. These machines have just as many vulnerabilities as humans do: the possibility of being hacked or of glitching, to name just two. As of now, it appears that numerous studies comparing drone warfare and manned combat would be needed to determine whether further removing humans would be an effective measure.

The idea that warfare controlled by autonomous drones would remove war further from human atrocities might be a pleasant thought, but it is far from plausible. In addition to all of the downfalls and limitations that drones already face, it is hard to imagine a time when humans would be unable to sneak their selfish agendas into the programming of robots. Even in a dystopic future where autonomous systems begin wars without any human prompt, some human programmed them with the triggers or protocols for doing so, and those protocols can be flawed by human emotion. There may be a few reasons to favor autonomous drone warfare over manned warfare, but the claim that it is significantly less corrupt than manned combat appears to fall flat, and its vulnerabilities make it far too great a risk.

1 comment:

  1. It may seem like science fiction, but I think that autonomous killing machines are even more dangerous than manned machines. Programmed machines are built and programmed by humans and are therefore going to reflect human behavior. This is compounded by giving these machines autonomy, with potentially no way to stop them if they go on a programmed, selfish rampage. Humans at least have the ability to be reasonable, whereas a machine will simply act without feelings, even if it acts in its own best interest.
