Wednesday, January 16, 2013

Comment Paper #4

Both the Arkin and Sharkey articles offer differing viewpoints on autonomous robots, specifically drones and robotic war fighters. Although each brings up some valid points, I think they each either ignore or deny the fact that war can change; it isn't as simple as people killing other people and general destruction. The nature and tactics of war are rapidly changing as a result of both new enemies and new technology.

First and foremost, Arkin's plan for completely autonomous robots, whether they are used to target other robots or humans, ignores the possible repercussions of using such a technology. The first, which is touched upon, is the availability of the same technology to other countries. War has often been a battle over who can spend the most resources, but if two nations with comparable forces and similar technology meet, using autonomous robots could perhaps end in a draw. This also worries me because war may become a less politically risky action: there will be less bloodshed and therefore less public backlash. Second, Arkin doesn't see how insurgents have already adapted to our new warfare and use of drones. Autonomous robots will be very basic at first, yet warfare has already moved away from isolated battles and into urban warfare and scattered skirmishes. As it stands, there is no algorithm for deciding who is guilty and who can be shot, so how do we intend to program this into a computer?

Sharkey's side is equally thought-provoking, but I see similar flaws. I agree with much of what he says, from the robot's lack of intuition to the need for human participation in decision making. He does, however, make one point which I don't find particularly strong: the issue of responsibility and the ability to hold a person accountable for their actions. He argues that you cannot try to punish a robot; it simply does as it is programmed, so if it makes a mistake, how are we to rectify this? Well, first, these robots are programmed, so look to the programmers as well as those who test and authorize their use in battle. The testing and operational standards must be rigorous and allow for no error; if an error occurs, it is because of negligence on the part of the manufacturer, the programmer, or military testing. Second, do we really even try to punish those who violate the laws of war right now? Yes, we have punished genocidal lunatics after years of hunting them, but when was the last time you heard about US forces' atrocities? Abu Ghraib is the last I can think of, and even that revealed the lack of care taken to protect the principles of war and the treatment of noncombatants and POWs.

So while neither has overwhelmingly convinced me, I do think that Arkin is far too optimistic and naive about the possible effects these systems might have on war, and that Sharkey ignores both the fact that atrocities happen now and the fact that there is always a human responsible.

5 comments:

  1. While we could try to blame the programmers, it may be hard to 'prove' that they did anything with malice or recklessness. It could simply be a blind spot that we haven't found yet. In fact, those 'blind spots' are probably inevitable no matter how good we get at this.

  2. I agree with Professor Shirk that it is difficult to hold the programmers fully accountable for malfunctions that might occur in their machines. Though there is a certain level of negligence you can place on the programmers, accidents do happen. Even some of the best machines out there have lemons, and some mistakes should be expected while scientists continue to develop this technology. However, for this very reason, I think we should test the use of drones in warfare in very small amounts at first in order to reduce risk. That is why I believe the current rampant use of drones, and the extreme speed with which it is increasing in warfare, is cause for concern.

    Replies
    1. I definitely agree that there will always be accidents; however, I think they will be few and far between if there is an established and rigorous testing regimen for these machines. If there is no one to blame, then it is back to the drawing board, but oftentimes the government does in fact rush new technology to the field without giving a second thought to such matters. This is a decision that will result in the taking of human life and should be treated with extreme caution and certainty.

    2. Although I do not hold the programmers completely responsible, they should feel a sense of accountability. When a plane or a car fails, the engineers and the manufacturer are held accountable because they created the failure; the same should apply to drones. Even though drones have a level of autonomy, they are still machines that must be fail-proof if we are going to give them the capability to kill other humans. This is not something that should be taken lightly, and accountability is really important in perfecting a product.

  3. I think the whole keeping humans "in the loop" argument is very similar to the argument that "guns don't kill people, people kill people." Drones just enhance humans' ability to destroy. Drones are useless without a human to pull the trigger. The whole discussion about drones really should center on the condition of humanity in modern times.
