What do you think of his assertions concerning the future of autonomous operations of armed UAS? Do you agree or disagree, why? What safeguards do you think should be put in place to guard against the future he predicts? In what commercial applications could a completely autonomous UAS be used?
When I started this program I already had an understanding of what unmanned aviation was due to my background. In 2003, while sitting in an operations center in Tikrit, Iraq, I received a call from higher headquarters informing me that the Predator was ours for a period of three hours. At that very moment we received ROVER video, and the system was commanded to follow a white vehicle that had just attacked an operating base with mortars. The mortar base plates were in the bed of the truck and were positively identified. Within minutes that vehicle was destroyed by 30mm fire from an Apache. At that moment I had a feeling the face of war would be forever changed. This is the beginning of Skynet; life imitating art appears to be the case.
Since the late 1990s I've watched the RQ-1 evolve from an ISR platform into the MQ-1, a platform capable of launching Hellfire missiles. These systems are remotely piloted vehicles (RPVs), but the natural progression is toward autonomy. At this point they are not capable of making the decision to fire weapons; that doesn't mean they can't be upgraded to do so. As stated in the video, the RPV is the precursor to autonomous robotic weapons (Suarez, 2013). I do think he is correct. This is a very dangerous proposition, and one that is easily attainable at this point. Take a look at the DARPA challenges: vehicles built by academia with government funding can negotiate obstacles using LiDAR, high-powered CPUs to process algorithms, positioning sensors, and more. If they can do that, they can process information through self-learning algorithms to make the call to fire a weapon system. Do I think there will eventually be international law prohibiting the use of such technology? Ultimately, yes. But we are not a proactive group by nature, and it will take some sort of nasty accident to spur that action. We are so focused on this technology and the data it produces that we cannot keep up with it, period. As the technology advances, the data will continue to grow, which means we'll need further automation, technology, and software to process everything these unmanned systems produce. Case in point: ARGUS.
Not only is the RPV a stepping stone to completely autonomous systems, but so is the data. In 2004, UAS collected a total of 71 hours of video; in 2011, 300,000 hours (Suarez, 2013). This isn't likely to slow with systems like Gorgon Stare and ARGUS coming online (NOVA, 2012). These sensors produce data the likes of which we've never seen before. It is a known fact that we don't have the manpower to process it all (Erwin, 2012). We will not be able to keep up and will need to employ software to do the work for us (Suarez, 2013). That data production will drive further change, especially as these systems become more prolific. I don't want to sound pessimistic or negative, but I'm not sure what safeguards, if any, can be put in place at this point to stop this snowball. We have humans in the loop to make crucial decisions, such as producing or receiving a 9-line. Department of Defense Directive 3000.09 states that "[a]utonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force" (Department of Defense, 2012). But this is a directive, and directives can change; something more binding should be put in place internationally. I'm just not sure we aren't already past that point.
Commercial applications for UAS such as precision agriculture, search and rescue, and mapping hold great possibilities. Where it will become an issue is law enforcement and government use. Questions of privacy, and of automated decision-making without a human in the loop, will be much debated in the near future.
References
Department of Defense. (2012). Autonomy in weapon systems (DoD Directive 3000.09). Retrieved from www.dtic.mil/whs/directives/corres/pdf/300009p.pdf
Erwin, S. (2012). Too much information, not enough intelligence. Retrieved from http://www.nationaldefensemagazine.org/archive/2012/May/Pages/TooMuchInformation,NotEnoughIntelligence.aspx
NOVA. (2012). Rise of the drones. Retrieved from https://www.youtube.com/watch?v=QFUUxcuyDN0
Suarez, D. (2013). The kill decision shouldn’t belong to a robot. Retrieved from https://www.youtube.com/watch?v=pMYYx_im5QI