Today, unmanned aerial vehicles (UAVs) make up nearly a third of US military aircraft, and armed forces around the world are upgrading their fleets with autonomous drones and other automated strike systems. Statistics show that most air strikes in the Middle East are carried out by US forces, with Somalia and Pakistan also being hit frequently. The US has carried out about 1300 air strikes since the beginning of the year, yet barely any official casualty figures are available to the public. In October 2015, however, classified Pentagon documents were leaked. These documents indicate that nearly 90% of those killed were not the intended targets. The Intercept, a news website that reports on leaked documents, published these papers from the US intelligence surveillance task force, which point to the premature firing of drones. During Operation Haymaker, which targeted north-eastern Afghanistan, about 155 people were killed by unmanned aircraft. Of these, only 19 were designated ‘jackpots’ (JPs), i.e., the people intended to be killed. The other 136 who lost their lives were marked EKIA (enemy killed in action): see Figure 1.
One might also note the percentage at the far right of the table shown in Figure 1. At first glance, it suggests that 70% of the operations were successful. However, this value is simply the number of JPs divided by the number of operations, and therefore completely ignores the hundreds of civilians killed along the way. When everyone who was hit is taken into account, the ‘success rate’ is actually around 12%.
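The arithmetic behind these two figures can be reproduced from the Operation Haymaker numbers above (19 jackpots, 155 people killed). The operation count used below is not in the text; it is an assumption inferred from the reported 70% figure:

```python
# Figures from the leaked Operation Haymaker documents
jackpots = 19        # intended targets killed ('JPs')
total_killed = 155   # everyone killed by unmanned aircraft
operations = 27      # assumed: back-calculated from the reported ~70% rate

reported_rate = jackpots / operations    # ignores everyone but the JPs
actual_rate = jackpots / total_killed    # counts every death

print(f"Reported 'success rate': {reported_rate:.0%}")   # ~70%
print(f"Rate counting all deaths: {actual_rate:.0%}")    # ~12%
```

The gap between the two numbers is entirely a matter of which denominator one chooses to report.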
This low success rate points to either an underdeveloped technology or reckless drone operators, and both possibilities are deeply troubling ethically. Considering the rather long chain of command, however, the evidence points to the drones themselves.
‘Looking through a soda straw’
One of the gravest technical issues faced by drones is the restricted view available to operators in control centres. Reports from former drone pilots describe a debilitating phenomenon known as ‘blink’. When a pilot follows a target whose feed is handed over from one drone to another, the image captured by the previous drone is held for a few seconds before the display suddenly jumps to the required image. Blink poses a severe restriction on the pilot, particularly during target recognition. The pilot, located on the other side of the planet, has to make a decision based on images that represent small snippets of a wider picture; as a result, these drone images lack context. Former drone operators describe the experience as watching targets ‘through a soda straw’. This extremely limited view is often responsible for the deaths of civilians who were simply not in sight at the time of firing. On one of the leaked slides from the US Air Force, this lack of ‘persistent stare’ is cited as a key factor in failed operations.
Additionally, target recognition is still in its early stages. Most targets are localised by their mobile-phone data, whose geographical location is what the drone actually tracks. This decouples the target from the person, turning the phone into a deadly proxy. The screenshot from the leaked documents (see Figure 2) shows the SIM cards of specific individuals and the drones their geolocations were connected to. Snippets of phone conversations are therefore the only indicator of whether the correct target is under surveillance, and the elimination of this signal (a ‘touchdown’) guarantees only a destroyed mobile phone, not the death of a particular target.
‘Tyranny of distance’
Finally, drones are greatly restricted by geography. During missions in Yemen and Somalia in 2012, drones spent about 50% of their airtime in transit because of the distance between US air bases and the target countries. The main US air base is located in Djibouti, a small East African nation about 500 km from most targets in Yemen; some targets in Somalia were over 1000 km away. As a result, the minimum requirements for target surveillance in East Africa (e.g., minimum target and observation times) have never been met, contrary to what is claimed in the leaked documents. Limited surveillance time results in incautious firing and thus risks a wrong or unsafe target location.
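The transit figures are easy to sanity-check with back-of-the-envelope arithmetic. The cruise speed and endurance below are illustrative assumptions (they do not come from the leaked documents), chosen only to show how quickly distance eats into surveillance time:

```python
# Illustrative transit arithmetic; speed and endurance are assumptions,
# not figures from the leaked documents
cruise_speed_kmh = 150   # assumed cruise speed for a Predator-class UAV
endurance_h = 20         # assumed total flight endurance

for distance_km in (500, 1000):   # Djibouti to Yemen / Somalia targets
    transit_h = 2 * distance_km / cruise_speed_kmh   # round trip
    on_station_h = endurance_h - transit_h
    share = transit_h / endurance_h
    print(f"{distance_km} km: {transit_h:.1f} h in transit "
          f"({share:.0%} of flight), {on_station_h:.1f} h on station")
```

Under these assumptions, a 1000 km target leaves barely a third of the flight for actual surveillance, consistent with the roughly 50% transit share reported across the 2012 missions.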
Out of the loop
Despite the significant technical flaws that have been uncovered, today’s drone research seems two steps ahead of previous efforts. Plans are in place to take humans out of the loop and instead allow the drone itself to make decisions over life and death. This is challenging not only technically, but also in terms of our ethics, laws and politics. However, for Edward Lucas, senior editor at the Economist and an expert in intelligence and cyber-security issues, the fears primarily relate to the possible assassinations or cyber-security threats that unmanned drones may pose. In an interview with UCL ENGins, he says, ‘Our dependence on computers is increasing much faster than our ability to defend those computers and networks’. In other words, it isn’t just hackers that constitute a threat, but also foreign militaries, pranksters and anyone who might want to inflict damage on someone. Lucas adds, however, that the ‘threat towards computers isn’t new’, and that we are used to ‘increasing our critical dependency on those electronic machines’. After all, drones are part of the evolution of weaponry, not a revolution in and of themselves. It is also important to note that autonomous weapons are not an invention of today. Landmines and booby traps have been used throughout the history of warfare, and neither requires someone pulling a trigger. These weapons are designed to discriminate in only the simplest way (e.g., detonating only when exposed to a certain weight).
This highlights both the biggest disadvantage and the only advantage of ‘smart’ weapons. The lack of human emotion or gut feeling reduces a complex question to a simple, analytical answer: a soldier in the field sees death before his or her own eyes; someone looking at numbers does not. And even if the final decision is not made by the drone itself but sent to a human in a control centre far away, the commander will still be deciding on scarce, limited information. A report by the UK Royal Air Force suggests that, in future, decisions over life or death will probably have to be made on information such as in the following example, concerning the targeting of a Colonel John Smith travelling to a location by vehicle:
- Probability of one human rider: 95%
- Probability of body-match to Colonel John Smith: 75%
- Probability of voice-match to Colonel John Smith: 90%
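To see how little certainty such a list actually conveys, one can naively multiply the figures together as if they were independent probabilities. The independence assumption is ours for illustration, not a method from the RAF report:

```python
# Illustrative only: treat the report's example figures as independent
# probabilities (an assumption, not the RAF report's method)
p_one_rider = 0.95     # probability of one human rider
p_body_match = 0.75    # probability of body-match to the target
p_voice_match = 0.90   # probability of voice-match to the target

combined = p_one_rider * p_body_match * p_voice_match
print(f"Naive combined confidence: {combined:.1%}")   # ~64.1%
```

Even before questioning how the individual percentages were produced, the joint confidence that the right person is in the vehicle is well below any of the headline figures.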
Lucas also fears ‘intelligence overreach’. He says, ‘We need to distinguish between what we do for intelligence or military purposes’. Lucas feels there is no great difference between attacking a group of terrorists in a war zone with snipers or with drones. In his eyes, generals are trained in war ethics and have actually held a gun in their hands, whereas a CIA agent pressing a button on a screen most probably has not. Transferring drones from the military operational zone to intelligence actions, such as targeted assassinations around the world, would for this reason significantly lower the inhibition threshold for killing. This could push drones into more regular use, which is unlikely to lead to a sustainable future.
The leaked presentation files indicate that the US is trying to eliminate blink and obtain a persistent stare by adding even more UAVs to areas of operation. Because of the political restrictions that prevent them from developing air bases in East Africa, US forces are also aiming to launch more drones from ships at sea. In future, this will mean busier skies for north-eastern Africa and the Middle East, and faster processing times for target eliminations.
Given the major technical flaws these documents report, the issue of drones should be addressed in a broader, non-technical context. A sustainable solution can only be found when people stop treating blind killing machines as a way to solve crises around the world.
- S. Ackerman and N. Shachtman, ‘Almost 1 in 3 US warplanes is a robot’, Wired, 2012.
- C. Currier, ‘The kill chain’, The Intercept, 2015.
- ‘Small footprint operations’, The Intercept, 2015.
- A. Myers, ‘The legal and moral challenges facing the 21st century air commander’, Air Power Review 10, 2007.
- ‘Operation Haymaker’, The Intercept, 2015.
- ‘Geolocation watchlist’, The Intercept, 2015.