US Reaper Drones Test Agile Condor: Another Step Closer to ‘Killer Robots’
By Peter Burt
Global Research, September 30, 2019
Drone Wars UK 27 September 2019
URL of this article:
https://www.globalresearch.ca/us-reaper-drones-test-agile-condor-step-closer-killer-robots/5690533

General Atomics Aeronautical Systems, manufacturer of the Reaper drone, has recently been awarded a US Air Force contract to demonstrate the  ‘Agile Condor’ artificial intelligence system with the MQ-9 Reaper drone.  According to General Atomics President David R. Alexander,

“The Agile Condor project will further enhance RPA [remotely piloted aircraft] effectiveness by specifically allowing a MQ-9 to surveil a large area of operations, autonomously identify pre-defined targets of interest and transmit their locations.”

This type of capability represents a tangible step towards the development of autonomous weaponised drones able to operate without human input – flying killer robots, in other words.  The step from identifying targets without a human decision to destroying those targets is a very small one, and it could be taken with existing technology.

The Agile Condor system is intended to send information back to a human analyst. There’s no intention for it to be used in an autonomous mode, and current US military policy on autonomous weapons is that there should always be ‘appropriate levels of human judgement’ over the use of force.  However, there is no guarantee of such a restrained approach to autonomous weapons in the future, by the US or any other nation (or indeed by a non-state group). The danger is that, as we detail in our report on the development of autonomous military drones (see below), lethal autonomous weapons (LAWS) are likely to develop through step-by-step upgrades in technological capability.

Agile Condor is a high-performance computing system which uses AI techniques to enable on-board processing of large quantities of data from the drone’s sensors – for example video footage, synthetic aperture radar imagery, or infra-red camera imagery.  The system is mounted in a pod which can be fitted to the drone, and has been designed for installation in a variety of configurations including sea-based, ground-based, and fixed-site weapon systems as well as on aircraft.

The demonstration flights which General Atomics will be undertaking will be used to experiment with the Agile Condor system to optimise AI and machine learning techniques for finding, identifying, and tracking targets.  It’s possible that they will include ‘training’ of the AI system to identify potential targets, with human operators confirming whether the computer has made a correct decision in order to refine and improve its performance.
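
As a purely illustrative sketch of what that kind of human-in-the-loop refinement could look like in software, the Python below routes low-confidence detections to an operator for confirmation and collects the results as new training labels.  The detection fields, the confidence threshold, and the operator interface are assumptions made for the example, not details of the actual Agile Condor system.

```python
# Illustrative human-in-the-loop labelling loop; not the Agile Condor software.
# The detection format, classes and threshold below are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    frame_id: int      # which video frame the detection came from
    label: str         # hypothetical target class, e.g. "vehicle" or "person"
    confidence: float  # detector confidence score between 0.0 and 1.0
    bbox: tuple        # (x, y, width, height) of the detection in pixels

def review_detections(detections, ask_operator, review_threshold=0.9):
    """Send uncertain detections to a human operator; keep the confirmed and
    rejected examples so they can be fed back into the next training run."""
    confirmed, rejected = [], []
    for det in detections:
        if det.confidence >= review_threshold or ask_operator(det):
            confirmed.append(det)   # treated as a correct identification
        else:
            rejected.append(det)    # becomes a negative training example
    return confirmed, rejected
```

Each pass through a loop like this grows the labelled dataset, which is how the system’s performance would be ‘refined and improved’ over successive flights.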

In due course the Air Force Research Lab anticipates using the system to enable real-time processing of data during intelligence, surveillance and reconnaissance missions.  A video (below) prepared by SRC Corporation, which manufactures the Agile Condor pod, shows a drone using AI processing to identify an insurgent who is preparing to attack a military convoy.  An alert is sent to a ground commander, and as a result of the signal the convoy is diverted; the insurgent surrenders; and everyone lives happily ever after.

AI technology of this type enables thousands of hours of video footage to be processed autonomously – only targets of potential interest would be flagged up to commanders.  Another benefit of the on-board processing capability is expected to be a dramatic reduction in the satellite bandwidth needed to pass data between the drone and the ground.  With much of the work of handling and analysing data from the drone’s sensors conducted on board the aircraft, far less information would need to be transmitted to the ground station, reducing the costs of satellite capacity.
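
The bandwidth saving is straightforward to illustrate.  The sketch below compares streaming every raw video frame to the ground with downlinking only short detection reports for frames the on-board model flags; the frame and report sizes are invented for the example, and the detect function stands in for whatever model runs in the pod.

```python
# Back-of-the-envelope comparison of raw streaming vs. on-board filtering.
# All sizes are assumed figures for illustration only.
RAW_FRAME_BYTES = 2_000_000   # assumed size of one raw video frame
REPORT_BYTES = 2_000          # assumed size of a detection report (metadata + thumbnail)

def downlink_volume(frames, detect, threshold=0.8):
    """Return (bytes if everything is streamed, bytes if only detections are sent).
    `detect` is assumed to return objects with a .confidence attribute."""
    raw_total = len(frames) * RAW_FRAME_BYTES
    filtered_total = 0
    for frame in frames:
        hits = [d for d in detect(frame) if d.confidence >= threshold]
        filtered_total += len(hits) * REPORT_BYTES
    return raw_total, filtered_total
```

If only a handful of frames in every thousand produce a report, the filtered downlink is orders of magnitude smaller than the raw stream, which is where the reduction in satellite capacity – and cost – comes from.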

Hans Vreeland, a former targeting officer in the US Marines, has written recently about how AI can transform intelligence analysis when used alongside existing hardware and sensor capabilities.  According to Vreeland: “Currently, both information collection and processing are manual, labor-intensive endeavors. AI can relieve human operators of much of that burden, performing the same tasks better and faster.”

Vreeland is enthusiastic about the potential of AI systems such as Agile Condor which can analyse data from sensors and flag up situations of potential concern.  “If we had autonomous drones programmed to search specified areas and identify activity by fusing several sensor inputs”, he writes, “and if we had had the ability to process the information at the edge with Project Maven  [a US military project to use machine learning and artificial intelligence to process drone video footage], it would be difficult to overstate the increase in the amount of activity that we could have collected and analyzed”.

Vreeland is one of a long, long line of writers arguing that technology will make war better. “AI has paradigm-shifting potential to be a force-multiplier, provide better information to commanders, and quicken the operational tempo,” says Vreeland. “In other words, it will provide more, better outcomes faster, a recipe for success in combat”.

Adding new capabilities to an existing platform is not a novel step for the military.  The history of the Predator drone shows that drones have evolved in incremental steps to incorporate new technology and undertake new missions.  The RQ-1 Predator entered service with the US military and was used for unarmed reconnaissance operations over the former Yugoslavia from 1995 onwards.   The role of the Predator first began to extend into combat operations when the drone was fitted with a laser designator, allowing it to illuminate targets for guided missiles fired from conventional aircraft.  In 2001 the drone was modified to fire Hellfire missiles, enabling it to undertake armed strikes on its own.  The MQ-9 Reaper, a larger version of the Predator which entered service in 2007, can now be fitted with a much broader range of weapons than the RQ-1.  General Atomics is continuing to make improvements to the MQ-9 and increase its range of automated features as it remains in service.

As we have shown in our ‘Off The Leash’ study into the development of autonomous drones, killer robots are likely to evolve in a similar manner, through small, step-by-step upgrades in technological capability.  The extent to which increased autonomy might raise concerns will depend upon the level of human control over ‘critical functions’ required to select and attack targets.  Intelligence gathering and analysis is on the borderline of being a critical function: in itself, it is a non-lethal function, but on the other hand it is an essential part of the process of identifying and tracking a target.

The Agile Condor modification to the Reaper crosses a line.  It is an important enabling technology which could allow a decision on lethal strikes to be taken by the drone itself, with no human intervention, and is a significant step towards the development of an autonomous weapon system.  To control and prevent the development of such ‘killer robots’, we need to rapidly develop an international legal instrument to prevent the development, acquisition, deployment, and use of fully autonomous weapons and ensure that humans are always in control of lethal force decisions.

Before it’s too late.
