Posted by Thomas Nephew on October 3rd, 2012
(From United States Air Force Unmanned Aircraft Systems Flight Plan 2009-2047, http://tinyurl.com/droneplans.) Note the
planned capabilities of the MQ-Lc, far right: “Modular, Autonomous,” “Strategic Attack,” “Global Strike.” Similar features
are envisioned for the “medium,” fighter-sized version, the MQ-Mc.
What could be better than unmanned aerial vehicles raining death on Pakistan in a ratio of three children to one terrorist leader by remote control? Why, the same thing on autopilot, of course. J. Michael Cole of “The Diplomat” reports:
…although the use of drones substantially increases operational effectiveness — and, in the case of targeted killings, adds to the emotional distance between perpetrator and target — they remain primarily an extension of, and are regulated by, human decisionmaking.
All that could be about to change, with reports that the U.S. military (and presumably others) have been making steady progress developing drones that operate with little, if any, human oversight. For the time being, developers in the U.S. military insist that when it comes to lethal operations, the new generation of drones will remain under human supervision. Nevertheless, unmanned vehicles will no longer be the “dumb” drones in use today; instead, they will have the ability to “reason” and will be far more autonomous, with humans acting more as supervisors than controllers.
(Via digby at “Hullabaloo”). Sure, there are concerns and glitches, as the Washington Post’s Peter Finn notes: “Some experts also worry that hostile states or terrorist organizations could hack robotic systems and redirect them. Malfunctions also are a problem: In South Africa in 2007, a semiautonomous cannon fatally shot nine friendly soldiers.”
But the deeper concern is that a war-fighting process already on institutional and public opinion autopilot would now simply go on a computerized one. Americans think they know what’s going on in Afghanistan, Pakistan, and elsewhere, but they don’t. As the authors of Living Under Drones: Death, Injury, and Trauma to Civilians From US Drone Practices in Pakistan put it,
In the United States, the dominant narrative about the use of drones in Pakistan is of a surgically precise and effective tool that makes the US safer by enabling “targeted killing” of terrorists, with minimal downsides or collateral impacts. This narrative is false.
Elsewhere, authors James Cavallaro and Sarah Knuckey observe:
Overly permissive criteria after the fact, together with serious public accountability and transparency deficits, provide little assurance that each use of lethal force strictly complies with the relevant law. Indeed, in many other contexts, a failure to examine carefully the legality of government use of force after a killing has led to development of a culture of impunity and heightened the risk of unlawful killing.
To be clear, the “relevant law” includes the Geneva Conventions. Christof Heyns, the UN special rapporteur, told a meeting in Geneva on June 21: “Reference should be made to a study earlier this year by the Bureau of Investigative Journalism… If civilian ‘rescuers’ are indeed being intentionally targeted, there is no doubt about the law: those strikes are a war crime.” (emphasis, link added). Yet far from adopting measures that would help at least keep public track of the damage done, mechanisms are being proposed to “help” Congress and the judiciary avoid their constitutional responsibilities altogether — for example, Omar Shariq’s and Benjamin Wittes’s fascination with the U.K.’s “Independent Reviewer of Terrorism Legislation,” a kind of quasi-judicial, “wise old man” approach to punting away responsibility for identifying and killing the right people.
Nevertheless, just as happened with old-fashioned drones, some “sentient drone” enthusiasts say it might be downright immoral not to make the next great technological leap forward. Chief proponent Ronald Arkin, of the Georgia Institute of Technology, told Chronicle of Higher Education’s Don Troop:
“I am not a proponent of lethal autonomous systems,” he says in the weary tone of a man who has heard the accusation before. “I am a proponent of when they arrive into the battle space, which I feel they inevitably will, that they arrive in a controlled and guided manner. Someone has to take responsibility for making sure that these systems … work properly.”
In a way, Mr. Arkin may have a point: right now, no one’s really taking responsibility for that. Given how asleep at the wheel Congress and the American public have been — and seem to want to be — about drones or the unchecked devastation they cause, maybe supplying the drones themselves with little computer consciences is the best the United States of America can hope to accomplish. Our own are in such short supply.
But if you disagree, do two things. First, go see what the International Committee for Robot Arms Control is up to, give them your support, and let your friends and representatives know about these issues. Then, lend your name to the petition Robert Naiman is bringing to Pakistan, showing people of that country that not everyone in the U.S. supports a permanent reign of drone terror in their country:
Richard Olson, U.S. Ambassador to Pakistan:
We urge you to do everything in your power to end U.S. drone strikes in Pakistan; to bring the drone strike policy into compliance with international and U.S. law; to permanently end all “signature strikes” against unknown persons; to permanently end all “secondary strikes,” particularly those that target and endanger civilian rescuers, in grave violation of international law; to address questions about civilian casualties from drone strikes publicly and in detail; and to compensate civilian drone strike victims and their families.