Every morning, the hangar doors roll open and the sunlight flares my electro-optical sensors. I drag myself onto the flight line, load up my pylons with Hellfire and Griffin missiles, and try to get some coffee into my tank before takeoff. If all goes well, I lumber into the air, loiter over some godforsaken warzone du jour, and occasionally lob weaponry at those I’m told are the enemies of the free world. By broad consensus, I’m pretty good at my job — and when I’m not soaring above the mountains of Afghanistan or Yemen, I even find time for hobbies, like posting on Twitter. But after I return to base, I self-medicate with extreme prejudice. Because I’m a Predator drone, and you people make me drink.
Beyond the imaginative premise, the vehicle, if you will, of a drunken drone provides what is, to my mind, a reasonable response to the important question posed by regular commenter Michael Brady in response to my last drone post:
Shall we have any moral, philosophical, or legal concerns for the collateral maiming and killing of untargeted persons or innocent bystanders who happen to be in the house or the vehicle when the Hellfire arrives?
The drone, after finishing its martini, answers:
But I’ll simply say this: Blaming a new weapon for the consequences of a society’s willingness to use deadly force against its enemies obscures the real issues of America’s adventures abroad. And it’s terrible for my self-esteem. But you humans show no signs of letting up, and so … I drink.
I think that cuts to the point of this argument against drones. Thinking about the bigger picture, aren’t we better off as a nation arguing about the collateral damage from a drone strike as compared to what constituted the largest plank of our national defense strategy from just a decade or two ago? Has everyone forgotten that to defend our freedom we were ready to launch nuclear weapons against a full menu of targets that would have resulted in the deaths of millions of non-combatants?
Instead, we have these philosophical questions posted on a New York Times blog site:
First, we might remember Marx’s comment that “the windmill gives you a society with the feudal lord; the steam engine gives you one with the industrial capitalist.” And precision guided munitions and drones give you a society with perpetual asymmetric wars.
Second, assassination and targeted killings have always been in the repertoires of military planners, but never in the history of warfare have they been so cheap and easy. The relatively low number of troop casualties for a military that has turned to drones means that there is relatively little domestic blowback against these wars. The United States and its allies have created the material conditions whereby these wars can carry on indefinitely.
Third, the impressive expediency and accuracy in drone targeting may also allow policymakers and strategists to become lax in their moral decision-making about who exactly should be targeted. Consider the stark contrast between the ambiguous language used to define legitimate targets and the specific technical means a military uses to neutralize these targets.
Again, the history of recent military conflicts and of the Cold War seems to have escaped the authors. The ending of the draft effectively disconnected the suffering of the martial class from the rest of society. Intelligence and Special Forces operations have been ongoing for years, without the benefit of drones or other such technology but far from public perception. Where is the outcry? And the moral decision-making about those in the path of countless hydrogen bombs seemed not to be questioned during the Cold War by those within the national security-related academic community.
(Just as an aside, my favorite line of that post is: “However, technology itself (the physical stuff of robotic warfare) is neither smart nor dumb, moral nor immoral.” If only because I’m willing to wager a nice dinner that no drone follows Asimov’s Three Laws of Robotics…)
For those concerned I’m getting too far out of the homeland security box, the drunken drone addresses your concerns:
The second constituency I’ll call the “Orwells.” Their primary concern about drones is domestic. They see the technological potential for drone surveillance, the interest from law enforcement and government agencies, and the massive aerospace industry primed to meet the demand. While there are often noises made about UAV safety, the primary gripe of Orwells (who can point to an actual passage in 1984 describing small unmanned aircraft peering through people’s windows) is that drones are vanguards of a pervasive surveillance culture. The police watch you outside with robots, corporations like Facebook and Google parse your user data to better bombard you with ads, and the NSA hoovers up your phone and email communications to feed through a secret counter-terrorism algorithm.
But the Orwells face a problem of domestic case law. Despite fractious debate over “reasonable expectations of privacy,” the Supreme Court has consistently held that police departments are permitted to conduct aerial surveillance of private citizens and property, so long as they traverse publicly-available airspace and use the same technology commonly available to members of the public. Those rulings made no distinction between whether the platform used for such surveillance was manned or unmanned, nor do many court-watchers expect that precedent to be soon overturned.
This is a serious issue that requires serious thinking. All jokes aside, the authors of both the drunken drone piece and the New York Times blog post should be commended for addressing some of the larger issues in the room when it comes to drone technology. I still consider drone technology far enough from being the true civil liberties game changer that some believe it to be. No one within the continental United States is going to be killed by a drone anytime soon. Neither will their civil liberties be challenged. But what will the future bring?
Perhaps the worst-case scenario, where robots kill with almost no trace… or a society where new technology is successfully integrated into a moral construct upon which (almost) everyone agrees.
(h/t to the Lawfare Blog)