Monday, January 19, 2009

Towards a 'largely robotic' battlefield

A new book hits store shelves later this week that will be of interest to those concerned about the ongoing roboticization and dehumanization of military technology. The book, Wired for War: The Robotics Revolution and Conflict in the 21st Century, is authored by P. W. Singer, director of the 21st Century Defense Initiative at the Brookings Institution. He has also published Children at War (2005) and Corporate Warriors: The Rise of the Privatized Military Industry (2003).

In a recent Wilson Quarterly article, Singer makes the claim that Pentagon planners are already provisioning for battlefields that will be, as they put it, "largely robotic." It's no secret that the U.S. military is developing a variety of unmanned weapons and seemingly futuristic technologies -- everything from automated machine guns and robotic stretcher bearers to tiny but lethal robots the size of insects.

As these weapons gain more and more autonomy, deeper questions arise. Singer poses several difficult ones: "Can the new armaments reliably separate friend from foe? What laws and ethical codes apply? What are we saying when we send out unmanned machines to fight for us? What is the “message” that those on the other side receive?" And ultimately, asks Singer, how will we remain masters of weapons that are immeasurably faster and more "intelligent" than we are?

Proxy killing

A fundamental problem, as Singer sees it, is the ease with which killing can now take place. He cites the example of the Predator, an unmanned aerial vehicle (UAV). This propeller-powered drone is 27 feet in length, can spend up to 24 hours in the air, and flies at a height of 26,000 feet. Predators are flown by "reach-back" or "remote-split" operators -- military personnel who are 7,500 miles away and who fly the planes via satellite from a set of converted single-wide trailers located mostly at Nellis and Creech Air Force bases in Nevada.

This type of operation has created a rather novel situation where "pilots" experience the psychological disconnect of being "at war" while dealing with their daily domestic routines. Singer notes the words of one Predator pilot, “You see Americans killed in front of your eyes and then have to go to a PTA meeting.” Says another, “You are going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car, drive home, and within 20 minutes you are sitting at the dinner table talking to your kids about their homework."

These days there are more than 5,300 drones in the U.S. military’s total arsenal, and not a single mission happens without them. The Pentagon predicts that future conflicts will involve tens of thousands of them.

Better than humans

The appeal of robots is obvious. They don't need to be returned home in body bags after they've been shot down. Moreover, robots don't come with typical human frailties and foibles. "They don’t get hungry," says Gordon Johnson of the Pentagon’s Joint Forces Command. "They’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes." Johnson's comments sound eerily like the script from a Terminator movie.

And as these technologies improve, human capabilities are being increasingly pushed to their limits. Today's F-16 fighter jet can maneuver so fast and hard that its pilots black out. As a DARPA official has noted, "the human is becoming the weakest link in defense systems." Moving forward, autonomous weaponry will be increasingly used in place of humans. Eventually it will be robot versus robot -- especially when the theater of operations starts to function at technologic speed. The Pentagon is aware of this possibility, noting that "As the loop gets shorter and shorter, there won’t be any time in it for humans."

Failure to override

The inevitable question arises: Who will control robots that work autonomously and at suprahuman 'technologic' speed? There are already disturbing examples of 'failure to override' incidents -- cases in which machines function outside of human control and can't be shut down. Today's Navy ships use the Aegis computer system, which enters into "casualty" mode when all the humans onboard are dead. In this situation the guns go into a kind of berserker mode and the computer does its best to ensure that the ship doesn't get hit. As Singer notes, "Humans can override the Aegis system in any of its modes, but experience shows that this capability is often beside the point, since people hesitate to use this power. Sometimes the consequences are tragic."

Part of the problem is that humans are starting to give intelligent systems the benefit of the doubt. In many cases the human power "in the loop" was actually only veto power -- and even that is a power that military personnel are often unwilling to use against the quicker (and, in their view, superior) judgment of a computer.
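To make the distinction concrete, here's a minimal Python sketch of that "veto power" arrangement -- purely illustrative, with invented names, values, and timings, and not based on the actual Aegis software. The point is the control pattern: the machine's engagement proceeds by default, and the human can only cancel it within a short window, which is a very different posture from requiring a human to authorize each shot.

```python
import queue
import threading
import time

# Hypothetical sketch of "human-in-the-loop as veto power": the machine's
# decision proceeds by default, and the human can only cancel it within a
# short window. This is NOT the actual Aegis implementation -- all names
# and values here are invented for illustration.

VETO_WINDOW_SECONDS = 2.0  # assumed value; real systems differ


def engage_unless_vetoed(target: str, vetoes: "queue.Queue[str]") -> str:
    """Fire on `target` unless a human veto for it arrives in time."""
    deadline = time.monotonic() + VETO_WINDOW_SECONDS
    while time.monotonic() < deadline:
        try:
            vetoed = vetoes.get(timeout=max(0.0, deadline - time.monotonic()))
        except queue.Empty:
            break  # window closed with no veto: the default is to engage
        if vetoed == target:
            return f"held fire on {target} (human veto)"
        # A stale veto for some other track is simply ignored in this toy.
    return f"engaged {target} (no veto received)"


if __name__ == "__main__":
    vetoes: "queue.Queue[str]" = queue.Queue()
    # Simulate an operator who vetoes only the second engagement,
    # half a second into its decision window.
    threading.Timer(2.5, vetoes.put, args=("track-2",)).start()
    for track in ("track-1", "track-2"):
        print(engage_unless_vetoed(track, vetoes))
```

Run it and the first engagement goes through untouched; the second is cancelled only because the simulated operator happened to act inside the two-second window. A hesitant or deferential operator changes nothing -- which is precisely the problem Singer describes.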

The next step in this trend is to give robots the ability to fire back on their own. As Johnson notes:
Anyone who would shoot at our forces would die. Before he can drop that weapon and run, he’s probably already dead. Well now, these cowards in Baghdad would have to pay with blood and guts every time they shot at one of our folks. The costs of poker went up significantly. The enemy, are they going to give up blood and guts to kill machines? I’m guessing not.
Johnson, for his part, views this prospect as not only logical but quite attractive.

Removing the human factor

Retired Army colonel Thomas Adams believes that the speed, confusion, and information overload of modern-day war will soon move the whole process outside "human space." He predicts that future weapons will be too fast, too small, and too numerous; they will create an environment that's simply too complex for humans to direct.

The Joint Forces Command is well aware of this possibility, noting that autonomous robots on the battlefield will be the norm within 20 years. Military and robotics developers predict that robots as fully capable as human soldiers will start to appear on the battlefield sometime between 2025 and 2035. This will undoubtedly mark a pivotal point in human history. The next war, claims Singer, could be fought partly by robots that respond to spoken commands in plain English and then figure out on their own how to get the job done.

When war becomes too easy

War is hell -- well, at least it's been that way in the past. Democratic governments and their citizens have had to be extremely careful about entering into costly and emotionally wrenching conflicts. But Singer now worries that unmanned systems represent the ultimate break between the public and its military:
With no draft, no need for congressional approval (the last formal declaration of war was in 1941), no tax or war bonds, and now the knowledge that the Americans at risk are mainly just American machines, the already falling bars to war may well hit the ground. A leader won’t need to do the kind of consensus building that is normally required before a war, and won’t even need to unite the country behind the effort. In turn, the public truly will become the equivalent of sports fans watching war, rather than citizens sharing in its importance.
Given this kind of scenario, seemingly costless unmanned wars would significantly lessen political repercussions. Singer argues that this is a frightening prospect and that it would "pervert the whole idea of the democratic process and citizenship as they relate to war." His fear is that, when a citizenry has no sense of the horrors and true cost of war, it will treat going to war like any other policy decision, "weighed by the same calculus used to determine whether to raise bridge tolls." Public engagement will give way to indifference and titillation over all the war-porn on YouTube.

Singer's prognosis is grim:
When technology turns war into something merely to be watched, and not weighed with great seriousness, the checks and balances that undergird democracy go by the wayside. This could well mean the end of any idea of democratic peace that supposedly sets our foreign-policy decision making apart.
We're heading down a very strange and treacherous path.
