DYSPEPSIA GENERATION

We have seen the future, and it sucks.

War Machines: Recruiting Robots for Combat

28th November 2010

Read it.

They wanted me to go to jihad, but I said no, no, no….

Three backpack-clad technicians, standing out of the line of fire, operate the three robots with wireless video-game-style controllers. One swivels the video camera on the armed robot until it spots a sniper on a rooftop. The machine gun pirouettes, points and fires in two rapid bursts. Had the bullets been real, the target would have been destroyed.

No, ‘killed’. The target — a sniper — would have been ‘killed’. C’mon, New York Times, use the word. I dare you.

Because robots can stage attacks with little immediate risk to the people who operate them, opponents say that robot warriors lower the barriers to warfare, potentially making nations more trigger-happy and leading to a new technological arms race.

“Wars will be started very easily and with minimal costs” as automation increases, predicted Wendell Wallach, a scholar at the Yale Interdisciplinary Center for Bioethics and chairman of its technology and ethics study group.

Gee, isn’t that what jihadists are doing right now? Except that they aren’t part of the Blame America First crowd, which I suspect Wendell Wallach is.

Civilians will be at greater risk, people in Mr. Wallach’s camp argue, because of the challenges in distinguishing between fighters and innocent bystanders. That job is maddeningly difficult for human beings on the ground. It only becomes more difficult when a device is remotely operated.

No, it will be easier, because a commander won’t be hesitating to do the right thing because of the prospect of some bleeding-heart Cindy Sheehan back home raising a stink because her widdle babykins got killed doing his duty.

Yet the shift to automated warfare may offer only a fleeting strategic advantage to the United States. Fifty-six nations are now developing robotic weapons, said Ron Arkin, a Georgia Institute of Technology roboticist and a government-financed researcher who has argued that it is possible to design “ethical” robots that conform to the laws of war and the military rules of escalation.

But it’s not ‘nations’ that are the problem; it’s two-bit terrorist organizations like Al Qaeda and the Taliban, who one doubts are spending a lot of their R&D dinars on robot fighting vehicles.

“If the decisions are being made by a human being who has eyes on the target, whether he is sitting in a tank or miles away, the main safeguard is still there,” said Tom Malinowski, Washington director for Human Rights Watch, which tracks war crimes. “What happens when you automate the decision? Proponents are saying that their systems are win-win, but that doesn’t reassure me.”

Hate to break it to you, Tom, but your reassurance isn’t a high priority with the people who actually have to face the murderous swine of the world. (I love that: ‘… Human Rights Watch, which tracks war crimes’ — but only when they aren’t committed by Muslims. You Can Look It Up.)
