While more human-friendly uses of robots have been discussed in other posts on this blog, more “practical” applications are always under development at the Defense Advanced Research Projects Agency (DARPA), and some of these may not serve the humanistic aims discussed elsewhere. Of course, a machine, like a corporation, is an artificial construct, and one of the obstacles to overcome is that very same humanity bred into most people. This brings us to the next generation’s challenges: after we have manufactured the corpus of humanity, how do we impart the Jungian animus? – carlos
An armed robot, called Maars, maneuvering at a training site at Fort Benning, Ga. David Walter Banks for The New York Times
By JOHN MARKOFF, Published: November 27, 2010
FORT BENNING, Ga. — War would be a lot safer, the Army says, if only more of it were fought by robots.
REMOTELY CONTROLLED Some armed robots are operated with video-game-style consoles, helping to keep humans away from danger.
And while smart machines are already very much a part of modern warfare, the Army and its contractors are eager to add more. New robots — none of them particularly human-looking — are being designed to handle a broader range of tasks, from picking off snipers to serving as indefatigable night sentries.
In a mock city here used by Army Rangers for urban combat training, a 15-inch robot with a video camera scuttles around a bomb factory on a spying mission. Overhead an almost silent drone aircraft with a four-foot wingspan transmits images of the buildings below. Onto the scene rolls a sinister-looking vehicle on tank treads, about the size of a riding lawn mower, equipped with a machine gun and a grenade launcher.
Three backpack-clad technicians, standing out of the line of fire, operate the three robots with wireless video-game-style controllers. One swivels the video camera on the armed robot until it spots a sniper on a rooftop. The machine gun pirouettes, points and fires in two rapid bursts. Had the bullets been real, the target would have been destroyed.
The machines, viewed at a “Robotics Rodeo” last month at the Army’s training school here, not only protect soldiers, but also are never distracted, using an unblinking digital eye, or “persistent stare,” that automatically detects even the smallest motion. Nor do they ever panic under fire.
“One of the great arguments for armed robots is they can fire second,” said Joseph W. Dyer, a former vice admiral and the chief operating officer of iRobot, which makes robots that clear explosives as well as the Roomba robot vacuum cleaner. When a robot looks around a battlefield, he said, the remote technician who is seeing through its eyes can take time to assess a scene without firing in haste at an innocent person.
Yet the idea that robots on wheels or legs, with sensors and guns, might someday replace or supplement human soldiers is still a source of extreme controversy. Because robots can stage attacks with little immediate risk to the people who operate them, opponents say that robot warriors lower the barriers to warfare, potentially making nations more trigger-happy and leading to a new technological arms race.
“Wars will be started very easily and with minimal costs” as automation increases, predicted Wendell Wallach, a scholar at the Yale Interdisciplinary Center for Bioethics and chairman of its technology and ethics study group.
Civilians will be at greater risk, people in Mr. Wallach’s camp argue, because of the challenge of distinguishing between fighters and innocent bystanders. That job is maddeningly difficult for human beings on the ground, and it only becomes more difficult when a device is remotely operated.
This problem has already arisen with Predator aircraft, which find their targets with the aid of soldiers on the ground but are operated from the United States. Because civilians in Iraq and Afghanistan have died as a result of collateral damage or mistaken identities, Predators have generated international opposition and prompted accusations of war crimes.
But robot combatants are supported by a range of military strategists, officers and weapons designers — and even some human rights advocates.
“A lot of people fear artificial intelligence,” said John Arquilla, executive director of the Information Operations Center at the Naval Postgraduate School. “I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force.”
Dr. Arquilla argues that weapons systems controlled by software will not act out of anger and malice and, in certain cases, can already make better decisions on the battlefield than humans.