Robotics Prof Fears Rise of Military Robots
An anonymous reader writes "Interesting video interview on silicon.com with Sheffield University's Noel Sharkey, professor of AI & robotics. The white-haired prof talks state-of-the-robot-nation — discussing the most impressive robots currently clanking about on two legs (hello Asimo) and who's doing the most interesting things in UK robotics research (something involving crickets, apparently). He also voices concerns about military use of robots — suggesting it won't be long before armies are sending out fully autonomous killing machines."
Re:"Friendly AI" (Score:5, Informative)
Yes, it will happen. No, it won't stop development. Depending on what you mean by autonomous, it may have already happened [wired.com].
Re:Electromagnetic Pulse, anyone? (Score:3, Informative)
A focused EM beam would work well, though: e.g., a high-gain microwave or radio waveguide could cause serious disruption.
Re:Running spider mines (Score:5, Informative)
Imagine them communicating with each other in a pack and relaying GPS location data. If one finds a target, the rest zero in on the victim.
Reality: DARPA funded work on that in 1997. Sandia made it work. [sandia.gov] The Sandia concept turned out not to be too useful militarily, but paved the way for the Precision Urban Hopper. [youtube.com].
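The pack-coordination idea above — one unit finds a target, relays the fix, and the rest converge — can be sketched in a few lines. This is a minimal illustrative toy, not anything resembling the DARPA/Sandia work; the `SwarmAgent` class, step size, and snap-to-target behavior are all assumptions made up for the example.

```python
import math

class SwarmAgent:
    """Toy swarm member that converges on a shared position fix."""

    def __init__(self, x, y, step=1.0):
        self.x, self.y = x, y
        self.step = step      # max distance moved per update
        self.target = None    # shared (x, y) fix, None until relayed

    def receive_fix(self, fix):
        """Accept a target position relayed by a packmate."""
        self.target = fix

    def update(self):
        """Move one step toward the shared target, if any."""
        if self.target is None:
            return
        tx, ty = self.target
        dx, dy = tx - self.x, ty - self.y
        dist = math.hypot(dx, dy)
        if dist <= self.step:
            self.x, self.y = tx, ty   # close enough: snap onto the target
        else:
            self.x += self.step * dx / dist
            self.y += self.step * dy / dist

# One agent spots a target and relays the fix to the whole pack.
pack = [SwarmAgent(0, 0), SwarmAgent(10, 0), SwarmAgent(0, 10)]
fix = (5.0, 5.0)
for agent in pack:
    agent.receive_fix(fix)

# After enough update ticks, every agent sits on the fix.
for _ in range(20):
    for agent in pack:
        agent.update()
```

A real system would of course need lossy-radio handling, fused noisy fixes, and collision avoidance; the point here is only how cheap the basic relay-and-converge logic is.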
3rd Armored Corps commander wants killbots (Score:5, Informative)
The US military wants robots. More robots. Robots that kill. Now.
Read Failure To Field The Right Robots Costs Lives, General Says [nationalde...gazine.org]. Lt. General Rick Lynch, commander of the U.S. Army's 3rd Armored Corps, wants autonomous killbots. His corps lost 155 soldiers in Iraq, and he claims that 80% of them would have been saved had the right kinds of robots been deployed. On watching "hotspots" for enemy activity: "Robots can take the soldiers' places. They can continuously keep watch on an area, and if nefarious activity is spotted, we can take appropriate action. ... We can kill those bastards before they plant the IEDs."
This is a combat general in charge of a major Army command making it happen.
Re:"Friendly AI" (Score:2, Informative)
Well, suppose your Mom was at a restaurant having dinner, and it got blown up, killing her and most of the rest of the clientele. Then you learned that the restaurant was bombed without warning because a "high value target" was supposed to have been there, but wasn't. (This has happened, and it was no accident.) I assume, based on the above, that you would feel "them's the breaks." But I can assure you that many people would conclude that the people dropping the bombs don't much care whether civilians are killed, and you don't have to dig very deep to learn that many of the people on the receiving end of such incidents do indeed feel that the people behind the bombs deserve punishment.
Even if you are upset and want retribution, you will still see the distinction between this and someone intentionally killing your mother.
No one said that accidental deaths are meaningless. But jollyreaper was claiming that there is no distinction between intentional murder and accidental deaths (or even collateral damage). I think stdarg's example was spot on.
It's Public Law (Score:2, Informative)
"National Defense Authorization Act for Fiscal Year 2001 (as enacted into law by Public Law 106-398; 114 Stat. 1654A-38) that, by 2015, one-third of operational ground combat vehicles be unmanned."
I'm guessing they won't all be logistical delivery vehicles.