Robot Soldiers Are Already Being Deployed

destinyland writes "As a Rutgers philosopher discusses robot war scenarios, one science magazine counts the ways robots are already being used in warfare, including YouTube videos of six military robots in action. There are up to 12,000 'robotic units' on the ground in Iraq, some dismantling landmines and roadside bombs, but 'a new generation of bots are designed to be fighting machines.' One bot can operate an M-16 rifle, a machine gun, and a rocket launcher — and 250 people have already been killed by unmanned drones in Pakistan. The article also tells the story of a berserk robotic antiaircraft gun that killed nine people in South Africa due to a 'software glitch.'"

Comments:
  • Waldos (Score:4, Insightful)

    by John Hasler ( 414242 ) on Wednesday May 20, 2009 @03:22PM (#28029717) Homepage

    None of the devices currently in use are robots. They're just military waldos.

  • by FooAtWFU ( 699187 ) on Wednesday May 20, 2009 @03:26PM (#28029759) Homepage
    Yeah, they've got a few glitches, but you should see what it takes to make the legacy system they're replacing work. You're talking years of training for each unit.
  • For ev (Score:2, Insightful)

    by Anonymous Coward on Wednesday May 20, 2009 @03:32PM (#28029845)

    Coming soon to a battlefield near you: EMP weapons.

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday May 20, 2009 @03:35PM (#28029901) Journal
    And it isn't as though the fleshy ones are without glitches of their own [cnn.com].
  • by Hatta ( 162192 ) on Wednesday May 20, 2009 @03:39PM (#28029955) Journal

    We need robots and machines that PREVENT war through simulation and complex analysis.

    After all, the only winning move is not to play.

  • by CopaceticOpus ( 965603 ) on Wednesday May 20, 2009 @03:40PM (#28029975)

    Any machine that fires a weapon needs to be built with an excessive number of safeguards. If something goes wrong, there should be several checks which shut off the weapon before it ever has a chance to fire. The fact that this machine could go berserk and fire its gun in a big circle shows criminal negligence and carelessness by the developers, and whoever approved this design should probably be on trial.
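
    That "excessive number of safeguards" requirement maps onto a standard fail-closed interlock pattern. A minimal Python sketch, with invented check names and nothing taken from any real fire-control system: every check must pass independently, and any failure, including a check that crashes, latches the weapon into a safe state.

        # Illustrative only: a fail-closed interlock where any failing or
        # crashing check latches the weapon safe until a human resets it.
        class WeaponInterlock:
            def __init__(self, checks):
                self.checks = checks       # independent predicate functions
                self.latched_safe = False  # once tripped, stays tripped

            def may_fire(self, state):
                if self.latched_safe:
                    return False
                for check in self.checks:
                    try:
                        ok = check(state)
                    except Exception:
                        ok = False         # a crashing check counts as a failure
                    if not ok:
                        self.latched_safe = True  # fail closed, not open
                        return False
                return True

        # Hypothetical checks; the names are made up for illustration.
        def operator_confirmed(state): return state.get("operator_ok") is True
        def within_safe_arc(state):    return state.get("in_safe_arc") is True
        def no_friendlies(state):      return state.get("friendly_in_arc") is False

        interlock = WeaponInterlock([operator_confirmed, within_safe_arc, no_friendlies])
        print(interlock.may_fire({"operator_ok": True, "in_safe_arc": True,
                                  "friendly_in_arc": False}))  # True
        print(interlock.may_fire({"operator_ok": True}))       # False, and now latched

    The design choice that matters is the latch: after any anomaly the system demands deliberate human intervention rather than quietly resuming.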

  • by Publikwerks ( 885730 ) on Wednesday May 20, 2009 @03:51PM (#28030109)
    Is anyone else sick of people calling these RC vehicles robots? When I hear "robotic" I think of a complex machine, not an RC car with an M-249 on it (although that is pretty sweet). I mean, we have reporters talking about robotic ethics when the vehicles make no independent decisions. Until these vehicles have the ability to make independent decisions (like the X-47), let's nix the robot talk.
  • by silas_moeckel ( 234313 ) <silas.dsminc-corp@com> on Wednesday May 20, 2009 @03:51PM (#28030117) Homepage

    War is ultimately the only way to inflict a nation's will upon another. Let's not even get started on the fallacy that you can't gain land through war, or on a world court, etc. All laws ultimately have to be enforced via violence. Throw religion into the mix and it's an ugly, irrational thing.

    Want to avoid war? Find solutions to our two core problems: the need for energy and the need for resources. As long as it's easier to take either of those from somebody else than to get it yourself, you will have war. The only two solutions are fission/fusion and space colonization/mining. We have one but are afraid to use it, and the other is a ways off and underfunded.

  • by stoolpigeon ( 454276 ) * <bittercode@gmail> on Wednesday May 20, 2009 @04:02PM (#28030331) Homepage Journal

    The cell phone does not 'think.' I meant AI when I said it.

  • by Lord Ender ( 156273 ) on Wednesday May 20, 2009 @04:06PM (#28030417) Homepage

    You don't know what you mean when you say "think," then. I take it you got your ideas of "AI" from sci-fi movies, rather than from the computer science classroom.

  • Re:Waldos (Score:3, Insightful)

    by rtfa-troll ( 1340807 ) on Wednesday May 20, 2009 @04:13PM (#28030511)

    A robot is pretty much defined as a device with sensors that acts on them independently. The US military's Predator drones are able to land on their own with no operator input, and as such they definitely count as robots. However, most do not kill automatically, though there seem to be some that even do that [vnunet.com].

    However, I think you are right in a deeper way. None of these things are "intelligent" robots in the sense of Asimov's stories. The story discusses the possibility of designing these robots to make ethical decisions, but it ignores the fact that these are hard AI problems on which there has been practically no progress since the dawn of computing. These kinds of discussions often end with someone quoting the Asimovian three laws; this happens even on forums with relatively intelligent, informed readers [schneier.com]. But apart from the fact that laws designed to ensure safety can't really apply to a device designed for killing, the quotation is beside the point, because the three laws are stated in English. The real problem is how to state them in actual program code.
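
    To make that last point concrete, here is a deliberately hopeless Python sketch; the world.entities interface and every predicate name are invented for illustration. The First Law transliterates into a few lines of trivial control flow, but each predicate it calls is an unsolved perception or prediction problem, which is exactly where the lack of progress lives.

        # Asimov's First Law: "A robot may not injure a human being or,
        # through inaction, allow a human being to come to harm."
        # The control flow is easy; every predicate below is the hard part.

        def first_law_permits(action, world):
            for entity in world.entities:
                if is_human(entity):                       # unsolved: robust human recognition
                    if would_injure(action, entity):       # unsolved: predicting physical outcomes
                        return False
                    if harmed_by_inaction(entity, world):  # unsolved: counterfactual reasoning
                        return False
            return True

        def is_human(entity):
            raise NotImplementedError("decades of computer vision research go here")

        def would_injure(action, entity):
            raise NotImplementedError("needs a predictive model of the physical world")

        def harmed_by_inaction(entity, world):
            raise NotImplementedError("needs reasoning about hypothetical futures")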

  • by vertinox ( 846076 ) on Wednesday May 20, 2009 @04:15PM (#28030533)

    War is ultimately the only way to inflict a nation's will upon another.

    Unless both sides have nukes.

    Then the only way to inflict your will is through smaller proxy wars and economics.

    Both of which, I suppose, could also benefit from robotics.

  • by fishtorte ( 1117491 ) on Wednesday May 20, 2009 @04:22PM (#28030637)

    War is ultimately the only way to inflict a nation's will upon another.

    The notion that it's a good idea or even possible for one set of people to force its will on another is what leads to war, and it's one we might do well to change.

  • by jollyreaper ( 513215 ) on Wednesday May 20, 2009 @04:41PM (#28030897)

    Technically speaking, a homing missile or torpedo could count as a robot weapon. We tend not to think of them that way because the gap between pressing the button and impact is short enough that it's just like pulling the trigger on a gun and watching someone die.

    Landmines and other booby traps are, intellectually, about the same thing as an autonomous AI weapon: they kill without human intervention, and they are impersonal and horrific. Yes, it's more frightening to imagine a T-800 coming after you and taking your leg off with a chainsaw, but seriously, the results aren't that much different from a landmine's.

    When talking about the dangers of taking the human out of the loop, remember that we've already got enough problems with humans in the loop. Friendly fire accounted for a shockingly large share of our casualties in Gulf War 1. The more powerful the weapon, the easier the oops. I don't know how many top generals were accidentally killed by sentries back in the days of Rome (kinda hard to accidentally run someone through with your gladius), but just ask Stonewall Jackson how easy that sort of thing became with firearms. We'd never have gone through and killed an entire bunker of civilians by accident if our soldiers were doing the work with knives, but that becomes as easy as an oops when dropping LGBs from bombers on the word of some faulty intel. Powerful weapons compound and magnify human errors.

    Aside from the typical fear we have at the thought of impersonal killing machines taking us out, I think we have two other big fears: 1) war becomes easier and less painful to wage when robots are doing the dying, and 2) a robot will never refuse a legal yet immoral order.

    We've had bad wars throughout history, but the 20th century really had them all beat. Technology allowed for total war: the bending of an entire nation's will to the obliteration of another. Ambitions grew bigger, power could be projected further, and the world became a smaller, more dangerous place. Battlefield robots will be a continuation of this trend.

  • Re:Waldos (Score:4, Insightful)

    by RsG ( 809189 ) on Wednesday May 20, 2009 @05:09PM (#28031413)

    ...the three laws are stated in English. The real problem is how to state them in actual program code.

    The second and third laws could still apply, though. The whole "shall not harm, or by inaction allow harm to come to, a human being" law does make for a fairly useless war machine, but you'd want to hardcode the robot to follow orders from a human operator and to preserve its own integrity.

    The second law is at least easy to approximate in modern code. If a given order with the right authorization is received through whatever channels the robot is designed to listen to, then it obeys. That actually could be a problem if the machine is used against an enemy with significant electronic warfare capability: they might be able to block orders entirely or substitute new ones.

    There's a world of difference between a machine autonomous enough to need ethical programming and what we have today. I could fairly easily envision a combat robot that had nothing even remotely approximating strong AI, yet still functioned autonomously (it would need general orders, but not step-by-step instructions); a sort of middle ground between an Asimov robot and a modern combat drone.

    For ground robots to fill the role of infantry or armoured vehicles, you'd need some fairly advanced terrain-navigation software. This isn't too far off, but we're not there yet. You'd need software to evaluate standing orders versus mission orders and prioritize them accordingly, which seems achievable with modern code. And you'd need to be able to phrase instructions in a way that a machine can understand, which is, as you rightly pointed out, difficult, but obviously still possible.

    The real challenge is going to be IFF software: how do you distinguish a civilian from a combatant, or one side's soldiers from the other's? This is on par with robotic ethics, but target recognition is bound to be simpler to program than right and wrong.

    If those problems were solved, then a combat robot could operate on orders that amount to "proceed to the following GPS coordinates, engage targets, report back" (a sketch of what such an order might look like follows below).

    My own estimate is that we'll reach this middle ground in a matter of decades, if we're quick about it. We'll doubtless see fully autonomous aircraft before ground units; say at least 5-10 years between the former and the latter. Will we ever see strong AI deployed independently in warfare? I doubt it. No commander is going to trust a machine that implicitly. What we may see is a centralized strong AI used to manage a network of drones and soldiers, since that at least leaves human decision-making in the system.
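
    A minimal Python sketch of two ideas from the comment above: the "GPS coordinates, engage, report" order reduced to a data structure, and the second-law-style "obey only properly authorized orders" check done with a message authentication code. The field names and shared key are invented, not taken from any real system; the tampering lines also show why the electronic-warfare worry upthread is real, since anyone who obtains the key can mint valid orders.

        import hmac, hashlib, json

        SHARED_KEY = b"illustrative-key-only"  # real systems need real key management

        def sign_order(order):
            payload = json.dumps(order, sort_keys=True).encode()
            return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

        def verify_order(order, tag):
            # Constant-time comparison; forged or altered orders are refused.
            return hmac.compare_digest(sign_order(order), tag)

        # "Proceed to the following GPS coordinates, engage targets, report back."
        order = {
            "waypoint": {"lat": 33.3152, "lon": 44.3661},
            "rules_of_engagement": "engage_identified_hostiles_only",
            "report_channel": "uplink-7",
        }
        tag = sign_order(order)

        assert verify_order(order, tag)              # authentic order: obey
        order["rules_of_engagement"] = "engage_all"  # altered in transit
        assert not verify_order(order, tag)          # refused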

  • Software tackling certain problems which are easy for humans but hard for computers,

    That's the most "moving the goalposts" definition of something I've read all day. You do realize that chess used to be (relatively) easy for humans and extremely hard for computers, and that nowadays you can download free chess programs that run on commodity computers and can beat all but the most exceptional human chess players? When computers can do something that was hard for them before, and it then becomes commonplace for them to do it, is it no longer AI?
     
    Please give us a substantive definition if you have a problem with how everyone is using the term. This weird subjective thing you've presented has no place as a CS definition.
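
    For what it's worth, the core of those commodity chess programs is an ordinary game-tree search. A minimal Python sketch of minimax over an assumed abstract game interface (moves, apply, is_terminal, and score are placeholders, not a real chess engine):

        # Minimax over an abstract two-player, zero-sum game. The `game`
        # object is assumed to provide moves(), apply(), is_terminal(),
        # and score() from the maximizing player's point of view.

        def minimax(game, state, maximizing=True):
            if game.is_terminal(state):
                return game.score(state), None
            best_value = float("-inf") if maximizing else float("inf")
            best_move = None
            for move in game.moves(state):
                value, _ = minimax(game, game.apply(state, move), not maximizing)
                if (maximizing and value > best_value) or \
                   (not maximizing and value < best_value):
                    best_value, best_move = value, move
            return best_value, best_move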

  • by easyTree ( 1042254 ) on Wednesday May 20, 2009 @08:04PM (#28033687)

    Bah. That'll never fly. There's no profit motive in not killing thousands of foreigners and then not being paid to rebuild their damaged country.
