
Robotic Cannon Loses Control, Kills 9

TJ_Phazerhacki writes "A new high-tech weapon system demonstrated one of the prime concerns surrounding smarter and smarter methods of defense last week — an Oerlikon GDF-005 cannon went wildly out of control during live-fire test exercises in South Africa, killing 9. Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers. 'Electronics engineer and defence company CEO Richard Young says he can't believe the incident was purely a mechanical fault. He says his company, C2I2, in the mid 1990s, was involved in two air defence artillery upgrade programmes, dubbed Projects Catchy and Dart. During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says. "They made a temporary rig consisting of two steel poles on each side of the weapon, with a rope in between to keep the weapon from swinging. The weapon eventually knocked the pol[e]s down."' The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether."
  • by User 956 ( 568564 ) on Thursday October 18, 2007 @06:35PM (#21033579) Homepage
    Robotic Cannon Loses Control, Kills 9

    To be fair, it did give them 30 seconds to comply.
    • by Jeremiah Cornelius ( 137 ) on Thursday October 18, 2007 @06:51PM (#21033795) Homepage Journal
      I submitted the same story.

      Unfortunately, the editors may not have approved of my comments linking Bill Joy's "Cassandra" predictions of killer robots, with the pledge to remove the Roomba from my home - and idle speculation about the possible involvement of Windows XP in this incident...
    • by Anonymous Brave Guy ( 457657 ) on Thursday October 18, 2007 @07:10PM (#21034051)

      I think I'm too old for this stuff. It seems like these days, if I mention to a younger software developer that even now Robocop is still one of the scariest films I've ever seen, they assume it's because of the ketchup effects.

      • by Nazlfrag ( 1035012 ) on Thursday October 18, 2007 @09:17PM (#21035325) Journal
        Well, if you'd grown up all your life in the despotic, decadent corporate dystopia depicted in Robocop like those young'uns did, you'd be fairly oblivious too.
      • by jollyreaper ( 513215 ) on Thursday October 18, 2007 @11:06PM (#21036357)

        I think I'm too old for this stuff. It seems like these days, if I mention to a younger software developer that even now Robocop is still one of the scariest films I've ever seen, they assume it's because of the ketchup effects.
        Ever watch the special commentary on Hellraiser? They interview the original makeup guys and they're like "Yeah, we were trying to go for something really horrific with the Cenobites, something that would make you sit back and go 'Holy fucking Christ, what happened to these people?' Give you a real shock reaction." Then they cut to the body modification freaks. "So we saw this and thought yeah, this is something we want to do to ourselves." The makeup guys thought they were making a horror movie, not a fashion statement. Reminds me of the comment "Hey, neocons! 1984 wasn't supposed to be an instruction manual!"
    • by suprcvic ( 684521 ) on Thursday October 18, 2007 @07:38PM (#21034383)
      The fact that anybody is joking about 9 people losing their lives sickens me. Have you all truly lost touch with reality to the point that the loss of human life is completely lost on you? Seriously?
      • by Skreems ( 598317 ) on Thursday October 18, 2007 @07:47PM (#21034473) Homepage
        When you're talking about massive loss of life while testing armed robots that the military wants to turn loose on the world, sometimes humor is the only way to deal with reality.
        • by delong ( 125205 ) on Thursday October 18, 2007 @08:45PM (#21035027)
          Kind of like my response to Slashdotters objecting to an automated weapon designed to shoot down cruise missiles, which leave too little reaction time for human-controlled defenses to counter - a weapon that saves soldiers, airmen, and sailors from massive loss of life.
        • by Rich0 ( 548339 ) on Thursday October 18, 2007 @08:53PM (#21035095) Homepage
          Honestly, from reading the article it isn't clear that a software problem was even the cause of this disaster. It could have been some kind of mechanical gun jam.

          Any time you are dealing with big guns, fast motors, high-speed fire, large rounds, and explosive projectiles there is a risk of disaster if things go wrong. These things aren't toys. Even if the fire button was completely manual things could still go wrong.

          I recall reading an article about a magazine detonation in a battleship which went into all kinds of detail about all the things that could go wrong - and this was a fairly manual operation. It did involve lots of machinery (how else do you move around shells that weigh hundreds of pounds?), but it was all human operated.

          Assuming the system is well designed, the automation actually has great potential to LOWER risk. Humans make mistakes all the time. They're even more prone to making mistakes when a jet loaded with cluster bombs is incoming.

          Another thing to keep in mind is that the military's peacetime training disasters always make the news. However, the military has a fine line to walk - on one hand they want to be safe in their exercises, but on the other hand they want to be able to handle combat operations. A 30-minute single-shot firing procedure that allows for all kinds of safety checks sounds great in theory, but in wartime you'd lose more people to incoming fire than you'd ever save from gun explosions. Sure, you don't want to kill yourself, but if you're so ineffective that the enemy overruns you it is all for nothing. As a result we tolerate some friendly fire, accidents, etc.

          Like it or not robotic weapons WILL be the future of warfare. Sure, one country might elect not to develop them, but sooner or later somebody else will, and once they work out the bugs they'll be overrunning everybody else...
          • by AJWM ( 19027 ) on Friday October 19, 2007 @01:17AM (#21037445) Homepage
            Even if the fire button was completely manual things could still go wrong.

            Absolutely. I was on the range once when the guy a couple of spots over had the mechanism fail (never did find out if it was dirt or breakage) on his FN and it started firing full auto without his hand anywhere near the trigger. Fortunately he (and/or the sergeant that was on him almost immediately) had the presence of mind to keep it pointed downrange until it emptied.
          • Re: (Score:3, Funny)

            by bentcd ( 690786 )

            once they work out the bugs they'll be overrunning everybody else
            I think it'll take a little more effort than just a few rounds of work-out at the local gym for mere bugs to be overrunning us. Now, if instead they could arrange for an automated gamma ray "accident" in army ant territory, /then/ we're talking . . .
        • by Johnny5000 ( 451029 ) on Thursday October 18, 2007 @10:23PM (#21035967) Homepage Journal
          When you're talking about massive loss of life while testing armed robots that the military wants to turn loose on the world, sometimes humor is the only way to deal with reality.

          Seriously.. this thing was built with the explicit purpose of raining death down on people.

          And lookee, it apparently did the job it was built to do....
          Only on people we've all decided "deserved" to keep their lives.

          Unlike the people this thing was *intended* to kill.
        • Re: (Score:3, Insightful)

          by timeOday ( 582209 )
          Does the robotic aspect make this any different from a fatal bridge collapse or a tire failure? IMHO it's the same.
          • by profplump ( 309017 ) <zach-slashjunk@kotlarek.com> on Friday October 19, 2007 @02:08AM (#21037747)
            No, it's totally different.

            You see, it's not just a structural failure of the support system for the (at least as far as reported) otherwise working gun, it's a structural failure of the support system for the otherwise working robotic gun. Apparently. I'll admit the difference doesn't seem important to me either, but all comments here have convinced me that adding the word "robot" to any story involving a mechanical failure is grounds for anti-technology panic.

            And remember, those doors at the supermarket aren't just automatic, they're... ROBOTIC. OH NOES!!1! THE SUPERMARKET DOORS COULD KILL SOMEONE IF THEIR SUPPORT TRACK FALL OFF THE WALL. WE MUST REMOVE ALL ROBOTS TO SAVE THE CHILDRENS.
      • by Detritus ( 11846 ) on Thursday October 18, 2007 @07:53PM (#21034559) Homepage
        If I stub my toe, it's a tragedy. If you get run over by a herd of elephants, it's funny.

        If you want really sick and twisted humor, try living in a war zone.

      • by jlarocco ( 851450 ) on Thursday October 18, 2007 @08:33PM (#21034905) Homepage

        150000 people die every day. That's almost 2 a second. I'm sure the family and friends of these 9 are heartbroken, but for the 6.5 billion people who don't know them, it's not all that remarkable.

        The only thing unique about these 9 people is that they died in a somewhat amusing way. If you want to mourn, mourn for the other 149991 people who died today that you'll never hear about.

        • by Denial93 ( 773403 ) on Friday October 19, 2007 @10:03AM (#21041887)
          Huh? You mean on 9/11, 2.5% of fatalities worldwide were due to terrorism? And since then, terrorist deaths have practically flatlined, with rarely more than 0.01%, way behind pulmonary heart diseases, the flu, starvation, war, crime, work accidents, motor vehicle crashes and all sorts of other causes? You mean it doesn't make sense to throw terabucks into the War On Terror when relatively cheap nutrition programmes could save 27000 lives per day?

          What is this, a remaining pocket of common sense?
      • by Anonymous Coward on Thursday October 18, 2007 @08:39PM (#21034957)
        If you think this is sick, you should hear what comes out of the mouths of soldiers in combat.

        It's called gallows humor, and it has been shown to be one of the most effective coping strategies when being involved with or witness to a traumatic situation that you have little control over.

        Oh... after looking through your history, I finally get it. It's sick and disgusting to you because it happened to soldiers, rather than soldiers slaying civilians with their arsenal. Gotcha.
      • Re: (Score:3, Insightful)

        by microTodd ( 240390 )
        This thread happens every single time some tragedy with loss of life is posted here on Slashdot. Some people find the humor, then others are "sickened" and "can't believe the heartlessness".

        The simple matter is, many, many people die every day. Many, many people are also born every day. You can't be personally upset over every life lost or you would spend all your time in overwhelming grief. And sometimes humor is the only alternative to what would otherwise be shock, anger, sadness, or fear.
    • by jlawson382 ( 1018528 ) on Thursday October 18, 2007 @07:41PM (#21034405)
      Are you still there?
    • by Aqua OS X ( 458522 ) on Thursday October 18, 2007 @07:44PM (#21034445)
      "I came here with a simple dream, a dream of killing all the humans."
      -Bender

      too soon?
    • by petsounds ( 593538 ) on Thursday October 18, 2007 @08:59PM (#21035143)
      This entire story is inaccurate. The Oerlikon weapons system [wikipedia.org] they were using is a variant of a towed anti-air gun first made in 1955. This version has a computer-based, laser-guided targeting system. But it was made in 1985. This is not robots gone crazy. This is just a software glitch (or perhaps hardware failure) from an outdated system. This is not a fracking robot.

      This is typical of recent Slashdot, which is trying to compete with the sensationalism of Digg and other tech blogs. No fact-checking; just throw it up and wait for the ad impressions to roll in.
  • ED-209 (Score:5, Funny)

    by gEvil (beta) ( 945888 ) on Thursday October 18, 2007 @06:37PM (#21033609)
    Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers.

    I seem to recall seeing a documentary about this about 20 years ago. Ahh, here it is. [imdb.com]
  • Finally (Score:5, Funny)

    by High Hat ( 618572 ) on Thursday October 18, 2007 @06:38PM (#21033627)
    # kill -9

    ...for the real world!

  • by User 956 ( 568564 ) on Thursday October 18, 2007 @06:38PM (#21033631) Homepage
    During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says.

    This gives new meaning to the phrase "Blue screen of death".
  • Are they certain they haven't gotten any component parts from Acme?
    • by svvampy ( 576225 )
      Well, military equipment is built by the lowest bidder. Maybe if they'd aimed for a Yugo instead of a Trabant. (Car analogy FTW!)
    • by Kamokazi ( 1080091 ) on Thursday October 18, 2007 @07:36PM (#21034355)
      From here [itweb.co.za]:

      Young says he was also told at the time that the gun's original equipment manufacturer, Oerlikon, had warned that the GDF Mk V twin 35mm cannon system was not designed for fully automatic control. Yet the guns were automated. At the time, SA was still subject to an arms embargo and Oerlikon played no role in the upgrade.

      It may just be me, but automating a machine that fires explosives when it wasn't designed to be automated just sounds like a Bad Idea(TM).
  • Three Laws of Robotics: [wikipedia.org]

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    "Asimov believed that his most enduring contributions would be his 'Three Laws of Robotics' and the Foundation Series." - Isaac Asimov [wikipedia.org] article in Wikipedia.
    • Yes... did anyone even read the books before posting that? Seriously, there are issues with those laws.
      • Re: (Score:3, Insightful)

        by FooAtWFU ( 699187 )
        Not the least of which is, with current artificial intelligence, they're laughably unenforceable. In Asimov's books, you had this neat little "positronic brain" which was capable of resolving sensory inputs and determining things like "that is a human -->" (to say nothing of "I am harming it", especially through indirect causality.) They were even capable of coming up with ideas to avoid the "through inaction" clauses.

        Really, the stories weren't about robots, they were about people just like us, with a ce

    • Re: (Score:3, Insightful)

      Sorry, I missed the end of that story. How did it turn out, again?

    • by Cheapy ( 809643 ) on Thursday October 18, 2007 @08:44PM (#21035007)
      I always find it hilarious that people will post those "Laws", as if they were Universal Laws such as "1+1=2".

      They are a set of fictional laws made up by an author for his science fiction books. Are we seriously going to accept any and all Laws that appear in fiction?
    • Re: (Score:3, Interesting)

      by Mr. Flibble ( 12943 )
      The "Zeroth" law:

      0. A robot must know it is a robot.
  • by Sloppy ( 14984 ) on Thursday October 18, 2007 @06:43PM (#21033693) Homepage Journal
    ..killbots have preset limits.
  • by Merovign ( 557032 ) on Thursday October 18, 2007 @06:44PM (#21033697)
    As I used to say to developers at a company I used to work for,

    "I want to tell you about a radical new idea I had - testing things before deploying them."

    In the case of weapons systems, that means debugging the software before loading the gun.

    Truth be told, most "automated" weapons are more like remote control, for precisely this reason.

    Also, while my experience is not vast in the area, most American weapons testers follow a lot of safety rules - including not being in the line of fire of the darned thing. Note I said most - we have our munitions accidents here, too.
    • The irony is not lost on me - I have had many people argue with me that testing is worthless.

      These same people later pay heavily for me to rescue their production systems.
    • by Fishead ( 658061 ) on Thursday October 18, 2007 @06:53PM (#21033829)
      As a robotics technician with close to seven years' experience working with automated machines, all I can say is "PLEASE DON'T GIVE THEM GUNS!!!"

      Many times I have seen an automated system go out of control due to anything from something as simple as a broken wire on an encoder to an entirely failed controller. The closest we ever came to this was the day a SCARA robot (about the size and shape of a human arm) ran away (out of control) and hit the door on the work cell. It wouldn't have been a big deal except that another of the robotics guys was walking by and walked into the door as it swung open. Good times, good times, but I would never want to be around an automated machine with a gun - just too big a chance for something to go wrong.
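
      A toy sketch of the kind of plausibility check that catches a broken encoder wire or a frozen feedback channel - all names and limits here are invented for illustration, not taken from any real controller:

      # Hypothetical feedback sanity check: latch a fault if the encoder jumps
      # farther than the axis could physically move in one cycle, or stays
      # frozen while the axis is being driven.
      MAX_SPEED_DEG_S = 120.0                          # assumed axis speed limit
      CYCLE_S = 0.01                                   # assumed 100 Hz control loop
      MAX_STEP_DEG = MAX_SPEED_DEG_S * CYCLE_S * 1.5   # margin for sensor noise

      class EncoderMonitor:
          def __init__(self):
              self.last_deg = None
              self.frozen_cycles = 0
              self.faulted = False

          def update(self, reading_deg, commanded_moving):
              """Return True while feedback looks sane; latch a fault otherwise."""
              if self.faulted:
                  return False
              if self.last_deg is not None:
                  step = abs(reading_deg - self.last_deg)
                  if step > MAX_STEP_DEG:              # implausible jump: broken wire or noise
                      self.faulted = True
                  elif commanded_moving and step < 1e-6:
                      self.frozen_cycles += 1          # axis driven but encoder not moving
                      if self.frozen_cycles > 50:
                          self.faulted = True
                  else:
                      self.frozen_cycles = 0
              self.last_deg = reading_deg
              return not self.faulted

      # The loop that owns the axis would call update() every cycle and drop to a
      # safe stop (and certainly inhibit firing) the moment it returns False.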
  • from the keep-quiet-on-the-terminator-jokes dept.
    No Sarah Connor! Denied!
  • by riker1384 ( 735780 ) on Thursday October 18, 2007 @06:45PM (#21033709)
    Why didn't they have some provision to cut power to the weapon? If they were testing it in a place where there were people exposed in its possible field of fire (effectively "downrange"), they should have taken precautions.
    • Re: (Score:3, Funny)

      by PingPongBoy ( 303994 )
      Why didn't they have some provision to cut power to the weapon?

      My dear Mr. Watson, there was a provision. The problem was the confusion between programming for MS-DOS versus Unix.

      The clues have told us exactly what happened. From "Robotic Cannon Kills 9", we see clearly the command kill -9 was issued but the weapon was DOS based and did its job all too well.
    • by Ruvim ( 889012 ) on Thursday October 18, 2007 @07:36PM (#21034361)
      There were provisions for that... But it was smart enough to take out the people at the button first.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday October 18, 2007 @06:45PM (#21033715)
    Comment removed based on user account deletion
    • As with most automated technologies it will make some mistakes, but fewer than a human on average. The friendly-fire rate for most militaries is nowhere near perfect.

      Err... It's still firing at humans, and needs to be controlled somehow... there's always the potential for friendly fire, especially so with automated weaponry. How will the weapon identify friend vs foe?

      Ok, so you have some sort of identifier badge or something, but what happens if an enemy is mixed in there? How will the weapon identify "safe" firing situations?

  • I for one welcome our Oerlikon GDF-005 overlords.
  • SkyNet (Score:3, Funny)

    by PPH ( 736903 ) on Thursday October 18, 2007 @06:46PM (#21033735)
    When it was done, did it say, "I'll be back"?
  • No pun intended (Score:5, Insightful)

    by geekoid ( 135745 ) <dadinportland@y[ ]o.com ['aho' in gap]> on Thursday October 18, 2007 @06:46PM (#21033737) Homepage Journal
    But shouldn't this thing have a kill switch? Seriously, my table saw has a kill switch.

  • Riiight (Score:4, Insightful)

    by Colin Smith ( 2679 ) on Thursday October 18, 2007 @06:47PM (#21033747)

    The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.
    Because human beings are so good at shooting down low flying supersonic aircraft.

     
    • Re:Riiight (Score:5, Interesting)

      by mav[LAG] ( 31387 ) on Thursday October 18, 2007 @07:35PM (#21034353)
      Of course they are. In the AAAD (All Arms Air Defence) training I did in the Royal Artillery we regularly knocked down scale targets that were moving at equivalent speeds with ordinary GPMGs. It wasn't easy at first but after a few thousand rounds you definitely get the hang of it.

      A few other points:

      * The majority of low level flying targets are subsonic anyway
      * It just takes a single hit in the right place on the airframe for the target to tear itself to pieces
      * Having a computer fire a weapon is a very, very bad thing. One of the principles that was drummed into us was that a human must always pull the trigger. Always. Computers can aim for you, make the tracking easier, calculate the numbers, whatever - anything but actually fire the weapon. That should always be done by a person with the correct training and authorisation.

      If this weapon fired by itself because of a software glitch, then it's poorly designed.
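
      To make that principle concrete, here is a toy sketch (purely illustrative, not any real fire-control interface): the computer produces the firing solution, but the fire command is gated on a separate operator consent input that software never latches.

      from dataclasses import dataclass

      @dataclass
      class FiringSolution:
          azimuth_deg: float
          elevation_deg: float
          valid: bool                    # tracker is confident in the target

      def fire_permitted(solution: FiringSolution,
                         operator_consent: bool,
                         interlocks_ok: bool) -> bool:
          # The computer aims, tracks and calculates, but the decision to fire
          # comes from operator_consent - a hardware input (e.g. a held trigger)
          # read every cycle and never remembered from a previous cycle.
          return solution.valid and interlocks_ok and operator_consent

      # A valid solution with healthy interlocks still does not fire until the
      # human holds the trigger:
      print(fire_permitted(FiringSolution(42.0, 10.0, True), False, True))  # False
      print(fire_permitted(FiringSolution(42.0, 10.0, True), True, True))   # True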

      • Re:Riiight (Score:5, Insightful)

        by Thaelon ( 250687 ) on Thursday October 18, 2007 @10:56PM (#21036257)
        Maybe that's what they tell the grunts. Congratulations, you managed to shoot down large mock targets that weren't shooting back.

        Think you can shoot down a supersonic missile flying below the horizon? No. They let the computer-guided robots do that. You're not nearly good enough at it. Ok, maybe you get lucky and nail it. Now try thirty in five seconds all coming from different bearings. Didn't think so.
        • Re: (Score:3, Funny)

          by ShakaUVM ( 157947 )
          Think you can shoot down a supersonic missile flying below the horizon? No. They let the computer-guided robots do that. You're not nearly good enough at it. Ok, maybe you get lucky and nail it. Now try thirty in five seconds all coming from different bearings. Didn't think so.

          You just need a trackball and a good supply of quarters
  • by jtroutman ( 121577 ) on Thursday October 18, 2007 @06:51PM (#21033803)
    Guns don't kill people. Robotic, automated, 35mm anti-aircraft, twin-barreled guns kill people.
  • by HTH NE1 ( 675604 ) on Thursday October 18, 2007 @06:51PM (#21033807)
    From "Mostly Harmless" by Douglas N. Adams, Chapter 12:

    (It was, of course, as a result of the Great Ventilation and Telephone Riots of SrDt 3454, that all mechanical or electrical or quantum-mechanical or hydraulic or even wind, steam or piston-driven devices, are now required to have a certain legend emblazoned on them somewhere. It doesn't matter how small the object is, the designers of the object have got to find a way of squeezing the legend in somewhere, because it is their attention which is being drawn to it rather than necessarily that of the user's.

    The legend is this:

    "The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.")
  • FTA: (Score:4, Funny)

    by CaptainPatent ( 1087643 ) on Thursday October 18, 2007 @06:53PM (#21033821) Journal

    The South African National Defence Force "is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise on Friday."
    in the follow-up article:
    "software engineers find that a goto statement was the cause of the recent military disaster. Experts say while this was a terrible tragedy, it could have been much worse [xkcd.com]."
  • This reminds me of a chapter of Ghost in the Shell: SAC where a robotic cannon lost control and began shooting at the military.

    Is truth mirroring fiction now?
  • by noewun ( 591275 ) on Thursday October 18, 2007 @06:57PM (#21033871) Journal
    run like hell from our drum-fed, fully automatic robot overlords.
  • by MrKaos ( 858439 ) on Thursday October 18, 2007 @06:59PM (#21033907) Journal
    seems a bit stoopid

    By the time the gun had emptied its twin 250-round auto-loader magazines, nine soldiers were dead and 11 injured.
    Was it necessary to fill both magazines for a test fire? Or, for that matter, in a live-fire test, perhaps have some sort of abort system ready - even if it just cut the power to the control systems?

    Maybe fill the magazines on the 5th live fire test???

    Just sayin, ya know.

  • by johnnywheeze ( 792148 ) on Thursday October 18, 2007 @07:03PM (#21033965)
    Guess the NRA has to change the slogan... Guns DO kill people!
  • by mrscorpio ( 265337 ) <twoheadedboy@stonepoo l . c om> on Thursday October 18, 2007 @07:12PM (#21034081)
    Dear,

    It is my humble pleasure to write this letter irrespective of the fact that you do not know me. However, I came to know of you in my private search for a reliable and trustworthy person that can handle a confidential transaction of this nature in respect to our investment plans in real estate. Though I know that a transaction of this magnitude will make any one apprehensive and worried, but I am assuring you that all will be well at the end of the day. Let me start by first, introducing myself properly to you. I am Peter Okoye, a Branch Manager at one of the standard trust bank in South Africa. A foreigner, Late Nicholas Owen, a Civil engineer/Contractor with the federal Government of South Africa, until his death three years ago in a ghastly automated robot accident, banked with us here at the standard bank South Africa. He had a closing balance of USD$25.5M (Twenty five Million, Five Hundred Thousand United States Dollars) which the bank now unquestionably expects to be claimed by any of his available foreign next of kin. Or, alternatively be donated to a discredited trust fund for arms and ammunition at a military war college here in South Africa. Fervent valuable efforts made by the standard trust bank to get in touch with any of late Nicholas Owen's next of kin (he had no wife and children) has been unsuccessful. The management under the influence of our chairman and board of directors, are making arrangement for the fund to be declared UNCLAIMABLE and then be subsequently donated to the trust fund for Arms and Ammunition which will further enhance the course of war in Africa and the world in general. In order to avert this negative development. Myself and some of my trusted colleagues in the bank, now seek for your permission to have you stand as late Nicholas Owen's next of kin. So that the fund (USD$25.5M), would be subsequently transferred and paid into your bank account as the beneficiary next of kin through our overseas corresponding bank. All documents and proves to enable you get this fund have been carefully worked out and we are assuring you a 100% risk free involvement.

    Your share would be 30% of the total amount. While the rest would be for me and my colleagues for purchase of properties in your country through you/your Company. If this proposal is OK by you, then kindly get to me immediately via my e-mail (pokoye_mg@mail.com) furnishing me with your most confidential telephone and fax, so I can forward to you the relevant details of this transaction. Thank you in advance for your anticipated cooperation.

    Best Regards.

    Peter Okoye

    Branch Manager,

    STANDARD TRUST BANK SOUTH AFRICA
  • by flyingfsck ( 986395 ) on Thursday October 18, 2007 @07:17PM (#21034141)
    In a previous life I worked on the predecessor of those guns and I have been to many tests. Problems were usually due to stupidity somewhere along the line, not due to failures. I suspect that these are still the exact same guns, totally refurbished and with new electronics.

    The guns move *very* fast and fire at a *very* high rate (a similar firing rate to an assault rifle, but with much larger projectiles). Just getting sideswiped by the moving barrel can kill an operator.

    The projectiles actually have various safeties:

    * Launch G force
    * Spin
    * Time delay
    * Self destruct

    The gun also has protection in the form of no-fire zones - to prevent exactly this kind of accident. These no-fire zones must also have malfunctioned. I find it surprising that the projectiles exploded, but the article is not clear; maybe the safeties worked and they did not explode. The problem is that they still move at supersonic speed, and when they impact something close to the gun, the projectile and whatever it hits will break up even if it doesn't explode.

    So I feel sorry for the operators, and I hope that whoever wrote and tested that buggy code has already been fired too.
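
    For the curious, a "no-fire zone" is conceptually just a list of forbidden azimuth/elevation sectors checked before the fire command is allowed through. A purely hypothetical sketch - the sectors, names and numbers are invented for illustration, and the real implementation on these guns is obviously not public:

    def angle_in_sector(angle_deg, start_deg, end_deg):
        # True if the angle lies in the sector swept clockwise from start to end,
        # handling wrap-around at 360 degrees.
        a, s, e = angle_deg % 360.0, start_deg % 360.0, end_deg % 360.0
        return s <= a <= e if s <= e else (a >= s or a <= e)

    # Hypothetical forbidden sectors: (azimuth start, azimuth end, max elevation).
    NO_FIRE_SECTORS = [
        (170.0, 250.0, 90.0),   # rear arc where the crew stands - any elevation
        (80.0, 100.0, 5.0),     # adjacent emplacement - low elevation only
    ]

    def fire_inhibited(azimuth_deg, elevation_deg):
        return any(angle_in_sector(azimuth_deg, s, e) and elevation_deg <= max_el
                   for s, e, max_el in NO_FIRE_SECTORS)

    print(fire_inhibited(200.0, 30.0))   # True  - pointed into the rear arc
    print(fire_inhibited(10.0, 30.0))    # False - clear of the defined sectors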
  • by stor ( 146442 ) on Thursday October 18, 2007 @07:30PM (#21034293)

    The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.


    You call this a glitch? We're scheduled to begin construction in 6 months. Your temporary setback could cost us 50 million dollars in interest payments alone!

    -Stor
  • Historical precedent (Score:3, Informative)

    by earthforce_1 ( 454968 ) <earthforce_1&yahoo,com> on Thursday October 18, 2007 @08:26PM (#21034857) Journal
    It isn't unusual for even not so smart weapons to turn on their handlers. There are lots of very old historical precedents.

    A few years back, a cadet had his hands blown off by a cannon at Fort Henry, Ontario. While he was tamping down the powder charge, a few leftover embers from a previous shot touched off the powder and blasted away the tamping rod with his hands attached. Apparently this was a common way to be injured or killed on wooden warships.

    It was not unusual for soldiers to be killed by accident with US Civil War Gatling guns, which lacked a mechanism for locking the crank in place. As a result, the crank would occasionally make a quarter turn or so under the force of gravity, popping off a few rounds. Tough beans for anybody unlucky enough to be in front of it. Automatic weapons can also "cook off" a round just from the heat of prior sustained firing.

    The Forrestal fire http://en.wikipedia.org/wiki/USS_Forrestal_(CV-59) [wikipedia.org] of 1967 started when a freak electrical surge caused an F-4 to launch a missile across the deck, puncturing the fuel tank of another plane loaded with live munitions and touching off a chain reaction that ultimately killed 134 of the crew.

    HERO (Hazards of Electromagnetic Radiation to Ordnance) http://usmilitary.about.com/od/glossarytermsh/g/h2814.htm [about.com] has long been a concern for the military.

  • by Genda ( 560240 ) <(mariet) (at) (got.net)> on Thursday October 18, 2007 @08:39PM (#21034965) Journal

    In point of fact the gun worked perfectly; it was just ill-advised to use the "Dick Cheney" AI personality for live testing.

    -And lo, the lawyers ran like rabbits... and it was a good thing...

  • by Crypto Gnome ( 651401 ) on Friday October 19, 2007 @12:30AM (#21037095) Homepage Journal
    • This is a weapon
    • The intended purpose of weapons is to kill people
    • They were military personnel
    • The intended purpose of military personnel is to die horribly

    er, statistically speaking, of course.

  • Gentlemen (Score:3, Funny)

    by hcdejong ( 561314 ) <hobbes.xmsnet@nl> on Friday October 19, 2007 @02:21AM (#21037827)
    after careful consideration I've come to the conclusion that your new defence system sucks.
