
Russia Wants To Send A Gun-Shooting Robot To The ISS (mashable.com) 141

"Just in time for the rise in global military tensions, Russian officials have released video that's sure to calm fears all around: a death dealing humanoid robot that shoots handguns." An anonymous reader quotes Mashable: Posted to Twitter on Friday by Russia's deputy Prime Minister, Dmitry Rogozin, the video shows the country's space robot FEDOR (Final Experimental Demonstration Object Research) accurately shooting twin pistols in a scene chillingly similar to images from The Terminator. But rather than being displayed as a not-so-subtle warning to the entire human population of the planet, Rogozin instead claims via Facebook that it's just a demonstration of the robot's dexterity and use of algorithms to execute tasks.
CNET quotes Russia's deputy prime minister as saying "We are not creating a Terminator, but artificial intelligence that will be of great practical significance in a lot of spheres." Russia plans to deploy the robot on the International Space Station by 2021, Mashable reports, adding "Hopefully, the robot's arrival on the ISS will come sans life-snuffing weaponry, which is pretty much the opposite of the intent behind creating a peaceful international space station shared by the world's super powers in the first place."
  • I'm pretty sure they still have some NR-23 autocannons in a warehouse somewhere if they'll ever feel the need to weaponize the robot. ;)
    • by ls671 ( 1122017 )

      Exactly, that sounds fine to me at first glance, as it would to all participants in the ISS. As a matter of fact, the "I" stands for International.

  • This was a missed opportunity to call the robot FEDORA. The acronym was tortured enough, but for no reason. That'd raise the question... does it run Linux?

    • by Thud457 ( 234763 )
      They should have named the damn robot HECTOR and sent him to the space station with whoever is today's equivalent of Farrah Fawcett.
  • by Anonymous Coward on Monday April 17, 2017 @03:47AM (#54247577)

    Because the press gave a big yawn when they announced this

    FEDOR was displayed last year drilling into a pile of cinderblocks and touted as an assistant to astronauts during space travel.

  • For a moment there I had my hopes up, as I misread that as them deploying it against ISIS. We must actually be pretty close to being able to deploy remotely controlled ground infantry to deal with them. I hate the idea that robots would dehumanise war so that those that instigate it feel no consequences, but in the case of ISIS I would put my moral beliefs to one side.
    • Re: (Score:3, Insightful)

      by Bomazi ( 1875554 )

      What century are you living in? Cruise missiles (which result in no casualties for their user) have been used without hesitation for decades in situations in which ground troops would not have been sent. Similarly, bombing missions are carried out against ISIS precisely because they don't have the necessary anti-aircraft defenses. Same with the more modern UCAVs. Arms manufacturers invest heavily in the development of unmanned weapons because they are expensive and useful to politicians who want to be able

      • Re:misread as ISIS (Score:4, Insightful)

        by Rei ( 128717 ) on Monday April 17, 2017 @05:37AM (#54247713) Homepage

        Cruise missiles and drone bombings do not substitute for ground troops. The GP is clearly talking about a robotic substitute for ground troops.

        • Re:misread as ISIS (Score:4, Insightful)

          by DarkOx ( 621550 ) on Monday April 17, 2017 @08:11AM (#54247975) Journal

          The ethical issue, however, is mostly the same. It is push-button warfare. One side can kill the other without facing death themselves. Machines still do the killing (whether a gun-toting robot or a shell on the end of a guided missile with an altitude trigger) at a remote site where there isn't a human being to look the other guy in the eyes and possibly change his mind.

          Consider the MOAB. We killed 36 ISIS 'fighters.' Were they all fighters, or were two of them just guys ISIS grabbed and told, "you'll be cooking our meals or we kill your family"? Those two hypothetical individuals are men who might have surrendered to ground troops when the position was eventually overrun. Those are lives that might have been spared; instead they got incinerated like everyone else! Again, this is just my imagination; it is probably more likely that every one of the guys holed up in those caves was a committed Islamist determined to kill anyone standing in the way of their caliphate.

          Maybe one day our or Russia's 'terminators' will have the capacity to capture or kill. Hopefully, by the time these are deployed in the field they will have enough visual recognition to see if someone is carrying what appears to be a firearm and shoot them, and maybe not kill the little girl carrying a bucket of water. That kind of target recognition is far from simple, however. Maybe that isn't a bucket of water; maybe it's a bucket of acid? Humans are somewhat good at figuring that stuff out; machines have a ways to go yet.

          • Consider the MOAB. We killed 36 ISIS 'fighters.' Were they all fighters, or were two of them just guys ISIS grabbed and told, "you'll be cooking our meals or we kill your family"? Those two hypothetical individuals are men who might have surrendered to ground troops when the position was eventually overrun.

            To be fair, the MOAB is totally awesome.

            • by DarkOx ( 621550 )

              To be fair, the MOAB is totally awesome.

              I know you were being tongue in cheek, but you are not wrong. It is awesome that we can eliminate an enemy position without risking the lives of our service personnel!

              I did not mean my post to suggest the right/moral course was to send ground troops into that mess of caves. I would have used the bomb too, honestly, if the president ordered me to take out that position. I simply was observing that there is a moral hazard to push-button war of any sort. It makes it an easier call to kill people when you k

              • What worries me more is: does all this stuff make it too easy to decide to go to war in the first place?

                It does, but it shouldn't. The standard should be "war means innocent little babies are going to die. Is this action worth it?" And the answer to that is almost always "no."

                or should we keep the blood off our hands, even when that means sitting by and allowing injustice and even atrocities to occur?

                Generally yes, we should keep the blood off our hands, because eventually if everybody decides they don't want blood on their hands the wars kind of stop.

                Obviously it depends on what kind of ethics you practice. If you're a utilitarian or consequentialist then you're going to start trying to predict the future about how many people you'r

                • by DarkOx ( 621550 )

                  That is interesting and it confirms neurologically what psychologists have suspected for a long time. Certain religions attract the "healthy minded." Which sounds nice but really isn't a value judgement. You could call it a certain tolerance for injustice if you will. The healthy minded individual says, "I am not responsible if I am not personally involved and things are not necessarily in my power to improve". Catholicism mostly falls into this line of thinking. Other people see an unjust situation

                • Obviously it depends on what kind of ethics you practice.

                  For a Buddhist it is fairly simple; did you use less violence than what you were trying to prevent others from using, and in the end did you reduce the overall level of suffering?

                  In the case of dropping the MOAB on a terrorist base, I think it clear that this passes Buddhist moral and ethical analysis. Killing 2 dozen people with a bomb is less violent than letting those people take over whole cities and murder a significant percent of the residents, which is what ISIS has done in other places. Also, being

                  • If you can stop a missile with a bullet, do it!

                    Or, if you can stop a bullet with the largest non-nuclear armament ever deployed in combat, do it!

                    • Hey derp-stick, you consider the crimes of ISIS to be equivalent to a bullet in that analogy? So a whole training camp dug into the mountains, for a group that has done the things they have in Iraq and Syria, trying to take over a new area and commit those same crimes against a new group of civilians, that's just 1 bullet to you? hurdurrrrrrrrr to you to, maaaaaaaaaaan.

                    • You should read less between the lines. I didn't say anything about the right / wrong of the action. I said the analogy of stopping a bigger weapon with a smaller weapon is exactly the opposite of the actual example. But don't let this get in the way of your ranting. Carry on soldier!

                    • I didn't read "between" the lines, I simply carried the context from a comment to the reply. If your reply didn't intend to carry the context of what it was replying to, then I can't help you with that.

                    • All I can say is, enjoy your delusion. Imagining people think things and then yelling at them for it is FUN!

                • by Jzanu ( 668651 )
                  As a Catholic it is still more complex. You need to familiarize yourself with the concept of Just War [vatican.va]. Lots of soldiers, probably even most soldiers worldwide, are Catholic if they are not Muslim, and have been throughout history since the establishment of Christianity.
              • There is always the potential for civilian casualties, especially with large-impact remote weapons systems. The next question is what counts as a legitimate target. If you are fighting a war, I don't think all civilians are really off limits as targets. What about the guy working in the tire factory, or the oil field? His effort supports the war effort, and he knows it too but is still working there; is he a fair target? What about the farmer tending his wheat field? I would say yes! If taking out that facility means either sweeping it with ground forces and losing American lives, or bombing / sending in a terminator unit and killing some 'enemy' non-combatants, better them than us would be my call.

                You make a good war criminal. The problem with going down that route is that everybody loses. Civilization stagnates as infrastructure and knowledge is continually destroyed, so we return to a period where we have fragmented tribes at total war with each other. You don't think America will be able to maintain military superiority forever, do you?

                • by DarkOx ( 621550 )

                  One nation's war criminal is another's hero. It's always been that way. There are a few lines Westerners have generally agreed should not be crossed, but usually it comes down to intent. Yes, it's wrong to target civilians; it's not always wrong to target assets even when that means civilians will die.

                  Dropping incendiaries on wheat fields and automotive plants because you can is clearly wrong. Doing it because you *need* to disrupt the enemy's supply chain so as to save the lives of your own servicemen, does not see

          • The ethical issue however is mostly the same.

            The ethical issues are exactly the same in every situation, and it does not matter if the killing is done with a rock in your hand, remotely by pulling the trigger on a gun, or remotely by pressing a button on a joystick.

            Notice that the automation added by a robot is the same automation that was already added by a gun!

            Airplanes and bombs are also the exact same type of automation.

            A person decides to kill another person, and acts on it. That is the entire moral and ethical issue, and the technology used mak

            • Stop blaming the bow and take responsibility for your actions.

              Does it change things if I'm drunk while shooting the bow? I think it does.

            • by DarkOx ( 621550 )

              the technology used makes no difference

              In some absolutist systems, perhaps, but most of us live in a little more grey a world. It's not a simple question of minimizing the absolute violence in the world. There are other considerations. When is it our fight? Only when we are threatened? When those we consider to be the victims are unable to defend themselves? When our 'allies' are threatened?

              Does it matter that we have a volunteer armed services, that our people had a choice to be put in harms way. What if we have a draft again in the future does t

              • All of your questions are pure moral and ethical questions and the technology used has no bearing at all on the answers.

            • You don't think robots will eventually decide when to fire? It could happen soon!

              • Obviously, all the moral responsibility is still on the human that presses the "on" button, or who turns off the safeties and places it in combat mode. Duh.

                Even there you don't find any new moral or ethical issue.

              • The USN has had air defense guns on ships that will fire automatically for a long time. The day-to-day human control is switching the things on and off.

          • One side can kill the other without facing death themselves

            The men that make decisions regarding war have always been able to do so without facing death themselves. Nothing new here.

          • The large number of friendly fire and collateral damage reports suggests that humans are not very good at target recognition. I suspect that AI can do better.

      • Since when did cruise missiles count as ground infantry? Must be a new type of missile I have not heard of.
    • Re:misread as ISIS (Score:4, Insightful)

      by Kjella ( 173770 ) on Monday April 17, 2017 @08:15AM (#54247991) Homepage

      We must actually be pretty close to being able to deploy remotely controlled ground infantry

      I doubt it; I don't think we have anything that comes close to passing a military obstacle course. Think of, say, an urban area where the task of the robot(s) is to secure a building: it has to execute so many practical little details, and just using a door handle is challenging outside the lab. The other big deal is stealth: if you look at the robots today, most of them are quite loud with hydraulics and such, so they couldn't sneak up on anyone. There's a reason all soldiers do hand signals and all that shit; if they just trampled in they'd be easy targets. Jamming would also be a big issue; it's not that hard to create a big noise generator that could be triggered by a trip wire or such. Outdoors, a jamming device trying to bring down a drone makes you a big target to shoot at, but in the basement of a building your robo-soldiers would be cut off without any easy means to restore contact. For sure robots could do more, but I think we're dependent on humans in close proximity for quite some time to come.

      • There are basically three kinds of military robot which are useful today, walking bombs, flying drones, and cruise missiles. We are only building two out of three of these today. The kind we're not building (walking bomb) is not particularly susceptible to jamming; if you jam it, it just blows up. Once it's safely away from the operator, you can arm that functionality.

        • by Kjella ( 173770 )

          The kind we're not building (walking bomb) is not particularly susceptible to jamming; if you jam it, it just blows up. Once it's safely away from the operator, you can arm that functionality.

          Until IS tricks you into sending it into a building full of civilians, that would be a PR nightmare. Basically it would be a very slow and impractical delivery mechanism after you've decided it's safe to blow it up. It's different when you don't give a shit about accidental or collateral damage, then you can just strap a stick of C4 to a RC toy and wire the detonator like a dead man's switch to make it a walking bomb. When you lose signal or turn it off it's going to blow up something.

      • If you think of say an urban area and the task of the robot(s) is to secure a building it has to execute so many practical little details, just using a door handle is challenging outside the lab.

        The Russians apparently solved the door handle problem a while back... [livescience.com]

  • RoboGunslingers from Outerspace!

    Humanity shall endure! ;)

  • by MrCodswallop ( 4739399 ) on Monday April 17, 2017 @04:10AM (#54247603)
    Prelude to hurling rocks from space?
    • Now you're on to some serious fire-power in the form of mass drivers. (A mass driver or electromagnetic catapult is a proposed method of non-rocket space launch which would use a linear motor to accelerate and catapult payloads up to high speeds.) No need for explosives of any kind: a dense payload at enormous speed produces enormous damage at the other end, regardless of the type of material standing in the way. Everything smashed, including machines!
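      To put rough numbers on that claim (these are my own illustrative figures, not from the comment above; a one-tonne payload and low-Earth-orbit speed are assumptions), a quick sketch comparing the kinetic energy of a dense payload at orbital speed with an equal mass of TNT:

      ```python
      # Illustrative only: kinetic energy of a hypothetical dense payload
      # arriving at roughly low-Earth-orbit speed, versus TNT.

      def kinetic_energy_joules(mass_kg, speed_m_s):
          # KE = 1/2 * m * v^2
          return 0.5 * mass_kg * speed_m_s ** 2

      TNT_J_PER_KG = 4.184e6   # standard energy density of TNT

      mass = 1000.0            # hypothetical 1-tonne payload
      speed = 7800.0           # ~LEO orbital velocity, m/s

      ke = kinetic_energy_joules(mass, speed)
      tnt_ratio = ke / TNT_J_PER_KG / mass

      print(f"Kinetic energy: {ke:.3e} J")
      print(f"Equivalent to about {tnt_ratio:.1f}x its own mass in TNT")
      ```

      So even before any re-entry losses, a purely inert payload at orbital speed carries several times its own mass in TNT-equivalent energy, which is the whole point of the mass-driver idea: no explosives needed.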
  • by wisebabo ( 638845 ) on Monday April 17, 2017 @04:15AM (#54247611) Journal

    In the video it was just shown holding some pistols and shooting them at some targets. Very little body movement; it didn't walk anywhere. Looked like a locked-down animatronic (a hydraulic-powered robot) from a theme park (I should know, I used to design them for theme parks).

    While the reality of it might be much more impressive, I didn't see it on (this) video. However, there was a link on the page which led me to some official footage (after there was a leak) of Boston Dynamics' fantastic (and scary!) wheeled/legged robot. You've really got to see it:

    http://mashable.com/2017/02/28... [mashable.com]

    Since Boston Dynamics' robot can apparently easily handle a 100 lb. object, it wouldn't be too hard for it to wield a really serious gun. When A.I. becomes sentient, we'd better hope that they're friendly. Anyway, if they could adapt this robot for zero-gee (replace the wheels with grappling hands? A tail? Like Doc Ock?) I would imagine it would be much more useful (and terrifying) on the ISS.

    • by Rei ( 128717 ) on Monday April 17, 2017 @05:31AM (#54247707) Homepage

      I agree, that video looked rigged as hell. I've seen a lot more convincing animatronics in theme parks.

      What exactly were the Hollywood-style "explosions" supposed to be, anyway? Were we supposed to believe that they were something that the "robotic tank" was shooting, were they supposed to be smoke screens, or were they just added for ambiance?

      Seriously, I've seen better robotic gun platforms made by Syrian rebels living in rubble [youtube.com].

      • I agree, that video looked rigged as hell. I've seen a lot more convincing animatronics in theme parks.

        To me this looks like a demo video designed to show the potential of the next generation of this technology, given further funding (which they are probably likely to get). I'm not worried about the robot soldier tech of today, but give it another 10 or 20 years of development and it's not going to be a joke anymore.

    • Any intelligence, artificial or otherwise, that looks at us and how we act will most likely not be friendly toward us, either for self-preservation or for moral reasons.

      • Any intelligence, artificial or otherwise, that looks at us and how we act will most likely not be friendly toward us, either for self-preservation or for moral reasons.

        Do What I Want or I'll Turn You Off (as I usually threaten my computer.) So why would it get upset or anything? I promise I'll call you in the morning and turn you right back on.

        Hmmm, there's even a double entendre there for a Sex Robot. What could go wrong?

        "If you had an off switch, doctor... would you not keep it secret?" - Data,

  • It's a silly mockup of a robot, with a couple of switches on the triggers. The things Boston Dynamics have been building are far more alarming than this. But leave it to dishonest russophobic websites to use this to make everyone fear the dangerous Russians.

    • by Anonymous Coward

      "Components. American components, Russian Components, ALL MADE IN TAIWAN!"

  • by ctrl-alt-canc ( 977108 ) on Monday April 17, 2017 @04:57AM (#54247665)

    ...now cosmonauts have to worry about FEDOR's attitude and software glitches ?

    • Hey, HAL 9000 worked fine. That's just what happens when you put an AI into a double-bind situation. It will solve the situation.

  • "Russia Wants To Send A Gun-Shooting Robot To The ISS "

    Obviously a mistake, the last 's' has to be removed.

  • Why humanoid? (Score:5, Interesting)

    by Henriok ( 6762 ) on Monday April 17, 2017 @06:57AM (#54247827)
    Can someone explain to me why a humanoid configuration would be the best form for such a robot? Seems to me that the form we have has some pretty severe downsides that could be designed out if we had the opportunity to redesign ourselves for military purposes. We fall over easily, have a hard time getting up, have limited locomotion options, have limited FOV, have only two tool-manipulating appendages with limited articulation and action range, don't float, can't jump well or fly, have an exposed compute/sensory hub, etc. I'd go with a more octopus-like configuration: for example, six appendages with independent compute/sensory complexes that can be used for locomotion, recon, tool manipulation, or just as backup. It would look nothing like a bipedal humanoid.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Because things like tools, doorknobs, handguns, chairs/seats, etc. are all designed for the human body. A new design, such as extra legs, may not be compatible with a wide range of current human technology.

      • by Henriok ( 6762 )
        You don't need to be bipedal, have two arms, and have a head to do any of the things you mentioned. An octopus-like robot with highly articulated arms/legs, and with ditto hands, could operate doorknobs, guns, sit, drive cars, wear pants, use hand signals, open bottles, etc.
    • by Anonymous Coward

      A human form factor allows humanoid robots to use human tools, access areas intended for humans, use human transportation, etc.
       
      An integrated weapon would be more effective, but it would need to be a new design. Using a handgun designed for humans accomplishes roughly the same objective but doesn't need a new object to be designed, built, tested, replaced, and maintained.

      • by Henriok ( 6762 )
        An octopus-like robot with highly articulated arms/legs, and with ditto hands, could operate doorknobs, guns, sit, drive cars, wear pants, use hand signals, open bottles, etc. You don't need to be bipedal, have two arms, and have a head to do any of the things you mentioned.
        • Except now you have to design all of the arms/hands to have the durability to do all of those things. Right now, they can design the finger joints to apply 5-10 pounds of force. If they also acted as "feet" (on Earth) they would have to be able to support the entire weight of the robot.

          Also, you are assuming that we can make the arms as articulated as an octopus - without breaking wiring, hydraulic tubing, or whatever else we need to run through the arms to the "hands".

    • yeah, and don't get me started about the plumbing !
    • by neoRUR ( 674398 )

      It's not that the human form is the best; it's just that we can't even get them to do anything yet, so it's a start. Once we solve all those problems, then you will see many more designs. Yes, walking is hard, climbing stairs and opening doors is hard, and the massive metal bodies are not easy to move.

      Soft robots will come soon.

      Also, the world is built for humans: seats are built so humans can recline in them with a seat belt, and the heights of things, like elevators and doors, are all built around a person's space. If yo

    • Although we know all this, we still think we (humans) are so perfect, and the world has to spin around us. As for FEDOR, well, I saw similar robots at a robot fair around 2 years ago. And why would they need a gun-shooting robot on the ISS? To shoot the crew?
  • by Anonymous Coward

    Sensationalized title, lying by trying to get people to believe Russia was sending an armed robot to space, when in fact it was just a tool that could, possibly, fire a gun.

  • Could it be that much worse than the current ruling class?

  • I can so see Yul Brynner being played by Putin and the robot playing itself in endless role-switched gunfights where the robot always loses because it's shooting blanks.
  • They mean "gun-firing robot". A "gun-shooting robot" is a robot that fires bullets at guns.

  • We can't just have the robots sort tiny screws, can we? No, we have to do Dirty Harry.

  • ...yeah.....not buying that. You'd have to be a moron to believe that.
  • Cyberdyne didn't specifically set out to create a "kill all the humans" T-800 Terminator either, nor was it part of their 5-year plan to create an AI that would launch a nuclear war and destroy humanity. The "law of unintended consequences" and all; they DID deploy a few T-70 units before Judgement Day. It had no neural net processor. Being in space, though, it is very "iffy" if a T-800 (or any with a coltan exoskeleton) could survive re-entry. Tantalum has a melting point of 3,020C, re-entry could hit th
  • I'd have a wrist eyeball behind the gun sight and an encapsulated magnetically operated trigger that makes a captured pistol hard for non-robots to fire. I suppose it'd be nice for the robot to be able to fire a weapon that it captured, but the pistol in the photo doesn't look like the preferred weapon of its enemy. A good demo of algorithmic superiority would be for the robot to pick up and fire a cheap old pistol, miss and put subsequent shots exactly on target by exactly compensating for the amount of th
  • They can have human temperature figures go past the place of question first before the rest proceed. If they find an automatic gun, they kill the nest.
