Robotics | The Military | Hardware

Not Quite a T-1000, But On the Right Track (159 comments)

Posted by Unknown Lamer
from the dancing-robot-death-machines dept.
New submitter misanthropic.mofo writes with a look at the emerging field of robotic warfare, adding: "Leaping from drones to recon 'turtlebots', humanity is making its way toward robo-combat. I for one think it would make it interesting to pit legions of robot warriors against each other and let people purchase time on them. Of course there are people that are for and against such technology. Development of ethical robotic systems seems like it would take some of the fun out of things; there's also the question of who gets to decide on the ethics."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward

    Either I'm not the geek I thought I was, or you, sir, are not. What is a T-100? Did you mean T-1000?

    One of us is turning in their geek card tonight.

    • by Anonymous Coward
      No, you fool. He was talking about a line of semi-autonomous, lethal pickup trucks. [wikipedia.org] Who's turning in what now?
    • T-101
    • by c4tp (526292)

      I thought the robots were the enemy in that reference?!? The last thing I want to hear about are turtlebots with frickin' laser beams attached to their heads.

      Hell, the Big Dog [bbc.co.uk] is scary enough for me.

      • Hell no. I know which is the winning side. I'm teaching the robots everything I can about other humans.

    • by Anonymous Coward

      Interesting. The Slashdot RSS feed has both versions of the headline... corrected and incorrect.

      • by BagOCrap (980854)

        Noticed it too. I just assumed the article submitter was progressing from T100 to T1000 network speeds.

    • by asylumx (881307)
      Obviously he was referring to the new line of lethal graphing calculators...
  • So perhaps the next cold war will be fought as bunches of skirmishes between the US and China in Third World countries, using flying, land and water robots to protect small numbers of imported humans controlling large numbers of worker robots. Rather like when ants go to war against each other: the first one to take over the other's nest and larvae becomes the winner, while the losers scatter.
    • by icebike (68054)

      And perhaps that will last just as long as it takes for one country to face defeat of its robots, whereupon a switch will be flipped and humans become a legitimate target for autonomous machines.

      You abhor drone strikes now? Wait till there is no human in the loop.

    • It may start off as robot wars, but it will quickly end in a thermonuclear exchange. They are the trump card. They will always be the trump card.

      • I'll bet 100 on cockroaches! What's the prize?
        • The prize is top spot in the foodchain of the future. I'm betting on mice to take it. They eat anything, they're small and clever, they won't be outside when the bombs start blowing and their favourite food is cockroaches so they'll always have something to eat. Perhaps we'll eventually have one tonne mouse predators hunting ten tonne mouse herbivores and even Rodent Sapiens making mouse warrior robots to fight their mouse wars for them against the Rodent Erectus scum.
    • by Ghaoth (1196241)
      Warning, warning, Will Robinson. Activating Skynet now...
    • by Seumas (6865)

      All the "robots" and machines in the world doing battle won't ever change the fact that only slaughtering young men and women (sent there, usually, by wealthy men closer to their death than their birth) really has an impact on societies and the need to push for or withdraw from war. Frankly, there's not even much demand for humans these days either, as witnessed by the last twelve years.

    • Re:Robot wars (Score:4, Interesting)

      by C0R1D4N (970153) on Tuesday March 05, 2013 @08:34AM (#43077565)
      In a war between robots, innocent civilians will be 100% of the casualties.
    • by whitroth (9367)

      You're living in a videogame world.

      In reality, the controllers - if any, if they're not autonomous 'bots - will be so busy trying to get the enemy drones that they'll slaughter the people who they're allegedly protecting - y'know, the folks whose country it is? - without even noticing.

      Come *on*, when a drone, or a cruise missile, targets in the middle of a village or city, you really think that the explosion, the shrapnel, and the falling bits and pieces of buildings and/or vehicles won't kill and maim innocents?

  • Am I the only one who misread as "...it interesting to pit pigeons..."? I was just thinking jeez, can't we just leave birds alone? First sparrows, now pigeons?
    • by fyngyrz (762201)

      it interesting to pit pigeons

      I may be winging it, but I think you're just squabbling about the title.

    • Actually pigeons [wikipedia.org] have a long military history; the Taliban still forbids their possession or use in Afghanistan. 34 pigeons were decorated with the Dickin Medal [wikipedia.org] for "conspicuous gallantry or devotion to duty while serving or associated with any branch of the Armed Forces or Civil Defence Units".

  • Sigh (Score:5, Insightful)

    by Kell Bengal (711123) on Monday March 04, 2013 @11:13PM (#43075373)
    Hello - robotics researcher here (specialising in UAVs). I wonder when these breathless articles about battlefield robotics will end. There is nothing new about battlefield robots - we've had Tomahawk missiles since the early 80s. It's just that these days we think about them as robots rather than as cruise missiles. Drone strikes? What about the missile strikes from the Gulf War? They were the champions of good and (along with stealth technology) the golden hammer of the Forces of Good.

    The only thing that has changed is greater penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and the landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.

    (To wit - the author of this article must not know that much about robotics if they're claiming "The turtlebot could reconnoitre a battle site". No it can't - it's a glorified vacuum cleaner. I just kicked the one in my lab. It can barely get over a bump in the carpet.)

    Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.
    • Re:Sigh (Score:4, Insightful)

      by Anonymous Coward on Monday March 04, 2013 @11:19PM (#43075411)

      The only thing that has changed is greater penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and the landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.

      Both machine guns and landmines are pretty easy to avoid: Go where they are not.
      The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.

      • Re:Sigh (Score:5, Insightful)

        by hairyfish (1653411) on Tuesday March 05, 2013 @12:02AM (#43075631)

        Both machine guns and landmines are pretty easy to avoid: Go where they are not.

        Like a movie theatre [wikipedia.org] for instance? a School [wikipedia.org] maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]

        • by nedlohs (1335013)

          There were no machine guns or landmines used in any of those, so yes.

          • No terminator robots either, yet people still got murdered. Weird, huh?
          • Most people consider semi-automatic rifles to be machine guns.
            • And those people are wrong.

            • by Phrogman (80473)

              The difference between being killed by a semi-automatic rifle and being killed by a machinegun (sub or otherwise) is lost on me. The point was that previous technologies have most likely killed more people than the newer technologies will (particularly as the newer technologies will most likely incorporate some of the older ones), not to argue whether the person got their terms exactly right.
              And in the popular mind I would agree, the distinction between semi-automatic and machine guns is generally

              • Re:Sigh (Score:5, Insightful)

                by rohan972 (880586) on Tuesday March 05, 2013 @07:24AM (#43077311)

                The difference between being killed by a semi-automatic rifle and being killed by a machinegun (sub or otherwise) is lost on me.

                If someone is specifically talking about the risk of being killed by one or the other it becomes relevant, otherwise not so much.

                The average person probably thinks the categories are: pistol, shotgun, rifle, machinegun, and that's pretty much it.

                I aspire to more intelligent discussion than the average person I suppose. I don't see how this is possible unless words are used correctly.

                When people with a political agenda of banning guns use incorrect terminology that confuses semi-auto with full-auto weapons, it seems like they are deliberately obfuscating the issue to exploit the average person's ignorance. That requires correction, unless you're in favor of deceiving people to sway their political opinion. I know that's a popular tactic to the point of being near universal, but I always live in hope of conversing with people who prioritize truth over their own opinion.

                • The average person probably thinks the categories are: pistol, shotgun, rifle, machinegun, and that's pretty much it.

                  I aspire to more intelligent discussion than the average person I suppose. I don't see how this is possible unless words are used correctly.

                  It's an age-old problem. Experts in a field will classify things differently than those who are not in the field. Are tomatoes a fruit, or a vegetable? It depends who you are talking to. Culinarily, a tomato is a vegetable because it is used in savoury dishes rather than sweet ones. Botanically, it is a fruit - a berry, actually - because it consists of the ovary of the plant. Different set of people; different definitions and classifications; same object being discussed. The average person is not incorrect

                • by Phrogman (80473)

                  Yes, and I used to despair of trying to get people to understand that their browser was not their "operating system", or that the box that contains their computer is not their "hard drive". The fact that I know the difference, the fact that it makes a great difference when you are trying to solve a problem on their computer, does not change the fact that they neither know the difference nor really care.
                  I am aware of the difference between a rifle, a semi-automatic assault weapon and a machinegun. I spent 10 y

            • by nedlohs (1335013)

              Most people are idiots.

        • Both machine guns and landmines are pretty easy to avoid: Go where they are not.

          Like a movie theatre [wikipedia.org] for instance? a School [wikipedia.org] maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]

          Yes, as explained in the second line of the quote - you know, the one you omitted.

          Both machine guns and landmines are pretty easy to avoid: Go where they are not.
          The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.

          I suppose I should commend you, in at least a left-handed fashion. You did manage to turn what is essentially an off-topic or redundant remark into a +5 Insightful by showing instances of what the parent to your post directly stated and which you glossed over. I guess it must be time we rehash the whole violence / gun violence / "assault weapon" topic in this discussion on robotics / drones, since I'm not sure it has otherwise come up.

          • You clearly missed the point the first time so I'll explain the joke to you. A man with a gun is equal to if not worse than a killing "device". In fact you could define a looney with a gun as a "killing device".
        • by MacDork (560499)
          Only the government should be allowed to control killbots and computer viruses, yes? I know, let's outlaw computer viruses. That will surely stop them.
      • Re:Sigh (Score:5, Insightful)

        by Jeremi (14640) on Tuesday March 05, 2013 @12:33AM (#43075819) Homepage

        Both machine guns and landmines are pretty easy to avoid: Go where they are not.

        Landmines have an annoying habit of being buried where you can't see them. This makes it difficult to ensure that you are going where they are not.

        • Re:Sigh (Score:5, Insightful)

          by camperdave (969942) on Tuesday March 05, 2013 @02:15AM (#43076237) Journal

          Landmines have an annoying habit of being buried where you can't see them.

          Plus they have the nasty habit of remaining active long after the conflict has ceased.

          • And in the rainy season, if they are on soft ground, they can get washed downhill to another place - even if their location was recorded in the first place (not always the case, whether by locals or superpowers).

            A friend in Cambodia says this is a real problem in hilly areas: dirt roads are cleared, and then after heavy rains you have to assume the road to the next town might be live with UXO again and has to be checked before you can drive out.

            Stuff that was dropped/ planted in 1975 is still killing people.

        • That's against the Law of Land Warfare: landmines have to be either marked or under direct observation. The number and location of each landmine has to be recorded so that the landmines can be accurately removed when the installing unit leaves, or responsibility transferred to the relieving unit.

          • by Jeremi (14640)

            That's against the Law of Land Warfare, landmines have to be either marked or under direct observation.

            Great, but people who go around planting land mines don't necessarily spend a lot of time worrying about how well they are complying with the Law of Land Warfare.

      • by gatkinso (15975)

        You know where the landmines are? All of them? The US military would love to pay you millions for your technique.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      I think the sticking point that you might be missing is that we're reaching a point where robot autonomy is becoming more mainstream, and conceivable in the mind of the public. There are certain ethical questions that go along with a program going from targeting something specifically to making decisions on potential targets. People see every day more and more advanced drones performing all sorts of little mini miracles of tossing sticks around and creating sm

    • by Paul Fernhout (109597) on Tuesday March 05, 2013 @12:03AM (#43075643) Homepage

      http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net]
      "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead? ... There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all."

      There are only so many hours in the day. If we put those hours into finding new ways to kill other people and win conflicts, we will not be putting those hours into finding new ways to heal people and resolve conflicts. Langdon Winner talks about this topic in his writings when he explores the notion of whether artifacts have politics.
      http://en.wikipedia.org/wiki/Langdon_Winner [wikipedia.org]

      Albert Einstein wrote, after the first use of atomic weapons, that everything had changed but our way of thinking. You make some good points about us long having cruise missiles, but on "forces of good", here is something written decades ago by then retired Marine Major General Smedley Butler:
      http://www.warisaracket.com/ [warisaracket.com]
      "WAR is a racket. It always has been. It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives. A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small "inside" group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes. ..."

      Just because it was "hot" before, with cruise missiles and nukes and poison gases, does not mean we will be better off when our society reaches a boiling point -- with robotic soldiers and military AIs and speedier plagues and so on. Eventually quantitative changes (like lowering prices per unit) become qualitative changes. Every year our planet is in conflict is a year of risk of that conflict escalating into global disaster. So, the question is, do our individual actions add to that risk or take away from it?

      I'm impressed with what some UAVs can do in terms of construction vs. destruction, so obviously there are a lot of different possibilities in that field.
      http://www.extremetech.com/extreme/107217-real-life-constructicon-quadcopter-robots-being-developed [extremetech.com]

    • by Anonymous Coward

      There is a difference between guided munitions and automated combatants eventually making the decision about when to attack.

    • by Anonymous Coward

      I think the point of contention is that the patriot missile did not decide to kill you, a human did. A completely autonomous machine programmatically deciding whether you should live or die seems like something completely different to me.

      • I would agree. I don't really see a difference between an autonomous robot deciding whether you should live or die and an engineered virus deciding whether you should live or die.

        Tim.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The other obvious issue is the "arms race" aspect to this discussion. If it is mandated that all robots are designed not to kill humans, you can guarantee that someone will make one that doesn't comply, or complies conditionally.
      Something about genies and bottles.

    • "There is nothing new about battlefield robots - we've had tomahawk missiles since the early 80s."

      And since when do Tomahawks decide what their targets should be based on general autonomous and situational considerations?

      A Tomahawk is not a robot; it is a tool.

      Well, there are still no robots on the battlefield, and it'll be a long time before there are, so this is more sci-fi than anything. But, answering the question "who gets to decide on the ethics", I thought that one was obvious: Isaac Asimov, of course!

      • by EdZ (755139)

        And since when do Tomahawks decide what their targets should be based on general autonomous and situational considerations?

        Most missiles (modern air-to-air missiles being an excellent example) fly to their targets inertially, then in the final homing phase look for the target that matches the pre-programmed target criteria, then home in on it for the terminal phase.
        The Tomahawk is given its route to the estimated target location and what its target looks like, then launched. The missile then autonomously finds its way (either through GPS or TERCOM), gets to the desired location, then picks out its target to hit. It does this
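
        For what it's worth, the phased behaviour described here (cruise to the target area, acquire a match against pre-programmed criteria, then terminal homing) can be sketched as a toy state machine. This is purely illustrative and not modelled on any real guidance system; the names `Phase` and `next_phase` are invented for the example:

```python
from enum import Enum, auto

class Phase(Enum):
    CRUISE = auto()    # navigate (inertially, GPS, or TERCOM) toward the estimated target area
    ACQUIRE = auto()   # sensor searches for an object matching pre-programmed criteria
    TERMINAL = auto()  # home in on the selected match

def next_phase(phase: Phase, at_target_area: bool, match_found: bool) -> Phase:
    """Advance the guidance state machine by one tick."""
    if phase is Phase.CRUISE and at_target_area:
        return Phase.ACQUIRE
    if phase is Phase.ACQUIRE and match_found:
        return Phase.TERMINAL
    return phase

# A run that reaches the target area and then finds a match:
phase = Phase.CRUISE
phase = next_phase(phase, at_target_area=False, match_found=False)  # still cruising
phase = next_phase(phase, at_target_area=True, match_found=False)   # start acquiring
phase = next_phase(phase, at_target_area=True, match_found=True)    # terminal homing
print(phase)  # Phase.TERMINAL
```

        The point of the sketch is that no human appears anywhere in the transition function: once launched, the decision to commit to a particular match is made entirely by the pre-programmed criteria.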

    • Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.

      I'm more interested in the imbalance of power robots have the potential to create. Not an imbalance between countries, but between the personal power of a few and the rest of us. What if Tony Stark or Superman were real, and complete sociopaths to boot? Why wouldn't they rule this rock like god-kings? When a few wealthy people or politicians can remote-control an entire army sans restriction, we're going to start seeing a new and very ugly kind of tyranny emerging. Maybe not in western democracies, hopefully

    • Don't forget - the machine gun and landmine have killed far more people than drones likely ever will.

      As have carpet bombing and guys with swords. As long as they're controlled by humans (at least during targeting) they aren't true robots.

      ...and yes, our leaders have been sending drone strikes and thousands of troops to go kill people based on a pack of lies. Afghanistan harboring Osama? No, it was Pakistan, the same country we keep sending huge bundles of cash and free F-16s. Iraq having weapons of mass destruction?

    • I think it isn't quite so simple. Machine guns are clearly under the control of the guy at the trigger - he and his command chain have absolute responsibility for their actions. As robots become more autonomous, it will become less clear who is responsible for mistakes. When an automated drone mistakes a school bus for a tank, is it the fault of the drone? The guy who programmed the image recognition in the drone? The safety "logic" in the drone? The commander who ordered that particular drone into the field?

    • by gr8_phk (621180)

      Don't forget - the machine gun and landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated.

      Don't forget that landmines were not just debated; for the most part they have been banned, IIRC. The ethical considerations were not debated - the problem was clear - what went on for a long time was deciding what to do about them.

      And let us not forget, a large scale robo

  • by Lisias (447563) on Monday March 04, 2013 @11:15PM (#43075387) Homepage Journal

    Billions of dollars can be deactivated by a simple PEM.

    You know... the bombs that emit an electro-magnetic pulse that disables everything that is digital...

    They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...

    • That's why the Russians still make vacuum tubes and possibly still use them in military equipment. Vacuum tube circuits are much more resistant to EMP attacks, and as I recall the Soviets designed a few EMP weapons, although of course a nuclear bomb does a fine job as it comes.

      Small EMP devices are very easy to make, and the designs for basic circuits were easily available in any large book store the last time I looked, maybe eight years ago, while browsing for other books about electronics. They always gave me a cold shiver w
    • by viperidaenz (2515578) on Monday March 04, 2013 @11:36PM (#43075503)

      You know... the bombs that emit an electro-magnetic pulse that disables everything that is not adequately shielded...

      FTFY.
      I think you also mean EMP, not PEM.

    • by geekmux (1040042)

      Billions of dollars can be deactivated by a simple PEM.

      You know... the bombs that emit an electro-magnetic pulse that disables everything that is digital...

      They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...

      Perpetual revenue streams are the end goal here. Destruction of hardware is a by-product of that, and thus is a goal as well. We're not sending drones to a battlefield to teach them patty-cake, although that would be one of the funniest hacks ever witnessed by man ("Did that drone just try to mount my drone?")

  • "When people stop fighting battles for themselves war becomes nothing more than a game." -- Quatre
  • ... obviously! Some things will never change.
  • ... I for one think it would make it interesting to pit legions of robot warriors against each other...

    It's been done. [wikipedia.org] According to the all-wise Wikipedians, "Storm 2" was the last world champ.

  • Of course there are people that are for and against such technology.

    They should find something else to occupy themselves with, because the technology will happen regardless. As long as we have a world of competing nation states (generally a good thing), if one doesn't develop a technology to its full military potential, another one will. Even nuclear weapons are slowly but surely proliferating despite major technological difficulties and the most intensive legal/diplomatic etc. efforts ever made to p

    • Most of it is ridiculous, whether it's based on "rigorous rational thinking and an axiomatic approach" or not (actually not). Besides, which school of ethics do you apply? Consequentialists (ends justify the means, a.k.a. screw the principles), utilitarians (happiness of the majority is key, a.k.a. screw the minority), pragmatists (whatever society decides... so much for rigorous rational thinking), etc. etc.

      • by zedrdave (1978512)
        You are mixing personal ethics and ethical theories that can be applied to a community (of people/countries).

        Given the question at hand, I'll venture a wild guess and say Military ethics [wikipedia.org] are most applicable. You might know them (in large part) as the Nuremberg Code, the Helsinki Declaration or the Geneva Convention. In the modern mainstream world (outside of religious/political nuts), there isn't a lot controversial about them. That is, until a country decides that breaking them might possibly give them an upper hand
        • by Sigg3.net (886486)

          *sigh*

          Put it like this: ethics is the study of the human being. With that in mind, and with a hypothetical ultimate morality as a goal, the truth of what we are must guide us. The endeavour must be scientific, which entails that some ethical theories are empirically false (Hobbes and children being inherently flawed, for instance), while appeals to religion are cultural and will often prove moot (understood as early attempts at the same task).

          How will a discussion about the "ultimate morality" come about with any h

          • While I'm sympathetic to discourse ethics and wish you all the best in your intellectual endeavors, I have some critical remarks.

            First remark: according to your definition, ethics is the same as anthropology. That's not enough. Starting with Plato and Aristotle, reasonable moral philosophers have always taken into account empirical data about morality (e.g. akrasia, Plato's concrete suggestions for the education of philosophers, the marxist theory/praxis problem), but there is still the problem of the naturalistic fallacy

            • by Sigg3.net (886486)

              You are correct, of course, but not entirely. The independent disciplines carry out the very fieldwork that philosophy in turn responds to and absorbs, and they are also instrumental in changing how we understand the world, which may again change or create new disciplines. When there is a meeting point between a controversial philosophical claim and new evidence or discovery in another field, it is hardly coincidental.

              Anthropology is very interesting to philosophers because its detailed study of the specifics illuminates the t

    • As a philosopher who currently works at the borderlines between philosophy of language, logic, and ethics (work that overlaps with AI research, occasionally in collaboration with computer scientists), I have something to say about that. You might not like it, though.

      Ethics is neither rigorous nor particularly rational, nor is most of it axiomatic. Rationality has been undermined for decades now by recent trends like 'moral intuitionism', 'moral contextualism', and 'moral particularism'. These a

      • by Sigg3.net (886486)

        Doing an MA in ethics and political philosophy here.

        I wrote a comment above so I will be brief. Philosophers in my field deal with ethics qua an endeavour to discover the truth of humanity. Politicians and public offices paradigmatically deal with ethics (today) qua Christian Protestant morality or praxis (limited to Western world and international relations).

        This is simplified, but I wanted to point out that an ethical project must take empirical evidence into account (I find Neuroscience very interesting

    • by rohan972 (880586)
      I doubt that military killbots will be programmed by the philosophy department of a university. More likely by engineers working for generals.
  • Ethics is NP-hard; good luck with that.

  • Wars don't end because either (or both) of the sides are tired of committing atrocities.
    Wars end because either (or both) of the sides can't sustain its own casualties.
    See Iraq. See Vietnam.

    Robot soldiers mean that atrocities can take place with no human toll, no witnesses.
    No battle fatigue.

    Robots will do to war what Facebook did to idle chat...
    (How about that?)

    • by geekmux (1040042)

      Wars don't end because either (or both) of the sides are tired of committing atrocities. Wars end because either (or both) of the sides can't sustain its own casualties. See Iraq. See Vietnam.

      Robot soldiers mean that atrocities can take place with no human toll, no witnesses. No battle fatigue.

      Robots will do to war what Facebook did to idle chat... (How about that?)

      How in the hell is sending a robot on behalf of my emotions going to resolve anything? You think I'm going to FEEL any differently about my enemy when they "kill" my drone? War requires emotion to realize it is wrong. Robot warfare won't even seem wrong in the eyes of children. It will seem like a damn game.

      Unfortunately, paying the admission price for that "game" means budgets spent on warfare rather than welfare. People will simply starve to death rather than die on the battlefield. Gee, I feel so

  • Of course there are people that are for and against such technology.

    The depth and razor-sharp incisiveness of this analysis leaves me breathless.

    Add a quote from a taxi driver in Beirut and it could be Thomas Friedman under a pseudonym.

  • "Development of ethical robotic systems seems like it would take some of the fun out of things"

    What kind of twisted fantasy world are you guys living in? War means killing people. It isn't fun. It isn't a video game. And in response to Kell's comment above, we aren't the "Forces of Good" battling the "Forces of Evil". We are a nation state with imperfect leaders and selfish short-sighted goals just like every other nation state on the planet. The difference between having real armies and having robot armies

  • We are in a really bad spot if you think wars are "fun". Wars are bad and are supposed to stay that way. Just give one country a disproportionate budget to build battle robots, and human beings on the other side will die like flies... without even the inconvenience of pulling the trigger, no risk, no seeing the blood, no living with the remorse (we don't have to worry about war crimes either, right? We can at most talk about a regrettable malfunction). Just a permanent, very profitable war industry fighting whe
  • Probably won't be a post that gets scored up, but I came very close to saying "dupe post". I know I saw this article yesterday or Sunday with the exact same headline. After spending 20 minutes digging through Slashdot archives, and digging through all the other news sites I have read in the past couple of days, it finally occurred to me that I saw this in the Firehose yesterday. Oops.

  • The US military is working very hard on robots to assist in the kind of house-to-house combat they have been involved in throughout Iraq and Afghanistan. In that kind of conflict there are a lot of casualties, and that puts massive pressure on the politicians back home. The pressure is delayed, but very real.

    However, once they get robots which can assist in that kind of conflict, it completely unbalances the US Constitution by essentially removing the Second Amendment: effective combat robots are equivalent to

  • Make EMP bombs for Anti-Tyranny kit.
  • I for one think it would make it interesting to pit legions of robot warriors against each other

    Would be nice if the attacking robots would target only defending robots, but why would they? They will always target the humans that are important to the enemy; otherwise the war would be pointless.

  • The difference between a graphing calculator and a time-traveling, shape-shifting death bot is one bit...
  • Your shell. Give it to me [botaday.com]
