Military Robots Get Machine Guns
javaxman writes "Next spring, the U.S. military is expecting to deploy Talon robots with machine guns. They can also be equipped with rocket launchers. Really, they're remote-controlled 'bots, not true autonomous 'bots, so you can save the Skynet jokes for, um, some day in the not-too-distant future. This is just the first, or maybe second step. As for me, I just want to see arena matches between gangs of these suckers. Robot wars indeed!"
What if... (Score:0, Insightful)
What about ethics? (Score:5, Insightful)
Whatever happened to Asimov's laws of robotics, which say a robot may do no harm to humans? For years, bearded terminal hackers have done their thing, hacking on software, hardware, and such, with little regard to the ethics of the situation. But now, with our creations affecting mankind in a more profound way, we give little more thought to ethics than we did with a simple BASIC shell script.
Think about this the next time you are coding a servo controller on your Redhat compiler. Could your code be misused in a way you would not approve?
Not so bad... (Score:2, Insightful)
Re:What about ethics? (Score:4, Insightful)
We already have autonomous firing systems (Score:5, Insightful)
I think it'll be a long time before autonomously firing ground systems are in place, because it's hard enough doing IFF in the sky, let alone on the ground. I think the fire-finder system (used in the Balkans to take out mortar positions in the mountains firing upon cities) might do this in some limited capacity, but that's only anti-artillery, rather than telling the difference between a guerilla carrying an RPG and a farmer carrying a section of irrigation pipe. Sure, you could wait until they shoot first for all of these systems, since that's a lot easier to determine automatically, but I think it's quite obvious that waiting for the other guy to shoot first is very far from the policy of the current administration.
Re:Not so bad... (Score:5, Insightful)
Re:This Will Save Lives (Score:3, Insightful)
Re:Not so bad... (Score:2, Insightful)
Fewer of ours, more of theirs...OOH RAH!
Saw this on SeaQuest (Score:3, Insightful)
With the advances in VR and forms of total control of remote devices and such based on muscle movement and in some cases even brain wave activity, how far away are we from a time when anyone with a joystick can command a combat robot?
It really reminds me a lot of Largo from MegaTokyo and his army of Ph34rbots... but on a serious note, I really do wonder. It would seem that, while these types of things are great in that they ultimately save lives, at the same time, they could ultimately be a supreme form of evil.
Even though bad things DO happen in any armed conflict, at least in this case, fields of robots battling it out, even if they are merely remote controlled, will keep real people from dying needlessly. However, again, how long before someone figures out how to gain control of these things and turns them against civilian populations, villages, cities, etc.?
On a side note, what I really find funny is that, traditionally, the military is the last major area of manual labor that has NOT been severely affected by technology (in the sense of robots replacing workers as they have in manufacturing and other areas), and now there exists a real possibility of the military being downsized due to robots replacing soldiers. Maybe the Teamsters can organize the military!
Seeing your work used "for evil" (Score:5, Insightful)
Think about this the next time you are coding a servo controller on your Redhat compiler. Could your code be misused in a way you would not approve?
Y'know, I hear this kind of question a lot. I work for a defense contractor. When I'm explaining my work to people, invariably the question of "don't you worry that your work will be used in some future war that you don't approve of?" comes up. No, actually, I don't, and the reason isn't that I approve of all (or even most) of the military actions that my country is involved in. Part of it is a bit of short-sightedness on my part. I work on very "research-y" topics: data fusion, sensor resource management, and other stuff that isn't gonna get implemented until 2015 at the very earliest. Part of it is that I think war is a necessary part of humanity. I wish it weren't, but a simple examination of the human brain reveals that the "R-complex" (aka the reptilian brain) is present in every person. I have learned to use my other brain portions to control my aggressive tendencies, but there are lots of people who will never master that trick.
But I think the main reason why I don't lie awake at night worrying that the results of my efforts might make the world a worse place is the same reason why parents don't usually lie awake worrying that their kids are going to turn out to cause more harm to society than benefit. I don't have kids but I'm thinking that if I did, I probably wouldn't spend too much time worrying that my kid is gonna become the next Kenneth Lay and be the cause of a great deal of suffering. I would probably think that my kid is more likely to be a benefit to society or I'd just be enjoying the process of raising my kid and not get all worried about how he's going to turn out.
I don't see any reason why one should assume that the products of their efforts will only be used for applications that they 100% agree with. Really, I think that's terribly naive. Do sheetmetal workers lie awake at night worrying that the steel they cast that day might be used in the casing for a bomb?
GMD
Re:Not so bad... (Score:1, Insightful)
In the short term that may be a good thing, but in the long term it means that the United States is even more powerful militarily than it has been before. Will technology like this make America more bold? Would attacking other countries such as Iran be an easier choice to make without the threat of American casualties to sway the public?
Nobody wants to see more people die in war, but even fewer want to see a lone superpower with even less hesitancy to enforce its agenda around the world. In the end, things like this could cause more deaths than they save.
Re:Not so bad... (Score:4, Insightful)
Another angle on this is that mutually assured destruction through nuclear weapons was intimidating enough to prevent nuclear war. In a similar fashion, fighting a war where your side suffers human losses while the enemy loses only robots would be a humiliating, demoralizing experience - perhaps to the extent that fighting against such a military would be a lost cause before the first round is fired.
There are pros and cons to that - it could be a very real deterrent to warfare, but it could just as easily alienate and silence people with a just cause for fighting. I doubt those people would shrug their shoulders and go home - they'd probably settle for guerilla warfare amongst the civilian population where an armed robot isn't a feasible option. Hm, not a far cry from terrorism.
I'm seriously not a hippy but the prevalence of "insurgent" style warfare these days is starting to convince me that war really isn't the answer - not because war is unhappy or unpleasant, but because people who are motivated enough to fight a war will express themselves despite being outright defeated in a war. If they want to kill, they'll kill regardless of your tanks or soldiers or barricades or armed robots. It's just too bad that nobody tosses tea into the harbor anymore.
Re:Not so bad... (Score:4, Insightful)
Re:A trend (Score:2, Insightful)
Ah, and that's when the badness starts (Score:4, Insightful)
That, my friend, is the argument for making the robots autonomous. Insert sci-fi armageddon of choice here.
Put an M16 in Asimo's hands and you have one hell of a prototype.
No this will cost lives (Score:5, Insightful)
And thus negating the most important check and balance against perpetual war.
It is necessary for soldiers to die in a war, because their death reminds us that war has a price. If you can operate a completely robotic army/navy/air force, you lose that human connection, and create a killing force that can operate without any moral conscience whatsoever.
That makes for a generation of politicians who will decide that because there is no human cost to their side, they may as well just send in the robots and exterminate the opposition.
You are already seeing this process in action with such edicts as not being allowed to show coffins returning from Iraq. If you don't see the cost of war you are more likely to support it, and robot killing machines are the ultimate expression of that lack of human cost to war.
Continuing down that path will have only one outcome, and it won't be pretty.
Re:What about ethics? (Score:3, Insightful)
Ways you may not approve of? (Score:3, Insightful)
Just think of these robots doing really dangerous things - going down terrorist booby-trapped tunnels and the like.
Or would you feel better just sending human fodder into such situations?
If you think wars suck, then you should like modern high-tech wars. War still sucks, but far fewer people get killed doing it.
Hmmm is that a good thing? On balance, I think so.
Re:A genuine worry (Score:2, Insightful)
Re:Captured robots (Score:5, Insightful)
Re:Remote controlled war! (Score:3, Insightful)
Re:Good News in War Against China (Score:5, Insightful)
Some time in 2000 I spidered the CIA world factbook.
There is an entry in that book labelled:
"Military manpower - fit for military service"
In the edition which I have, it lists the USA as having 2,056,762 people who are fit for military service. I believe that was supposed to include women.
That's less than one percent of the population.
Every other listed country can manage at least 10%.
After the Sept. 11 attacks these figures were no longer listed. Instead, today it says "NA".
The USA is the *only* country listed as "NA".
Why does the USA *need* machines like this?
Do the math.
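For what it's worth, the arithmetic is easy to sketch. The population figure below is an assumption (roughly the 2000 U.S. census count), not something from the post:

```python
# Back-of-the-envelope check of the "less than one percent" claim.
fit_for_service = 2_056_762    # Factbook figure quoted above
us_population = 281_421_906   # assumed: approx. 2000 U.S. census count

share = fit_for_service / us_population
print(f"{share:.2%}")  # prints 0.73%
```

Which comes out to about 0.73% - well under one percent, as claimed, if you take the quoted figure at face value.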
I know the parent post was mostly humorous, but frankly the idea of a USA with autonomous fighting machines scares the bejeezus out of me, since lack of manpower seems to be the *only* thing holding them back from a classic Civ endgame.
Re:Seeing your work used "for evil" (Score:3, Insightful)
If we worried more about the consequences of our actions, we would probably all be better off.
Re:Not so bad... (Score:3, Insightful)
Re:Not so bad... (Score:5, Insightful)
Man, I hope so. Otherwise, what good is it? Seriously.
Nobody wants to see more people die in war, but even fewer want to see a lone superpower with even less hesitancy to enforce its agenda around the world.
We all got agendas, Bunky. I have one. America and Iran have one. You clearly have one. And although I don't have the poll numbers in front of me, I believe the number of people who want to see the agenda of a country sitting atop a substantive percentage of the world's oil supply, draped in medieval-level religious fanaticism, and armed with newly-minted atomic weapons achieve ascendancy over that of a nation whose principal exports are fast food, Hollywood movies, Internet cafes, arrogance, swagger, and democracy is rather small.
It's been over a hundred years since America could last be viewed as the underdog, and pop culture teaches us never to root for the Big Lone Superpower, but when the little guys are murderous right-wing religious lunatics, aintcha glad that pop culture's got nothing on plain ol' common sense?
Re:Good News in War Against China (Score:1, Insightful)
The Chinese have low regard for human life [phrusa.org]
The assumption that all Chinese have a low regard for human life, based on what is being done in Tibet, is the same sort of mentality that causes people to carry out atrocities like what is happening in Tibet.
Re:Not so bad... (Score:3, Insightful)
Re:Seeing your work used "for evil" (Score:1, Insightful)
Or you have seen them way too many times...
There is usually substantial trauma involved the first time a cop ends up killing someone in the line of duty. Well, at least the first time.
Re:What about ethics? (Score:2, Insightful)
No matter the field, I think scientific advancements are always potentially beneficial, weapons engineering included. The important consideration is, once we've made the breakthrough, who do we let loose on the world with it?
Re:Dehumanization of Killing (Score:4, Insightful)
The real concern is the number of human lives lost in stopping these recurring acts of idiocy. The actual effect of technological advancement has been to steadily reduce the number of combatant and noncombatant casualties as technology improved. Modern technology makes it possible to confront agression with less cost in human lives over shorter periods of time.
But if it assuages your sense of moral rectitude, we can go back to the days of sword-wielding armies and the concomitant casualty rates of 20-40% of entire populations during wars. We wouldn't be isolated at all from the act of killing -- a large plurality of us would have a constant connection with death, rather than our 1-2% or so who have intimate experience with it now.
If you think more experience with death promotes peace, talk to a Bosnian, or a Croat. They'll set you straight.
Re:Good News in War Against China (Score:2, Insightful)
Considering the situation in Iraq, do you really think the Americans will last long enough to build these robots? Plus, the Chinese or Japanese could easily buy these robots once they are built, through the almighty magic of corruption.
I have no doubt the current US Administration will build these robots. The question is whether other countries could also get their hands on the 'bots or not. Oh, and I think all the mighty, high-tech 'bots can't stand against a 10-year-old nuke.
Re:What about ethics? (Score:4, Insightful)
Or protected by it either, I guess
gangs of suckers (Score:2, Insightful)
It's not between "gangs of these suckers" that you will see the "action", but rather gangs of these suckers slaughtering tons of civilians somewhere in the 3rd world that doesn't have any. Wars are never about fair competition.
Re:Good News in War Against China (Score:3, Insightful)
What it comes down to is the almighty dollar.
Re:We already have autonomous firing systems (Score:1, Insightful)
So that the administration can liberate the poor suffering oil reserves.
Re:Dehumanization of Killing (Score:1, Insightful)
No. Not really. If you look at the wars that the US has been involved in, the advent of technology from the 1700s through 1945 shows a direct link between the rise of technology and an increase in the number of casualties. The wars since then have been limited engagements. While medieval conflict may have had high percentages of populations slain, that is more an issue of barbarism, not technology.
Anyway, looking at the window of time that I indicated, I don't see how your statement could be said to be true.
Re:Dehumanization of Killing (Score:2, Insightful)
That statement is completely idiotic and, if anything, probably only serves to illustrate your devastating lack of education.
In WWII, almost 60 million people were killed, about two thirds of them civilians. Without technology, such "convenient" ways of genocide as the holocaust or air raids wouldn't have been possible.
In WWI, 15 million dead. Many died in the trenches, because such advances as machine guns and heavy artillery forced things into a deadlock.
If there's enough of a technological imbalance, and no other factors to make up for it, the more advanced country has an easier time killing lots of enemy soldiers or civilians. The US has repeatedly been in that situation. But then there are also cases where one side is more advanced, and the other is more numerous, and a huge carnage results, such as in the Russian theater in WWII, or the Korean War.
I find it disgusting how easily Americans like you talk about others who "whine" about yet more weapons. Your criminal government has shifted to a policy of preemptive - illegal! - wars. Thousands get killed. Sure, you in your self-righteousness watch O'Reilly and believe that it's all cool. No, it is not. People get sick of it. People get sick of wars, of killing, and of the fucking arrogance you display.
Re:Captured robots (Score:5, Insightful)
Now, tricks like this (probably not these exact tricks) likely wouldn't let you send commands to the bot; however, they might let you know what is being sent to the bot, and what it is sending back.
Personally, I'm kinda curious as to how effective tempest attacks would be against "secure" communication devices, especially radios. I mean, radios make sounds by using pulsed magnetic fields to vibrate a diaphragm - sounds like a good way to broadcast unwanted RF to me.
Re:Captured robots (Score:5, Insightful)
Re:Seeing your work used "for evil" (Score:4, Insightful)
Even with the necessity argument, one of the main reasons that war is accepted as necessary by the general masses is that we value our lives over the lives of others. We constantly demonize the actions of Nazi soldiers because they were killing innocent people, but how often does the mainstream criticize the US for Hiroshima? If we are going to look at war, I think it's important to put the human being back into the equation. With technology increasing its presence on the battlefield, we can expect more and more casualties for the "enemy" and fewer and fewer for us. This will further push the disconnect between the idea of war and the reality of war.
Re:Not so bad... (Score:5, Insightful)
"...but when the little guys are murderous right-wing religious lunatics, aintcha glad that pop culture's got nothing on plain ol' common sense?"
And when the people in charge of the Big Lone Superpower are murderous right-wing religious lunatics with a massive military, nuclear, chemical and biological weapons (and a history of actually using them), and effectively unchecked political power over the rest of the world, then please tell me, who are we supposed to root for?
Re:Seeing your work used "for evil" (Score:4, Insightful)
Nope, the reason is that warfare fills your mouth with food and your pockets with money. The rest of your comment is mostly you trying to convince yourself that you're a great guy and that you're doing things necessary for humanity.
If you work for a company that sells weapons, your inventions will be used to kill. It's that simple. Nobody wastes loads of money just to not use what they bought.
That is a fallacy. By that logic you should also blame the farmers for selling you food to keep your brain functioning... you're directly designing weapons, or support devices for people who carry weapons to kill. Period.
Re:What about ethics? (Score:3, Insightful)
Asimov just wanted to write sci-fi stories that avoided the cliché of square-jawed human heroes blowing away evil robots (hence the irony of the I, Robot film). So he came up with the laws. They also let him write neat logical-puzzle-type stories where the laws lead to unintended consequences, including the robots sometimes doing 'bad' things.
The laws were created as a dramatic and plot device. I'm sure he had plenty of concept of human history and where technological innovation came from, but he was writing about his own fictional world.
Re:Good News in War Against China (Score:2, Insightful)
just me? (Score:1, Insightful)
The obvious wingnut fallacy (Score:3, Insightful)
Yeah, of course. Nobody wants to see the mullahs take over the world. So? How does that imply that we should go to war? The uttermost wingnut error is this dogma:
"A is evil. Therefore we should go to war with them"
"B is evil. Therefore we should go to war with them"
I'm no pacifist. Sometimes war is the right answer. But it isn't in all cases. The grandparent makes a point that reducing the risks of war may create a situation in which a powerful nation is more likely to start wars at the drop of a hat. Since in general wars kill people, starting more of them is not a priori a good idea. You have to argue that a specific war is worth the cost, and you haven't done that with respect to, e.g., Iran.
Pointing out solely that Iran has an evil agenda does not in any way shape or form refute the grandparent's argument. Yes, they're evil. So what? That's irrelevant: the point wasn't about the nature of the agendas in question, it was about the effects of the policies pursued in support of those agendas. A very different question.
Your attitude, (which you call "common sense"), that violence is the only effective strategy against bad people, is neanderthal and ignorant of history. Even in the bloody 20th century, significantly more oppressive regimes and dictatorships were overthrown by nonviolent means than via wars. Yes, sometimes, war can overthrow a dictator and bring peace and democracy afterwards (as WWII). Sometimes, though, war can fail in its goals and set the stage for the rise of brutal dictatorships (as we failed to free Vietnam, but set the stage for Pol Pot). How many examples do you want of wars fought by the US that failed to bring democracy to the target country and/or gave rise to a brutal dictator afterwards?
Different situations require different responses.
History most definitely does *not* support your implicit assumption that a nation more willing to go on the warpath to enforce its agenda increases the likelihood of that agenda "winning". More often, that kind of attitude just fuels resentment and defensiveness, leading to a bigger fight and a lot more people dead, which isn't in anyone's best interests. There are a lot of arguments to be made that a more warlike America would do a great deal of harm to the cause of democracy around the world.
Already, anti-US rhetoric and consequent terrorist recruiting in the Middle East has tripled as a result of the Iraq war. From that aspect, it would seem that a little more caution might be advised, not less. There may also be positive outcomes.
Re:Seeing your work used "for evil" (Score:3, Insightful)
As for Hiroshima, the U.S. hasn't gotten carte blanche on this one. I've seen plenty of criticism for it, though it seems the mainstream falls on the side of using the A-bomb as being the right call, or at least understandable given the context. I happen to agree with this assessment; if you don't, sorry, but it's not the same thing as not being criticized at all.
Terrorism and the boston tea party (Score:3, Insightful)
Yeah, exactly.
People fight when they 1) have a grudge, 2) are poor, 3) feel they're being taken advantage of, 4) are scared, or 5) are disenfranchised and feel they don't have a say in their own future.
How they fight depends on their circumstances. If they're wealthy, they use technology at arm's length and/or send other people (usually their own poor) to fight. This is how the US does it.
If they're outnumbered or outgunned, they fall back on guerrilla tactics and/or terrorist tactics. This is how the Iraqi insurgents fight.
Take for example Israel and Palestine. Both populations feel they have a historical and religious claim to the land. But the Israelis are wealthy workers who have jobs and a vote. So they send tanks and helicopters. Palestinians are dirt poor, live under continuous occupation (=disenfranchisement), and are outgunned. So they blow themselves up and use guerrilla tactics.
If the USA were under occupation by a superior force, we'd fall back on insurgent and/or terrorist tactics, too. Go watch "Red Dawn". As a matter of fact, that's how we won our independence. We couldn't defeat the British regulars on the field, so we slunk through the forests and sniped their leaders.
Robots won't change this scene at all. They'll just change the balance of power and drive the other side to new tactics, as you suggest. We get all high and mighty about how the other side uses unspeakable insurgent and/or terrorist tactics, but we would do exactly the same thing if the conditions were reversed.
It's just too bad that nobody tosses tea into the harbor anymore.
They do [urbanplanet.org]. Just nobody pays attention. The reason people use such violent tactics these days is that in the age of sensationalist profit-driven news reporting, if you don't make a big noise nobody even hears about it.
BTW, throwing tea in the harbor in 1773 is about equivalent, moneywise, to torching 15 Hummers to make a political statement in 2003. That guy is headed to jail, and nobody is singing his praises...
Re:Seeing your work used "for evil" (Score:4, Insightful)
If you work for a company that sells weapons, your inventions will be used to kill.
Or defend. It's not really that simple. If we hadn't developed sonar and depth charges, Germany would have ruled the Atlantic indefinitely. If we hadn't developed superior aircraft, they would have ruled the skies. If Britain hadn't developed radar, many, many more of their civilians would have died. As long as bad people exist, we need to develop weapons for defense.
you're designing directly weapons, or support devices for ppl that carry weapons to kill
Uh, no. The poster said he's doing research for a defense company. Stuff that may be used for defense, but may also be used for some cool domestic application, like, you know, the INTERNET!
Every bit of technology ever developed has at some time been applied to the practice of killing people, whether directly or indirectly. Following the sheetmetal example, don't you think the first army to use body armor, shields, and swords had a decisive advantage? Should the scientists and blacksmiths of that time have gone on strike, skipped that overrated "progress" thing, and let themselves be conquered and killed by the barbarians?
Re:US Cowardice (Score:3, Insightful)
Re:US Cowardice (Score:3, Insightful)
How many times have you realized that you really should do something, but were afraid for your own safety? Setting aside disagreement with their beliefs, these terrorists did what they thought they needed to do, and were not cowards.
Enemies? Sure. Needing a good ass-kicking? Definitely. Hiding in Iraq? Only since the fall of Saddam.