Examining the Ethical Implications of Robots in War 369

Schneier points out an interesting (and long, at 117 pages) paper on the ethical implications of robots in war [PDF]. "This report has provided the motivation, philosophy, formalisms, representational requirements, architectural design criteria, recommendations, and test scenarios to design and construct an autonomous robotic system architecture capable of the ethical use of lethal force. These first steps toward that goal are very preliminary and subject to major revision, but at the very least they can be viewed as the beginnings of an ethical robotic warfighter. The primary goal remains to enforce the International Laws of War in the battlefield in a manner that is believed achievable, by creating a class of robots that not only conform to International Law but outperform human soldiers in their ethical capacity."
  • by KublaiKhan ( 522918 ) on Monday January 28, 2008 @02:56PM (#22210858) Homepage Journal
    If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?

    It'd probably take a mountain of treaties and the like, and of course any organization used to judge the battlebot contest would be ripe for corruption and whatnot, but it couldn't be that much worse than what happens around the World Cup and the Olympics...
  • by The Aethereal ( 1160051 ) on Monday January 28, 2008 @03:02PM (#22210926)

    Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy'
    Of equal concern to me is the fact that a country with a robot army can use them against their own citizens with no chance of mass mutiny.
  • by moderatorrater ( 1095745 ) on Monday January 28, 2008 @03:02PM (#22210932)

    If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?
    We've already built structures to resolve international conflicts, and they work extremely well when the two sides are willing to work through those structures. The US doesn't need battlebots to deal with European powers, because both sides are willing to talk it through instead. However, when Iraq refuses to cooperate, or the Arabs in Israel refuse to cooperate, the procedures break down and you're left with two countries that can't reach an agreement without raising the stakes.

    In other words, for those countries willing to abide by a mountain of treaties, the problem's already solved. It's the other countries that are the problem, and they're unlikely to resolve their differences like this anyway.
  • by KublaiKhan ( 522918 ) on Monday January 28, 2008 @03:03PM (#22210952) Homepage Journal
    I'd think that it'd be more effective to attack infrastructure--things like power stations, traffic control systems, that manner of thing--than to go after civilians directly.

    For one thing, what's the point of taking over a territory if there's nobody there to rebuild and to use as a resource?

    For another, it looks a -lot- better on the international PR scene if your robots decidedly ignore the civilians and only go after inanimate strategic targets--at least, up until the point that they get attacked. With that sort of programming, you could make the case that you're "seeking to avoid all unnecessary casualties" etc. etc.

    Mowing down a civilian populace does sow terror, of course, but keeping the civilians intact (if in the dark and without water) can be argued to be more effective.
  • by moderatorrater ( 1095745 ) on Monday January 28, 2008 @03:07PM (#22211010)
    It sounds like this is proposing something along the lines of Asimov's Three Laws of Robotics, only instead of not being able to harm humans at all, they're able to harm humans only in an ethical manner.

    Instead of sending human soldiers into Iraq who can go crazy and kill civilians, you could send in a robot that wouldn't have emotional responses. Instead of having VA hospitals filled with injured people, you could have dangerous assignments carried out by replaceable robots.

    However, there's too much potential for abuse for me to feel comfortable about this. As the gap between the weapons available to citizens and the weapons available to the government widens, the ability for the government to abuse its own citizens grows.
  • War is what happens when treaties stop working. You can't have a treaty for some other competition to replace war--if that was the case, FIFA would have replaced the UN by now and Brazil would be a superpower. The purpose of war is to use force in order to impose your will on the enemy, whoever those people may be. The idea is, after your robots destroy the enemy's robots, they will continue to destroy the enemy's infrastructure and population until they give up.
  • by ccguy ( 1116865 ) * on Monday January 28, 2008 @03:12PM (#22211084) Homepage

    Mowing down a civilian populace does sow terror, of course, but keeping the civilians intact (if in the dark and without water) can be argued to be more effective.
    When desperate enough, civilians can become soldiers. In fact, some can be willing to die (being willing to die and accepting a certain risk are totally different things). This is proven day after day in the Gaza Strip, for example.
  • by Overzeetop ( 214511 ) on Monday January 28, 2008 @03:18PM (#22211186) Journal
    Wars are won by those who do not follow the "rules." There are no rules in war. If there were, then there would be a third party far more powerful than either side who could enforce said rules. If there were, then that power could enforce a solution to the conflict that started the war, and there would be no need for war. Said power would also not need to answer to anyone, and would be exempt from said rules (having no one capable of enforcing them).

  • Re:Wow (Score:2, Insightful)

    by Hatta ( 162192 ) on Monday January 28, 2008 @03:19PM (#22211194) Journal
    Seriously, is this a joke? Ethics and war in the same sentence? War is not ethical, it never will be. Robots are not going to change that.
  • by sammyF70 ( 1154563 ) on Monday January 28, 2008 @03:22PM (#22211240) Homepage Journal
    How well this works, and how willing the US is to talk to .. well .. ANYBODY .. can be seen in archive footage of the UN meetings prior to the latest Iraq invasion.
    If you decide to resolve wars using only bots (or even by playing out a virtual, video-game-like war), my bet is that one side will realize it can actually physically attack its opponent while the opposing side is still arguing that the random number generator used is unfair.
    Add to that that what you generally want are the natural resources of the country you're invading, and that people are expendable; I'd guess that robots would be programmed to leave vital assets intact and wipe out the humans, instead of doing it the other way around. After all, you can run an oil refinery with a few hundred people, and it costs much more to rebuild it after the war than to just fly in a few workers to operate it.

    There is nothing civilized about war and hoping for fair behaviour on either side is hopelessly optimistic.

  • by fuzzyfuzzyfungus ( 1223518 ) on Monday January 28, 2008 @03:23PM (#22211268) Journal
    The question of making lethal robots act ethically is far easier in some ways than doing so with humans and far harder in others. On the plus side, robots will not be subject to anger, fear, stress, desire for revenge, etc. So they should be effectively immune to the tendency toward taking out the stress of a difficult or unwinnable conflict on the local population. On the minus side, robots have no scruples, probably won't include whistleblowing functions, and will obey any order that can be expressed machine-readably.

    The real trick, I suspect, will not be in the design of the robots; but in the design of the information gathering, storage, analysis, and release process that will enforce compliance with ethical rules by the robot's operators. As the robots will need a strong authentication system, in order to prevent their being hijacked or otherwise misused, the technical basis for a strong system of logging and accountability will come practically for free. Fair amounts of direct sensor data from robots in the field will probably be available as well. From the perspective of quantity and quality of information, a robot army will be the most accountable one in history. No verbal orders that nobody seems to remember, the ability to look through the sensors of the combatants in the field without reliance on human memory, and so on. Unfortunately, this vast collection of data will be much, much easier to control than has historically been the case. The robots aren't going to leak to the press, confess to their shrink, send photos home, or anything else.

    It will all come down to governance. We will need a way for the data to be audited rigorously by people who will actually have the power and the motivation to act on what they find without revealing so much so soon that we destroy the robots' strategic effectiveness. We can't just dump the whole lot on youtube; but we all know what sorts of things happen behind the blank wall of "national security" even when there are humans who might talk. Robots will not, ever, talk; but they will provide the best data in history if we can handle it correctly.
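    The tamper-evident accountability trail this comment imagines could be sketched as a MAC-chained event log, where every order and sensor reading is cryptographically linked to the entry before it, so deletions and alterations are detectable by an auditor. This is a minimal illustration, not anything from the report; the key, record fields, and event names are all invented, and a real design would use asymmetric signatures rather than a shared key:

```python
import hashlib
import hmac
import json

# Hypothetical per-robot key, assumed provisioned at manufacture.
SECRET_KEY = b"per-robot-key-provisioned-at-manufacture"

def append_entry(log, event):
    """Append an event, chaining its MAC to the previous entry's MAC."""
    prev_mac = log[-1]["mac"] if log else "genesis"
    payload = json.dumps(event, sort_keys=True) + prev_mac
    mac = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    log.append({"event": event, "mac": mac})
    return log

def verify_log(log):
    """Recompute the MAC chain; tampering anywhere breaks every later link."""
    prev_mac = "genesis"
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True) + prev_mac
        expected = hmac.new(SECRET_KEY, payload.encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["mac"]):
            return False
        prev_mac = entry["mac"]
    return True

log = []
append_entry(log, {"order": "advance", "issuer": "operator-7"})
append_entry(log, {"sensor": "contact", "target": "vehicle"})
assert verify_log(log)

# The "verbal order nobody remembers" becomes an audit failure instead.
log[0]["event"]["issuer"] = "nobody"
assert not verify_log(log)
```

    The governance question in the comment is exactly who holds the verification keys and gets to run the audit.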
  • by SkelVA ( 1055970 ) <winhamwr@gmail . c om> on Monday January 28, 2008 @03:23PM (#22211270) Homepage

    Besides, if your enemy expects your robots to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step (I don't think it's reasonable to demand that a country at war attack only military targets when there are none that can't easily be replaced).
    I think we just saw the thought process that bred guerrilla warfare (or terrorism, depending on your point of view). I'll make the logical leap.

    Besides, if your enemy expects your highly-trained, well-financed, well-organized US military to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step
    Guess what. We've already reached the point you fear (at least from the point of view of most of the western world and the larger military powers). Robots augment armed forces that already have overwhelming force. They're not going to be creating a military where there was none.

    To use a contemporary example, Iran isn't going to pump out a bunch of robots and all of a sudden have armed forces capable of withstanding the US's in a conventional war. As per the logical process in the quotes, though, you don't necessarily have to destroy the other side's army (or robots).
  • by amasiancrasian ( 1132031 ) on Monday January 28, 2008 @03:24PM (#22211280)
    I see a lot of implementation problems before even getting involved with the ethical issues. I mean, there's the usual friend-or-foe IDing issues. Then there's the problem of getting the software to recognise a weapon. If you program it to recognise the shape of an AK, it'll pick up replicas or toys or, heck, lots of stuff that looks vaguely gun-shaped. And the enemy will simply resort to distorting the shape of the weapon, which can't be hard to do. Given that it will be a while before AI technology improves, it doesn't seem any more effective than a remote-controlled car. And as far as the legal issues, this seems like skirting the boundaries, and definitely violating the spirit, if not the letter, of the law.
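    The failure mode the comment describes can be shown with a deliberately toy shape matcher; the "silhouette" point sets and the 0.8 threshold are invented for illustration, not a real recognition algorithm. A matcher keyed to an exact outline flags a harmless replica (false positive) and misses a slightly distorted real weapon (false negative):

```python
# Crude rifle "silhouette" as a set of grid points (invented example data).
AK_TEMPLATE = {(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (0, 1)}

def looks_like_weapon(shape, template=AK_TEMPLATE, threshold=0.8):
    """Flag a shape if enough of the template's points overlap it."""
    overlap = len(shape & template) / len(template)
    return overlap >= threshold

toy_replica = set(AK_TEMPLATE)                      # identical outline, harmless
bent_rifle = {(x, y + 1) for x, y in AK_TEMPLATE}   # real weapon, shifted shape

assert looks_like_weapon(toy_replica)       # false positive: the toy is flagged
assert not looks_like_weapon(bent_rifle)    # false negative: the weapon slips by
```

    Real systems use far richer features than this, but the underlying trade-off between false positives on look-alikes and false negatives on distorted targets is the same one the comment raises.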
  • by The One and Only ( 691315 ) * <[ten.hclewlihp] [ta] [lihp]> on Monday January 28, 2008 @03:29PM (#22211344) Homepage
    Depends on the robots. What about the people who build and maintain the robots? They can mutiny. Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless. Sure you can set the right failure modes for jamming, but what about signal intrusion? You could make the robots mutiny for you.
  • by Splab ( 574204 ) on Monday January 28, 2008 @03:29PM (#22211354)
    You must be American to have such a screwed up view of what's going on in Iraq and Israel.

    And talk it through? Since when did Americans start to respect any treaty that didn't put them in a favorable view? Building a robot army is just the next logical step in alienating the rest of the world.
  • by meringuoid ( 568297 ) on Monday January 28, 2008 @03:32PM (#22211406)
    War is about sacrifice, cost, and essentially fighting for what you believe in, hold dear, and WILL DIE to preserve. If you remove the *human* cost from war, then where is the cost? What will it mean if no-one dies? Will anyone remember what was fought for? Will they even recognize why it was so important in the first place?

    Bullshit. War is about taking orders, fighting for what someone else believes in, and then getting blown up. Dulce et decorum est pro patria mori and all that shite. That poetic nonsense you spout there is just part of the cultural lie that sells war as romantic and idealistic to every generation of young fools who sign up and go out there to put their lives on the line for the sake of the millionaires. You got it from anime, too... how sad is that? You're buying the same line of bullshit that inspired the damn kamikaze! Clue: Bushido is a lie. Chivalry is a lie. War is about nothing but power.

    Also, if we have mass armies of robots, won't the victor simply be the one with the most natural resources (metal, power, etc) to waste? (Better weapons technology aside)

    Yes. How does that differ from the present situation?

  • by MozeeToby ( 1163751 ) on Monday January 28, 2008 @03:40PM (#22211514)
    It is well that war is so terrible -- lest we should grow too fond of it.

    Robert E. Lee
  • by Applekid ( 993327 ) on Monday January 28, 2008 @03:40PM (#22211516)
    Talk about human support for mutiny is moot when even today in the United States our rights are being stripped out one thread at a time and nobody so much as blinks or turns away from American Idol or Walmart.

    One worker might talk about it and wind up turned in (because he's a terrorist, obviously) and those that betray will be rewarded with coupons to McDonalds.
  • by smussman ( 1160103 ) on Monday January 28, 2008 @03:41PM (#22211538)

    Depends on the robots. What about the people who build and maintain the robots? They can mutiny. Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless. Sure you can set the right failure modes for jamming, but what about signal intrusion? You could make the robots mutiny for you.
    But I don't think you really want that, because if the maintenance people can make the robots mutiny, how would you prevent your opponent from making them mutiny? Even if it requires very specialised knowledge, all it takes to get the secret is one converted/planted maintenance person.
  • by TheRaven64 ( 641858 ) on Monday January 28, 2008 @03:42PM (#22211542) Journal

    Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy' - so I'm not sure this robot thing is a good idea at all.
    Not necessarily. One of the big reasons the USA lost in Vietnam was that it became politically unacceptable to have body bags coming home. The current administration found a solution to that: ban news crews from the areas of airports where the body bags are unloaded.

    Beyond that it's just a question of economics. It costs a certain amount to train a soldier. Since the First World War, sending untrained recruits out to fight hasn't been economically viable, since they get killed too quickly (often while carrying expensive equipment). A mass-produced robot might be cheaper, assuming the support costs aren't too great. If it isn't, the only reason for using one would be political.

  • by Arkham ( 10779 ) on Monday January 28, 2008 @03:43PM (#22211568)
    I don't think autonomous robots will ever be a smart plan. The chance of malfunction is just too great, and the consequences would be too serious. There've been a million sci-fi movies to that effect, from "Terminator" to "I, Robot".

    What would be interesting, though, would be robots as a shell for the humans they represent. Think "Quake" with a real robot proxy in the real world: operators with displays showing wide-angle camera views of their robot's area and a Quake-like interface that would allow them to attack or assist as needed. Limited automation, but case-hardened robots being run by trained soldiers would present a powerful adversary. Heck, every army recruit would already know 80% of how to operate one on signing day if the UI was good.

    I know I'd be a lot less upset with "Four robots were blown up by a roadside bomb today. They should be operational again by tomorrow." than with seeing more soldiers die.
  • by Kjella ( 173770 ) on Monday January 28, 2008 @03:58PM (#22211822) Homepage

    Depends on the robots. What about the people who build and maintain the robots? They can mutiny. Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless. Sure you can set the right failure modes for jamming, but what about signal intrusion? You could make the robots mutiny for you.
    They can mutiny with what, sticks and stones? Whoever makes the robots will surely put in digital signatures and kill switches so that they can reclaim control from the operators as well as prevent them from being used against themselves. Hell, it's difficult enough to run your own code on a game console and try breaking WPA 128-bit encryption if you can. After the first attempts are quickly rounded up by a special ops division operated by devout fanatics, it won't happen again.
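    The digital-signature-and-kill-switch scheme this comment gestures at could be sketched roughly as follows. This is a hypothetical illustration, not a real weapons protocol: the key, command format, and status strings are all invented, HMAC stands in for the asymmetric signature a real maker would use, and a real design would also need replay protection:

```python
import hashlib
import hmac
import json

# Hypothetical manufacturer root key, assumed burned in at build time.
MAKER_KEY = b"manufacturer-root-key"

def sign(command, key):
    """Produce an authentication tag over a canonical encoding of the command."""
    blob = json.dumps(command, sort_keys=True).encode()
    return hmac.new(key, blob, hashlib.sha256).hexdigest()

class Robot:
    def __init__(self):
        self.disabled = False

    def execute(self, command, signature):
        if self.disabled:
            return "inert"
        # Reject anything not signed with the maker's key: mutinous
        # operators and enemy signal intrusion alike hit this wall.
        blob = json.dumps(command, sort_keys=True).encode()
        expected = hmac.new(MAKER_KEY, blob, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, signature):
            return "rejected"
        # The maker's reserved kill switch permanently bricks the unit.
        if command.get("op") == "kill_switch":
            self.disabled = True
            return "disabled"
        return "executing " + command["op"]

r = Robot()
ok = sign({"op": "patrol"}, MAKER_KEY)
assert r.execute({"op": "patrol"}, ok) == "executing patrol"
assert r.execute({"op": "patrol"}, "forged") == "rejected"
kill = sign({"op": "kill_switch"}, MAKER_KEY)
assert r.execute({"op": "kill_switch"}, kill) == "disabled"
assert r.execute({"op": "patrol"}, ok) == "inert"
```

    The same mechanism that stops hijackers is, of course, what concentrates control in whoever holds the maker's key, which is the grandparent comments' worry.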
  • by kabocox ( 199019 ) on Monday January 28, 2008 @04:08PM (#22211966)
    Besides, I still fail to see why a country which is likely to lose in the robotic war would accept these rules, when it makes a lot more sense to attack the other country's civil population - which in turn might reconsider the whole thing.

    Fighting from the sofa is one thing, having bombs exploding nearby is quite different.


    Um, because they may be terrified that the robots would switch from ethical mode to genocide mode on populations found to be training terrorists, or that recently conquered populations found to harbor terrorists would have extreme measures taken against them. If you are dealing with an enemy that has vast hordes of seemingly ethical robots, make sure you play by their rules; otherwise they can define your entire population as unethical terrorists that need to be removed/eliminated.

    You seal off the borders, kill off the entire population including all reporters, send in the cleaning robots to tidy up the place, and then you send in the real estate robots to sell all these new homes to your citizens at low prices. If the housing is subpar, you may have to knock it all down, have robot builders come in and build new homes, and then start the repopulation process. It may seem evil and unethical to your enemies, but your citizens would like/love the government that was providing all these cheap new resources.
  • by JonWan ( 456212 ) on Monday January 28, 2008 @04:38PM (#22212484)
    Since when did any country start to respect any treaty that didn't put them in a favorable view?

    There I fixed it for you.

  • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Monday January 28, 2008 @04:46PM (#22212630) Homepage Journal

    Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy' - so I'm not sure this robot thing is a good idea at all.

    This is not robot-specific — it is true about any superiority in weapons...

    Besides, if your enemy expects your robots to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step

    Again, nothing robot-specific here either. Unable to take on our military directly, Al Qaeda has already taken to attacking our civilians. Likewise, unable (since the 1970s) to take on the Israeli military directly, various assholes have been attacking Israeli civilians for decades.

    One side having better weapons makes the other side look for an alternative edge. Whether that superiority is achieved via robotics or any other technological advance is irrelevant.

  • by Amorymeltzer ( 1213818 ) on Monday January 28, 2008 @05:49PM (#22213588)

    a country with a robot army can use them against their own citizens with no chance of mass mutiny.
    You don't know the American Army! Our boys are so well trained they wouldn't think twice before firing upon the innocent masses. Hell, they wouldn't think once!
  • by HybridJeff ( 717521 ) on Monday January 28, 2008 @06:04PM (#22213828) Homepage
    Don't give the credit to Toys; sounds to me like they copied the idea from Ender's Game [wikipedia.org]. Granted, they modified the idea slightly.
  • by moogleii ( 704303 ) on Monday January 28, 2008 @07:11PM (#22214824)

    I pay taxes, I vote. That should be enough. Buying American is just icing on the cake. In a capitalist system, I'm going to buy what I deem to be the better value, because...that's part of capitalism. I'm not going to restrain myself to products because they were built in a certain country - that sounds like some kind of twisted form of economic welfare if you ask me.

    And who said abandoning unions is bad? Depends who you ask, I guess. Me? I think the unions are holding GM back.
