UN to Debate Use of Fully Autonomous Weapons, New Report Released

concertina226 (2447056) writes "The United Nations will debate the use of killer robots for the first time at the UN Convention on Certain Conventional Weapons (CCW) this week, but human rights activists are calling for the robots to be banned. Human Rights Watch and Harvard Law School's International Human Rights Clinic have published a new report entitled 'Shaking the Foundations: The Human Rights Implications of Killer Robots', which calls for a ban on killer robots to prevent a potential arms race between countries. Killer robots, or fully autonomous weapons, do not yet exist, but they would be the next step after the remote-controlled armed drones used by the US military today. Fully autonomous weapons would have the ability to identify and fire on targets without human intervention, putting compliance with international humanitarian law in doubt. Among the problems the report highlights is criminal liability: a military officer, programmer or weapons manufacturer could be held liable for creating or using an autonomous weapon with intent to kill, but if a robot killed arbitrarily, it would be difficult to hold anyone accountable."

Comments:
  • by srussia ( 884021 ) on Monday May 12, 2014 @08:57AM (#46978289)
    Don't mines qualify as "autonomous weapons"?
  • by EmagGeek ( 574360 ) on Monday May 12, 2014 @08:58AM (#46978291) Journal

    Bans will not prevent these weapons from being developed, probably even by a technologically advanced state that is a signatory to the treaty, and they will not prevent their use by rogue or puppet states that don't care about bans, or that deploy them at the behest of a signatory state using them to do its dirty work.

  • Re:3 laws deleted (Score:5, Insightful)

    by RobinH ( 124750 ) on Monday May 12, 2014 @09:10AM (#46978361) Homepage

    Stop with the "3 laws" nonsense. Asimov's "laws" were never intended as actual laws, they were a plot device, and they're certainly not something you "delete" because they were never there in the first place. We already have regulations about machine safety (I work with them every day). The laws govern the control of hazardous energy in a system, with various guarding and interlocks being required to protect humans from injury when they interact with the system, and design constraints determined by how likely certain safety critical component failure is, and redundancy, etc.

    Nobody building a killer robot is going to be worrying about any laws, pretend or otherwise. They're worried about how many units they can sell.

  • by swb ( 14022 ) on Monday May 12, 2014 @09:15AM (#46978387)

    I don't know how robot soldiers identify targets, but presuming they have some mechanism whereby they only kill armed combatants, it's not hard to see some advantages over human soldiers, at least with respect to civilian noncombatants.

    More accurate fire -- ability to use the minimal firepower to engage a target due to superior capabilities. Fire back only when fired upon -- presumably robots would be able to withstand some small arms fire and thus wouldn't necessarily need to shoot first and wouldn't shoot civilians.

    Emotionally detached -- they wouldn't get upset when Unit #266478 is disabled by sniper fire from a village and decide to kill the villagers and burn the village. You don't see robots engaging in a My Lai-type massacre.

    They also wouldn't commit atrocities against civilians: wanton destruction, killing livestock, rape, beatings, etc. Robots won't rape and pillage.

  • by MozeeToby ( 1163751 ) on Monday May 12, 2014 @09:18AM (#46978405)

    What will happen is that the defense contractors will develop autonomous less-lethal robots that can scout, identify targets, and engage with less lethal weapons. But you know... for flexibility purposes... we'll just make sure the weapon hardpoints are as modular as possible. Hey! I know! We'll make them be adaptable to any standard infantry fir... errrrr, less-lethal weapon.

  • by buchner.johannes ( 1139593 ) on Monday May 12, 2014 @09:24AM (#46978465) Homepage Journal

    Bans will not prevent these weapons from being developed, probably even by a technologically advanced state that is a signatory to the treaty, and they will not prevent their use by rogue or puppet states that don't care about bans, or that deploy them at the behest of a signatory state using them to do its dirty work.

    Any state today is dependent on trade with the international community. If the US and the EU (or any other large part of the international community) decide not to trade with a country, and not to allow bank transfers to that country, that has a huge effect on its economy. The countries able to withstand this can be counted on one hand. Of course, trade sanctions are not a plan, but the lack of a plan.

    It is always better, though, to help the particular country address its actual problems rather than to support its approach. For example, perceived threats can be thwarted by establishing a neutral buffer zone controlled by a third party.

    So no, contrary to the common opinion on Slashdot, I think collectively agreeing not to use a certain dangerous technology can be useful, and is also enforceable.

  • by davester666 ( 731373 ) on Monday May 12, 2014 @01:37PM (#46980939) Journal

    Just like civil liberties are ignored after pretty much any "event" within the US.
