The UN Will Consider Banning Killer Robots (hrw.org)
Friday the United Nations agreed to discuss a ban on "killer robots" in 2017. The 123 signatories to a long-standing conventional weapons pact "agreed to formalize their efforts next year to deal with the challenges raised by weapons systems that would select and attack targets without meaningful human control," according to Human Rights Watch.
"The governments meeting in Geneva took an important step toward stemming the development of killer robots, but there is no time to lose," said Steve Goose, arms director of Human Rights Watch, a co-founder of the Campaign to Stop Killer Robots. "Once these weapons exist, there will be no stopping them. The time to act on a pre-emptive ban is now."
schwit1 reminded us that IEEE Spectrum ran a guest post Thursday by AI professor Toby Walsh, who addressed the U.N. again this week. "If we don't get a ban in place, there will be an arms race. And the end point of this race will look much like the dystopian future painted by Hollywood movies like The Terminator."
Meh. (Score:5, Insightful)
Like many of the proclamations from the UN, such a ban will have little influence over the development and use of "killer robots".
Automation of the military (Score:3, Interesting)
Not just that - this is a macrocosm of 2nd amendment arguments in the US. Just as banning assault rifles would only affect law-abiding citizens, such a ban would only affect law-abiding nations, but do nothing about rogue nations that sooner or later would have that capability.
Besides, I disagree w/ this proposal for a simple reason. We should avoid intervening in other countries, such as Syria. But if we have to go in, I'd rather send in killer robots after ISIS rather than American (or any other) humans who'll get killed or maimed for life. We should have killer robots substitute for soldiers: it would also solve the issue of a depleted military as well as the idea of a draft.
Re: (Score:2, Insightful)
Besides, I disagree w/ this proposal for a simple reason. We should avoid intervening in other countries, such as Syria. But if we have to go in, I'd rather send in killer robots after ISIS rather than American (or any other) humans who'll get killed or maimed for life. We should have killer robots substitute for soldiers: it would also solve the issue of a depleted military as well as the idea of a draft.
If you're not man enough to look the enemy in the eyes while killing him, you're... American. The term is becoming synonymous with coward, and I'm ashamed.
From carpet bombing Dresden to napalming villages in Viet Nam to drone strikes in Afghanistan, it's clear that American soldiers are not men enough to risk their own lives to spare civilians.
Re:Automation of the military (Score:4, Insightful)
Re: (Score:3)
Generals and terrorist leaders have already been using robots for quite some time. Biological ones, that is. They haven't risked their own lives, either.
Yes, it has been a long time since the days of chivalry, and leaders being called leaders because they were leading.
Re:Automation of the military (Score:4, Insightful)
"If you're not man enough to look the enemy in the eyes while killing him, you're... American. The term is becoming synonymous with coward, and I'm ashamed."
The objective of war is not to look "man enough" but to kill sufficiently large numbers of the enemy that they will no longer be inclined to attack you.
Re: (Score:2)
The objective of war is not to look "man enough" but to kill sufficiently large numbers of the enemy that they will no longer be inclined to attack you.
That's the weirdest twisting of jus ad bellum I've seen since the Bush Doctrine.
Re:Automation of the military (Score:5, Insightful)
"Your purpose is not to die for your country. Your purpose is to make the other poor dumb bastard die for his country."
If you think that making war is all about being "fair", you're doing it wrong....
Re:Automation of the military (Score:4, Insightful)
The idea is to kill the enemy without getting killed. There are no points for style. We try to limit the death of non-combatants, but if they're in the kill zone, that's just the way the cookie crumbles. You fight wars to win because the alternative is something no one wants to experience. American soldiers put themselves in harm's way if needed, but they are trained to survive while accomplishing the mission.
Re:Automation of the military (Score:4)
The idea is to kill the enemy without getting killed. There are no points for style. We try to limit the death of non-combatants, but if they're in the kill zone, that's just the way the cookie crumbles. You fight wars to win because the alternative is something no one wants to experience. American soldiers put themselves in harm's way if needed, but they are trained to survive while accomplishing the mission.
Wars serve a political objective, usually badly. There are points for style, primarily because style has propaganda value, and propaganda changes political support for wars, leaders, and causes--whether that means someone voting to bring troops home or someone telling a private where the enemy ambush is waiting for his unit. Even when you lose a battle or part of one, there can be points for style--Dunkirk, Pearl Harbor, Bastogne, all involved at least temporary defeats that were in some sense spun into political victories that served the war effort. On the offensive side, style matters too--the drone attack with collateral damage that injures or kills non-terrorists, for example, can create more terrorists in the next generation.
Mod Parent Up (Score:4, Insightful)
"Death, destruction, disease, horror. That's what war is all about, Anan. That's what makes it a thing to be avoided."
Star Trek The Original Series: "A Taste of Armageddon"
Exactly. One of the consistent and reasonable critiques of modern American warfare is that because there is no draft, the influential wealthy and policy-making classes have no personal incentive to avoid war. Many people know few or even no service members. The further you push human beings away from the horrors of war, the more those people will be willing to engage in war.
I've met people who've been personally tortured by foreign heads of state. I've seen people fighting politically to pull their countries together in the face of what seems like neverending war and oppression by warlords. And I've read the stories of people who have seen their countries fall apart in the face of characteristic propaganda and strong men taking power. The less real all of this is and the less human it is, the more people will be willing to stay unengaged in matters of life and death.
Re: (Score:3)
Oh, not looking the enemy in the eyes started long, long before Dresden, or even The Blitz. Hell, it was old when the English at Crecy and Agincourt rained death on the French with longbow and cannon. It was probably old when Xerxes attempted to do the same (well, sans cannon) to the Greeks at Thermopylae.
War isn't a contact sport, it's a continuation of politics by other means.
Re: (Score:2)
If you're not man enough to look the enemy in the eyes while killing him, you're... American.
Thanks for the info, Rambo. BTW, where exactly have you served? What MOS?
Re: (Score:2)
Whatever. Gen. Powell was once asked whether America was fighting like weenies from the skies. Gen. Powell's response was: if I have an advantage in a fight, I'm going to use it. I want to win and get it over with as soon as possible.
And Dresden, Viet Nam, and Afghanistan are stupid examples given the cost in American lives during those wars. The U.S. was certainly man enough to risk U.S. lives in those operations. The goal of war is not to show how big is your dick, something the jihadis fo
Re: (Score:2)
Whatever. Gen. Powell was once asked whether America was fighting like weenies from the skies. Gen. Powell's response was: if I have an advantage in a fight, I'm going to use it. I want to win and get it over with as soon as possible.
People who think you can [b]win[/b] a war scare me. Colin Powell is a sad, short-sighted man who thinks in terms of battles, not wars.
The idea is to attain your goals and achieve peace with as little loss as possible. On both sides, because as a conqueror, you assume the responsibility of the conquered. They become your problem.
Has nothing to do with manliness (Score:2)
It's hard, cold dollars. The US will spend 5x the cost of the Afghanistan war over the next few decades just taking care of all of the humans that were injured in the fight - and nobody has budgeted for that.
We gave up feeling the warm blood, entrails, and life draining from our victims when we invented firearms.
Re: (Score:2)
The U.S.?
First use of ships in war: perhaps Hittite empire 1200 B.C.
First military submarine: 1720, Yefim Nikonov for Tsar of Russia
First use of aircraft in war: balloons by French Aerostatic Corps
Oh, don't want to count balloons as "aircraft".
First use of airplane in war: Italians against Turks, 1911
Missiles with exploding warheads used in war: China, 1200 A.D.
Land Mines: China, 1300 A.D.
Re: (Score:2)
So, the type of man that would kill 999 out of every thousand people, I guess.
You assume the only way to reduce population is through killing? That says more about you than me.
Re: (Score:2)
Yeah.. I mean archers didn't have ANY influence over the ancient battlefields... not one bit.
One big difference is that battlefields were filled with soldiers. Not civilians.
Carpet bombing or robots make no distinction.
"Select and attack targets without human control" (Score:3, Insightful)
As the summary says, the proposed ban is on devices which "select and attack targets without meaningful human control". So basically none of what you wrote applies.
In fact, it's the exact opposite of "macrocosm of 2nd amendment arguments in the US" - supporters of the second amendment point out that "guns don't kill people, people kill people"; their argument is that the device is controlled by a person, who can do good or bad with a steel pipe too.
Re: (Score:2)
You are referring to another aspect of the 2nd amendment argument. I was talking about the one regarding the question of who follows the law. In the case of guns, criminals usually flout gun laws, and so it's just the law-abiding who are handicapped. In this case, nations replace individuals, the UN replaces law enforcement, and killer robots replace guns. Making it the macrocosm.
Re: (Score:3)
Nations aren't American people, though.
Americans tend to care about "freedom". Not any particular freedom, mind you, but they cling to the fantasy story that they are somehow "free" in an abstract sense, and any limit on that freedom is a grave assault on their very essence. However, the more recent evolution of this philosophy has extended the concern to others' freedoms as well. The privacy advocates don't have anything to hide themselves, but they're sure that someone out there has horrible secrets they'
Re: (Score:2)
It doesn't really work like that. The US is the #1 exporter of weapons to the world. So if "rogue nations" acquire killer robots, there's a good chance that they will have bought them from a US company.
So i
Re: (Score:2)
That's my point. These "killer robots" are likely to be high-dollar weapons, so they're more likely to end up in the hands of our enemies because those are the kind of weapons US companies are more likely to sell.
Meanwhile, America's lust for personal weapons enriches our enemies.
Win-win. For our enemies.
Re: (Score:2)
US gear only falls into the hands of our enemies in 2 scenarios: we sell it to them, or it gets seized from our allies. Like ISIS seizing US humvees from those Iraqi forces.
If we ban all exports of this, and only use it for our own consumption, that would ensure that they don't get used by our enemies. (This obviously assumes that our enemies would be unable to hack them and turn them against us, which is another discussion altogether)
Re: Automation of the military (Score:4, Insightful)
US gear only falls into the hands of our enemies in 2 scenarios: we sell it to them, or it gets seized from our allies
Actually, there's a third common scenario: you sell it to your 'allies' (you know, nice friendly countries like Saudi Arabia) and they sell it on, leaving you with no control over its final destination.
Re: (Score:2)
Having to murder a bunch of living people via remote control may still give people PTSD. But yes, I don't think it would be as bad - especially with plenty of sleep, little personal fear, and other stressors being reduced.
Re: (Score:2)
So was I. Supposedly drone pilots, who get to work in air conditioned trailers at an airbase in Nevada and go home every day, get PTSD sometimes. I don't know how prevalent it is. And certainly, if they are working in optimal conditions, not chaotic field conditions, you could rotate your combat operators off duty every month or so. Have them spend 2 weeks decompressing and talking about their experiences under the influence of some drugs to help them process what they had to do.
Another thing not fully
Re:Meh. (Score:5, Informative)
It is also about 60+ years too late. The G7es torpedo, the Mk 24 mine, the V-1, the JB-2 Loon.... All killer robots, as is every modern torpedo and missile.
Only if they aren't aimed (Score:2)
The proposed ban is on devices which "select and attack targets without meaningful human control". That's quoting the summary at the top of this page.
> All killer robots as is every modern torpedo and missile.
I'm pretty sure that with "every modern torpedo and missile" a human selects the target and initiates the attack. The definition could be stretched to include victim-triggered devices like land mines, though, which most countries have already banned by treaty.
Re: (Score:2)
Autonomous missiles - those using their own radar emitters, and IR guided missiles - do indeed acquire their own targets. But they do not autonomously launch, even an anti-ship missile sent on an over-the-horizon attack was vectored toward a known target (or group of targets).
Of course this means that they may go astray and end up selecting the wrong target. This definitely happens on occasion, though usually the unintended "target" is another military platform in the area. It is alleged (but the truth of t
Re:Only if they aren't aimed (Score:5, Informative)
As for banning landmines.
There is a landmine treaty which isn't signed by a handful of nations that don't take human rights too seriously. [wikipedia.org] Not surprisingly, the US is on that list, rubbing shoulders with the likes of North Korea, Uzbekistan and Syria.
After all, maiming civilians is what it's all about for these brave warrior nations.
Re: (Score:3)
After all, maiming civilians is what it's all about for these brave warrior nations.
The US uses landmines only along the Korean DMZ, where there are no civilians.
Re: (Score:2)
After all, maiming civilians is what it's all about for these brave warrior nations.
The US uses landmines only along the Korean DMZ, where there are no civilians.
Here is a link to that policy [state.gov]. The U.S. is now explicitly in compliance with the Ottawa Treaty (Mine Ban Treaty) with the sole exception of its use on the Korean DMZ, which is a closed, fenced, thoroughly marked, and patrolled military zone where there is no possibility of civilian (or for that matter military) encounter.
OTOH, the U.S. still uses sea mines, which can sink civilian ships as easily as military ones. Not quite the same problem as land mines, but not completely different either.
Re:Only if they aren't aimed (Score:4, Informative)
The USA also uses limited-lifetime landmines so they don't stick around after a conflict to keep killing people. I may not like landmines, but I understand why they are used, and having ones that self-destruct is much better than ones that stay around.
Re: (Score:2)
Unfortunately the U.S. does use cluster bombs in a big way, and "dud" cluster munitions aren't much different from land mines in the civilian safety problem they present.
Now, it would be possible - and actually straightforward - to make cluster munitions that cannot create a long term safety hazard.
How? By using insensitive explosives detonated by an exploding bridge wire (EBW), or a "slapper" exploding foil (EF), detonator to directly fire the high explosive, with a circuit that has a designed-in power dra
Re:Meh. (Score:4, Interesting)
Like many of the proclamations from the UN, such a ban will have little influence over the development and use of "killer robots".
Some arms control treaties have been relatively successful; but I wouldn't be optimistic here:
Some things are just too convenient to ban; and those are usually a lost cause; but among stuff you can get support for; there seems to be a difference between hardware where you can say 'no legitimate use, period' and hardware where certain uses are forbidden; but there are enough legitimate use cases that the relevant hardware remains in stock, widely available to people relatively low on the food chain, and easily amenable to quiet 'off label' use.
Military small arms ammunition, say, tends to be pretty reliably jacketed; even disreputable outfits don't tend to produce their own dum-dums and hollowpoints; though irregular forces and police-derived paramilitaries are obviously more likely to be using weapons not concerned with Geneva convention compliance in the first place.
Stuff with both 'legitimate' and 'illegitimate' uses has been harder to keep a lid on. Phosphorus is a lovely illuminating agent; and produces great smoke; but it's considered poor taste to use it as an incendiary. Hard to make that stick when a large number of people, relatively low on the food chain, have access to it because of its legitimate applications, though.
In the case of 'killer robots'; the obvious problem is that all the hardware; and much of the software for a 'killer robot' will be identical to that of a 'human directed' robot with some automation of routine navigation stuff; machine-vision-assisted targeting and IFF, and so on. So long as you pinkie-swear that a human will have to press the 'make it so' button; you can develop all the elements, navigation, targeting, etc. that an autonomous killbot would need; but make sure that the firmware running on anything pesky journalists get to see has a human pressing the red button to approve what the autonomous killbot comes up with.
"Good faith" adherents to a 'no killbots' rule will likely find themselves easing their way toward autonomy with some (admittedly plausible) special cases: "We can't keep a human in the loop for our CIWS/RAM system; human reflexes aren't fast enough for contemporary missile intercept; but don't worry, the CIWS turrets are bolted onto the ship and aren't going to go wandering off." Not false; but an autonomous killbot. Then we'll need an 'emergency defensive protocol' for human-oversight robots that lose their link to HQ, whether to technical failure or hostile jamming; which will be OK; because it's strictly for the robots to engage in defensive actions in their existing location until communication is restored!
People who don't give a damn, of course, will just stub out the 'ask human for confirmation' part and carry on with their day.
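To make that concrete, here is a purely hypothetical sketch (invented names, not any real weapon system's code) of how thin the "meaningful human control" layer can be in software:

    # Hypothetical sketch only: invented names, not any real system's code.
    # The point is that "human in the loop" is often just a boolean gate in
    # front of an unchanged autonomy stack, and removing it is a one-line edit.

    REQUIRE_HUMAN_APPROVAL = True  # flip to False and the "ban" evaporates

    def request_human_approval(target: str) -> bool:
        """Ask a human operator to confirm the engagement the autonomy proposed."""
        answer = input(f"Engage {target}? [y/N] ")
        return answer.strip().lower() == "y"

    def engage(target: str, fire_weapon) -> None:
        """Engage a target selected by the (unchanged) targeting software."""
        if REQUIRE_HUMAN_APPROVAL and not request_human_approval(target):
            return  # operator vetoed the engagement
        fire_weapon(target)  # identical code path whether or not a human approved

Everything that does the hard work (navigation, machine vision, IFF) lives below that flag, which is exactly why compliance is so hard to verify from the outside.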
Aside from this, some lucky logic-chopper is going to have the unenviable task of explaining why existing, more or less universally accepted, 'fire-and-forget' missiles and other similar hardware that gets its activation command from a human; but thereafter guides itself to target without external intervention, isn't a killbot; but the more drone-shaped hardware that gets its initial activation command from a human; but thereafter guides itself to target without external intervention, is a killbot.
People claiming that diplomatic pressure and arms control conventions are totally useless are seriously exaggerating their case (land mines, chemical, and biological weapons certainly are way down; people have been jumpy about blinding laser weapons, etc.); but there is a lot less room for optimism when you can't draw a bright line around whatever you are trying to ban and declare it and everything similar Off Limits.
With Killbots, I'm not optimistic: too much of the tech is shared with 100% legit 'human approved, machine assisted' systems; and the excuses to get a foot in the door (even if you care enough about perception of legality to not simply quietly switch off human approval) are too plausible.
Re: (Score:2)
Aside from this, some lucky logic-chopper is going to have the unenviable task of explaining why existing, more or less universally accepted, 'fire-and-forget' missiles and other similar hardware that gets its activation command from a human; but thereafter guides itself to target without external intervention, isn't a killbot; but the more drone-shaped hardware that gets its initial activation command from a human; but thereafter guides itself to target without external intervention, is a killbot.
I think the difference between a killbot and other systems is that it makes its own target selection, as in what are legitimate targets and how, where and when do you engage them. If soldiers have got their hands up and are waving the white flag, they're not legitimate targets even if they wear the enemy's uniform. Will the software consider that they have hostages or human shields? People are going to die; it's not unreasonable to demand a human has to take responsibility for each individual decision. The
Re: (Score:2)
> such a ban will have little influence over the development and use of "killer robots"
Correct, what is needed is a treaty. Unfortunately, governments don't normally ban a weapon preemptively. Chemical, biological, nuclear: all had been seen in battle before people banned them to any measurable degree. In this thread, there are plenty of people pointing out the obvious: it would be much safer if there were more ways of projecting force that don't endanger soldiers as much. For this reason alone, autono
Re: (Score:2)
There was a comedy skit years back of Perez de Cuellar along the lines of "We assure you that the innocent will be UNprotected, the guilty will be UNpunished and the problem will be UNsolved. Please don't kill each other"
Yes, a law! (Score:4, Funny)
That will stop those killer robots from killing us.
If there's one thing Skynet recognizes, it's the authority of the written word.
Re: (Score:2)
Well, yes, actually. Developing high end military killbots requires vast amounts of research, development and funding. If it is illegal to develop such things, most of the countries in a position to resource such an effort will refrain from doing so. Sanctions can be brought against those that ignore the rules.
It might not stop it ever happening, but it will certainly slow it down. Slowing it down means we have more time to understand the issues and how we can avoid going full Skynet.
Re: (Score:2)
I see your point, but I don't think automated killing is just a few lines of code. Any country likely to develop this technology will also want it to at least try to avoid civilians and friendly targets, and to account for all the rules of war.
I'm sure it will happen one day, I'm just saying there is value in delaying it as long as possible.
Re: (Score:2)
Outside of a hot war, I don't think you can justify a system that gives so little opportunity for averting civilian casualties. Especially with the current war on terror, which seems to be where robots are used most often.
The Human Thought Process (Score:2)
It's strange how the human race will go through so much effort to define the " rules " of War, when the effort would be better spent on removing War entirely.
( yeah I know, sometimes you just gotta bomb the shit out of somebody because their ideals conflict with other ideals and they won't take no for an answer. Example: Sharia Law vs The Civilized World )
It's rather strange to wrap my head around the idea of being prosecuted for " War Crimes " ( lol wtf . . it's a War ) for killing someone in an unauthor
Re: (Score:2)
War is hell, right? Even in hell, there are some places that are worse than others.
robot law (Score:2)
Re: (Score:2)
The World's First Robot Lawyer
http://www.donotpay.co.uk/ [donotpay.co.uk]
Voyager S02E13 Prototype. (Score:2)
I'm sure there are many good references, but the only one that comes to mind is Voyager S02E13, "Prototype."
You know the one where they end up getting pulled into a war between robots.
The war ended but the robots kept fighting.
Re: (Score:2)
There is a reason they add the qualifier to the term Science Fiction.
Re: (Score:2)
Yeah, I just thought it was a more relevant example than Terminator.
In Terminator, IIRC, they weren't trying to build an automated weapon; they just ended up with an AI that built them after nuking everybody.
None of the rules they are trying to suggest would even apply to that scenario.
Re: (Score:2)
Yeah: because it's fiction. But that doesn't mean it's impossible.
At some point we're going to get to the tech level where we can quite feasibly make robots that could continue fighting each other long after we're dead. We'd better be damn well prepared for that, because if they get out of control they'll be kinda hard to stop.
Have you ever accidentally fork bombed yourself on a Linux system? No amount of hitting ctrl+c will save you once it's started. When you start making robots that do things in the real
Robo laws (Score:2)
And I Am STILL Keeping My Electromagnets (Score:2)
Magnets are your friends.
In related news... (Score:3)
Killer Robots Will Consider Banning UN
Whatever.. (Score:2)
The powers that be will just find some loophole. Perhaps having the bot phone a human before it fires. A human who will always respond "good to go!".
We've had this before. (Score:2)
Whoever develops them anyway wins world war 3.
Re:We've had this before. (Score:5, Insightful)
Whoever develops them anyway wins world war 3.
No one wins world war 3.
Re: (Score:2)
A notable quote that I'm too lazy to look up:
I don't know what weapons World War Three will be fought with, but World War Four will be fought with sticks and stones.
What a relief! (Score:2)
We are all safe, now that the UN has banned killer robots. Everyone will surely respect this new ban, just like they respect UN bans against war crimes. I've got a better idea...the UN should ban war! Why do they still allow war anyway???
Re: (Score:2)
Yep. They should ban killer humans and even killer animals (like some of the big cats and sharks) before banning something that isn't even a problem yet.
Re: (Score:2)
Ban Wars? I thought the KOREAN WAR was U.N. vs. "Them"?
What about cruise missiles? (Score:3)
Re: (Score:2, Insightful)
The same can be said about any bullet: once fired, it cannot be stopped. This proposal says picking the target must involve humans.
Love Killer Robots (Score:2)
United Nations (Score:4, Funny)
Why not just ban war?
This sounds like: (Score:2)
Black Mirror.
Sentience is not the biggest danger (Score:2)
That's fine (Score:2)
Nobody will give a shit anyway.
Too late, moron (Score:2)
"Once these weapons exist, there will be no stopping them. The time to act on a pre-emptive ban is now."
These weapons already exist, asshole. Sentry Guns have been a thing for over a decade.
Shh. Think Boston Dynamics' Stock (Score:2)
They likely don't have anything that doesn't kill people.
Meanwhile, ISIS is arming off-the-shelf drones (Score:2)
There are more than a few videos out there of ISIS using off-the-shelf 'toy' drones to drop ordnance on people in several conflict zones.
Sorry UN... Every major power is rolling out their own full military versions of drones, and they have been for years. You lost that fight before you ever knew there was a fight going on.
Unless they are from Venus. (Score:2)
I agree that we should ban killer robots, but only if they aren't from Venus. I think Killer Robots from Venus are A-Ok. Heck, they gave me some zucchini from their garden yesterday!
No stopping them? Really? (Score:2)
Clearly, the UN doesn't understand the laws of thermodynamics and power source design.
Unintended consequences (Score:2)
And then there are the unintended consequences. ITAR has been around for decades, but only in the last few years has it gotten so out of hand that even a standard commercial off-the-shelf screw that you can buy in Home Depot can be considered a controlled item simply because it's part of a complete product that is controlled. What's worse, the powers-that-be want to restrict online discussions of firearms on the grounds that such behavior can be considered technology transfer, never mind the fact that that
So like landmines then? (Score:2)
Next in Slashdot (Score:2)
Re: (Score:2)
Obviously, drones are only killer robots when they are controlled by China or Russia! And if you ever catch a US drone, you'd better return it fast, or Trump will make a furious tweet on Twitter!
Re: (Score:2)
The UN Will Consider Banning Killer Tweets
Re: (Score:2)
And if you ever catch a US drone, you'd better return it fast, or Trump will make a furious tweet on Twitter!
Actually, the story on that's changed. Now it's that you should never return it, because we don't need it to be returned - and it wasn't ever any other way!
Re: (Score:3)
The Reaper does not fall under this category.
Re: (Score:2)
"Smart bomb" doesn't refer to the IQ of a human, right?
Re: (Score:2)
Yes, South Korea already has them. (That's the one that has lights.) https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
They probably count as tanks.
Re: (Score:2)
Depending on how asininely they set up the definition, you could make the ban stretch to cover land mines as robots, with "anyone that steps on me" as the selection criteria; they certainly operate without human supervision.
Re: (Score:2)
Cluster munitions and certain types of anti-personnel mines are already banned. Mines kill around 4 civilians for every enemy combatant and are a coward's weapon.
Re: (Score:2)
A human being is probably a better assassin.
Re: (Score:2)
A human being is probably a better assassin.
Human beings may choose to not pull the trigger if e.g. seeing children in their sights.
Re: (Score:2)
I'm quite sure the US & Russia will veto that. Especially the US.
No, we'll just ignore it.
Re: (Score:2)
Does anyone else think it is a little strange that we don't already have a ban on time machines and time travel?
No. That would be like banning gods, magic or other impossible dreams. A complete waste of time, which does flow.
Re: (Score:2)
What about perpetual motion? Oh yeah, that's already banned...
Re: (Score:2)
Those aren't laws...they're just theories. ;-)
Re: (Score:2)
Speaking of things there aren't bans on...
There are no laws or regulations regarding automated vehicles in my and most other US states. Almost every law regarding automobiles is tied to the "Driver" which doesn't exist with an automated vehicle.