Musk, Woz, Hawking, and Robotics/AI Experts Urge Ban On Autonomous Weapons 313
An anonymous reader writes: An open letter published by the Future of Life Institute urges governments to ban offensive autonomous weaponry. The letter is signed by high profile leaders in the science community and tech industry, such as Elon Musk, Stephen Hawking, Steve Wozniak, Noam Chomsky, and Frank Wilczek. It's also signed — more importantly — by literally hundreds of expert researchers in robotics and AI. They say, "The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."
Unless of course (Score:4, Funny)
Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."
They run on Windows
Crime against AI (Score:5, Funny)
{Unless of course} They run on Windows
Which would be a valid reason to introduce a new international treaty on "Crimes against sentient AI" under which to prosecute those cruel enough to subject a poor AI to running on a Windows platform.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Narrowminded Fools (Score:4, Insightful)
This is an absolutely inane idea for several reasons:
a) They already exist; you can't defend against a sea-skimming missile or SRBMs without an autonomous system. People are just too slow.
b) Bad actors are not constrained by treaties. They'll cheat. We'd be damn fools to put ourselves at a disadvantage.
What makes more sense is to have a discussion about how they're used and how they're employed. I think it's plausible that they be prohibited from travelling autonomously, or that a human must authorize them to continue engaging every hour or day. An outright prohibition is just Pollyannaish claptrap.
Re:Narrowminded Fools (Score:5, Insightful)
Just like in the good old days!
s/spammers/bad guys/g
s/spam/autonomous weapons/g
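The two substitutions above can be applied literally with sed, in the order given (the sample input line is hypothetical, just to show the effect):

```shell
# Apply both substitutions in order: "spammers" is replaced first,
# so the remaining bare "spam" then matches cleanly.
echo "spammers send spam" \
  | sed -e 's/spammers/bad guys/g' -e 's/spam/autonomous weapons/g'
# prints: bad guys send autonomous weapons
```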
Dear Musk, Woz, Hawking, and Robotics/AI Experts
Your post advocates a
( ) technical (X) legislative ( ) market-based ( ) vigilante
approach to fighting autonomous weaponry. Your idea will not work. Here is why it won't work. (One or more of the following may apply to your particular idea, and it may have other flaws which used to vary from state to state before a bad federal law was passed.)
( ) Bad guys can easily use it to harvest weapon designs
(X) Defense systems and other legitimate uses would be affected
( ) No one will be able to find the bad guys
( ) It is defenseless against brute force attacks
( ) It will stop autonomous weaponry for two weeks and then we'll be stuck with it
(X) Users of weapons systems will not put up with it
(X) DARPA will not put up with it
(X) The military will not put up with it
(X) Requires too much cooperation from bad guys
(X) Requires immediate total cooperation from everybody at once
(X) Many weapon producers cannot afford to lose business or alienate potential employers
(X) Bad guys don't care about illegal weapons in their arsenals
( ) Anyone could anonymously destroy anyone else's career or business
Specifically, your plan fails to account for
( ) Laws expressly prohibiting it
(X) Lack of centrally controlling authority for weapons
(X) Open relays in foreign countries
(X) Asshats
(X) Jurisdictional problems
(X) Unpopularity of weird new treaties
(X) Public reluctance to accept weird new forms of arms control
(X) Armies of worm riddled broadband-connected Windows boxes (!)
(X) Eternal arms race involved in all filtering approaches
(X) Extreme profitability of autonomous weaponry
( ) Joe jobs and/or identity theft
( ) Technically illiterate politicians
(X) Extreme stupidity on the part of people who do business with bad guys
(X) Dishonesty on the part of bad guys themselves
and the following philosophical objections may also apply:
(X) Ideas similar to yours are easy to come up with, yet none have ever been shown practical
( ) Any scheme based on opt-out is unacceptable
( ) Blacklists suck
( ) Whitelists suck
( ) Countermeasures should not involve wire fraud or credit card fraud
( ) Countermeasures should not involve sabotage of public networks
( ) Countermeasures must work if phased in gradually
(X) Why should we have to trust you and your treaties?
( ) Incompatibility with open source or open source licenses
(X) Feel-good measures do nothing to solve the problem
(X) I don't want the government limiting my arsenal
( ) Killing them that way is not slow and painful enough
Furthermore, this is what I think about you:
( ) Sorry dude, but I don't think it would work.
(X) This is a stupid idea, and you're a stupid person for suggesting it.
(X) Nice try, assh0le! I'm going to find out where you live and burn your house down!
Re:Narrowminded Fools (Score:4, Insightful)
Wow, what a load of rubbish.
Your post can be summarized in 3 sentences:
1) Legitimate militaries will not follow/trust the treaty
2) Uncontrolled individuals/groups will ignore the treaty
3) Something like this has never existed, there is no centrally controlling authority and/or treaties can not work.
You are wrong on all three. I just need to mention the treaty on landmines (the Ottawa Treaty). It works. You can control the market and the militaries, at least the bulk of it. Chemical weapons are likewise covered by a treaty (the Chemical Weapons Convention), and it works: even there, the number of incidents from uncontrolled individuals/groups is low.
Some of your points are also rubbish, like:
(X) Requires immediate total cooperation from everybody at once
(X) I don't want the government limiting my arsenal
This is not fantasy, banning weapon technology world-wide has been done before. Countries joined voluntarily, one by one, and are controlled by each other.
Re: (Score:2)
This is like those feel-good petitions against landmines. It's easy for us to refrain from using them until, suddenly one day, they get used against us.
Re: (Score:2)
How and when were chemical weapons universally abandoned? After they were used in large quantities.
How and when were nuclear weapons regulated? After they were produced in large quantities.
Sure, many will have cried wolf before the turning point, but the past predicts what will eventually happen with AI and robot weapons. The problem is, this time the weapons display complex behaviour, making a rogue entity particularly hard to contain. We'd better prepare for such an event by developing anti-AI tech (such
Re: (Score:2)
What you write makes sense, and is true. Perhaps this is a reason for the Fermi paradox: for a technological (= potentially detectable) civilization, progress brings the advances that can kill the civilization earlier than the safeguards. For example, we already have nuclear bombs, mutually assured destruction, religion, robotic warfare, climate change, and ISIL-like uncontrolled minorities who can nevertheless act and destroy on a global scale.
Then there is nobody left alive to make further
I saw the movie screamers (Score:2)
Autonomous intelligent self propelled evolving landmines turn on humans. 'nuff said.
Blue screen of death (Score:2)
takes on a whole new meaning.
Yes! (Score:3, Funny)
I heartily agree.
However, I do want research to forge ahead with gorgeous robots who force you to do things to them. Gross, unhygienic things.
Re: (Score:2)
So, instead of sending robots to kill ISIS, we send robots to rape them?
I am not entirely against this plan.
Re:Yes! (Score:5, Funny)
We don't have to kill ISIS, just distract them until they're 60 or so.
Cheaper yet, we should be airdropping porn, pot and pizza. The three P's of victory!
Re: (Score:2)
I like this plan even better. They were interested in sex slaves, so let's just drop some robots on them.
Re: (Score:2)
Re: (Score:3, Funny)
... later it will tell you to buy more things so that it will return to you.
Sounds like my wife.
Postal drones will be made to go postal anyway (Score:5, Funny)
I don't see how it won't be easy for anyone to retrofit a postal drone to.. go postal.
Is it possible? (Score:5, Insightful)
Like the summary says, nuclear weapons require expensive and hard to obtain raw materials and a significant amount of technology not common in the civilian space. This is the only reason, IMHO, that nuclear proliferation treaties work as well as they do. How does this group expect governments to keep a lid on military tech that relies on ubiquitous technology found throughout the civilian economy?
Re:Is it possible? (Score:4, Interesting)
Re: (Score:3)
Banning these things from the civilian economy, or placing restrictions which would reduce demand (example: needing a licence), would certainly slow down development greatly. The military's ability to finance this sort of tech is small compared to society's. Computers are an example where the military benefits from development financed almost completely by society. (Computers are only an example of the funding model; I'm not suggesting limiting computers. They do way more good than bad, which probably won't
Re: (Score:2)
Like the summary says, nuclear weapons require expensive and hard to obtain raw materials and a significant amount of technology not common in the civilian space. This is the only reason, IMHO, that nuclear proliferation treaties work as well as they do.
On the other hand a single nuke is very powerful and easy to conceal, which is why nuclear proliferation treaties are very tough to enforce.
But no one really cares if you have a dozen autonomous weaponized drones; that's not going to give you a decisive military edge, and any more than that you won't be able to conceal.
How does this group expect governments to keep a lid on military tech that relies on ubiquitous technology found throughout the civilian economy?
Make it against international law; people will occasionally violate the law, but those will be only small instances. The real cause for concern is a large-scale deployment and arms race which a la
Re: (Score:2)
Futile (Score:2)
This tech exists already and only needs polishing. Auto-tracking and aiming. That will continue to be developed regardless. Slap it on a mobile Google car bought at the dealer, give it a route, and let 'er go!
Having humans decide who gets killed by the robot, as opposed to the robot deciding, is an added feature, and thus disposable next to the core "dancing bear" functionality.
For it to work it has to be banned by international law so rogue states can be punished. But building one will be trivial with soon-to-exist pieces.
Re: (Score:2)
...so rogue states can be punished.
With drones...
Re:Futile (Score:5, Interesting)
Start with land mines. These are autonomous weapons with little or no AI, and they have caused far more devastation to civilian populations.
The AI arms race will basically be to develop more accurate enemy identification. The "low" tech AI will have more "friendly fire" incidents than the "high" tech AI.
The old game of nethack warned not to genocide shopkeepers. If you genocide them you would kill all humans, including your own character. This seems applicable to AI autonomous weapons.
On the other hand, a theoretically competent AI running a weapon could make choices of not engaging despite a command or even an attack on it because of the risk of civilian or collateral damage. A human holding a weapon would not necessarily be able to make the dispassionate trade of self-sacrifice for some number of strangers or monuments.
Re: (Score:3)
There are already treaties banning the use of land mines. This was one of Princess Diana's causes. Of course, that only applies to nations or groups that honor treaties or international laws, and would require other nations to enforce the restriction if violated.
screamers (Score:2)
Start with land mines.
Good god no! that's the plot device for the movie screamers.
http://www.imdb.com/title/tt01... [imdb.com]
Re:Futile (Score:5, Interesting)
It's similar to the situation at the end of WWI. Versailles called for wide-ranging disarmament among all the belligerents, which was all well and good in theory. In reality, of course, a great deal of the R&D that had gone into new weaponry (tanks, planes, ship designs, and so forth) still existed. In fact, the most valuable commodity of all, the German plans for the 1919 campaign that never was, still sat in archives, just waiting for someone to come along and dust them off.
The cat is out of the bag, has been out of the bag for a few decades now. When most of us look at devices like Mars Rovers, we're impressed by the technology and science, and yet that very same technology is easily adaptable to building autonomous weapons. Even if the Great Powers agreed, you can be darned sure they would still have labs building prototypes, and if the need arose, manufacturing could begin quickly.
well, a nuclear robot? (Score:2)
no no no, nuclear powered shark (Score:2)
... with laser.
I have no fear of AI, but fear AI weapons (Score:5, Insightful)
Instead, it is the rise of a human psychopathic tyrant working with a force of soldiers that obediently kill at his command, with no chance of moral rebellion within his own force.
Re: (Score:3)
Not necessarily a tyrant. Any psychopath with money.
Re:I have no fear of AI, but fear AI weapons (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
Well, robbery would be a bit tougher than general mayhem. In the foreseeable future you'd probably need a human in the loop, for example to confirm that the victim actually complied with the order to "put ALL the money in the bag." Still that would remove the perpetrator from the scene of the crime. If there were an open or hackable wi-fi access point nearby it'd be tricky to hunt him down.
This kind of remote controlled drone mediated crime is very feasible now. It wouldn't take much technical savvy to
Re: (Score:2)
But let's assume these are as easy to hack as a console, sure the technical limitation still brings the potential down but there would be a decent number of people who could pull that off. Definitely an organize
Re: (Score:2)
Seriously. How many people would rob a liquor store if they didn't have to be there in person and there was no chance of being caught?
Interesting angle that I haven't heard brought up much ... and me without mod points today.
Re: (Score:3, Informative)
Re: (Score:2)
Potato, potahto
Re: (Score:3)
How many times psychopathic tyrants were toppled because of moral rebellion within own forces? Hitler? Stalin? Saddam? Current-North-Kimchi-incarnation?
I think that benevolent/democratic governments run a much bigger risk of unrest, because they give too long a leash.
I suppose your real point is not that psychopathic regimes will be easier to rule (they already are), but rather that countries currently democratic to some extent might turn into psychopathic regimes because of not having t
Re:I have no fear of AI, but fear AI weapons (Score:5, Interesting)
There have not been 4 attempts to do this (Hitler, Stalin, Saddam, North Korea), but 400. We stopped well over 90% of them, but you don't hear about them.
As for those people you mentioned, many of them were hamstrung by ethical people whose refusal to kill slowed down their crazy schemes.
Nothing rare (Score:2)
How many times psychopathic tyrants were toppled because of moral rebellion within own forces?
Countless times. It is trivial to find examples throughout history. Look up military coup and you'll find no end of examples of tyrants being deposed by their own military forces, often for moral reasons.
AI weapons == less collateral damage? (Score:4, Interesting)
When I was deployed to Iraq, we had a problem with RKG-3 attacks on our MRAPs. At the time, it was one of the few things that could do real damage. RKG-3s are hand-thrown EFP devices. When the insurgents would attack, they would target the vehicles that had crew-served weapons pointing in the other direction. This would mean that the crew member on the weapon would not always see who threw the grenade. The lead and follow vehicle gunners would have their own fields of fire to scan and would probably miss the thrower as well, leading to confusion as to who is attacking. Confusion, explosions == bad things.
An automated system that scans 360 degrees hundreds if not thousands of times a second, which can acquire, track, and if need be eliminate the target, would surely cut down on collateral damage and innocent people getting killed.
Re: (Score:2)
I think you're correct, and that has me concerned. War has become too sterile and easy. Making wars easier and even less costly for those who wage them will make it even more likely we rely on war as the first solution.
War is supposed to be hell; that ensures that few want to engage in it.
On a related note, this is why an all-volunteer army is a bad idea. With no draft, most of the country is shielded from war. Those who volunteer for war know what they are risking.
Re: (Score:2)
a force of soldiers that obediently kill at his command, with no chance of moral rebellion within his own force.
Any bomb on a timer fits that description.
Drones (Score:5, Insightful)
One of the things that has consistently mystified me about Americans' complacency with drone warfare is the underlying assumption that our current monopoly on drones is going to last forever. If it's ok for the U.S. to use drones to assassinate "terrorist" anti-American agitators in Yemen, what are we going to say when China starts using drones to assassinate "terrorist" Chinese dissidents on American soil, or Europe, or elsewhere? For all intents and purposes, we're already using killbots, and the really important point here is that airborne killbots can be used (for now) with impunity across borders.
"American Exceptionalism" basically means we allow ourselves to commit war crimes with impunity.
Re:Drones (Score:5, Insightful)
Exactly. What is the difference between an automated system and one with a human at the helm when you can just replace the human with impunity if he decides he doesn't want to help you anymore?
It's not like some criminal gang where a defector could mean consequences. A defector from the drone murder program is just... replaced. Even if 100% of pilots became disgusted with the job and refused within a year, it wouldn't even slow them down; it would just increase their training costs.
Right now, there effectively is no difference between the existing drone program and automated kill bots. The problem is what people want to do and are allowed to get away with. As long as they can murder with impunity, the methods which they use are unimportant.
Re:Drones (Score:5, Interesting)
Admittedly, never having been an infantryman, pilot, or any other sort of military man myself, I still suspect it's much easier for a guy sitting safely in a chair to make a moral decision about a target than it is for a guy in a life-threatening situation.
A drone operator can loiter around a target for a long time until he or she is confident said target is properly identified. A jock in a fighter-bomber does not have that luxury, and also exists in constant fear that someone is going to pop up with an anti-aircraft weapon that will end his life. The drone operator only has to worry that an anti-aircraft weapon will ruin his afternoon with extra paperwork. I know which one I'd rather imagine hovering over me, deciding if I am an enemy combatant or just a guy going out to milk the goats.
The separate question is whether drone warfare lowers the barrier to entry such that we conduct operations in theaters we would otherwise forgo, given the infrastructure and associated costs of supporting large numbers of manned aircraft in the area. This is of great concern: if we make warfare too easy, we might find ourselves doing more of it. I am not buying the argument, though, that drones are equivalent to mindless kill bots, or worse than the existing manned alternatives in any given situation, all else being equal.
Re: (Score:2)
Re: (Score:2)
There is a higher political cost when you are putting soldiers' lives at risk though. If you make war "too safe" for one side, you risk lowering the barrier for aggression.
Re: (Score:2)
The main difference would be that humans operators can testify at a war crimes tribunal. We know that the Bush-Cheney administration was sufficiently scared of ending up in front of a tribunal that they felt the need to introduce the American Service-Members' Protection Act. It seems reasonable to assume that this fear also reduced the number of drone killings that they authorized.
The Obama administration tried to make the programs more secret, but that backfired again due to human operators. Stories like t
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
2. Our drones are effectively remotely piloted aircraft. Not "killbots". There is some chair jockey in a building in the Nevada desert who pilots the craft and fires the missiles and then goes home to be with his family after his shift is done.
True.
Just as an FYI though, it seems that being the pilot, is a job that comes with way more stress than was anticipated (or than the general public appreciates).
http://www.nytimes.com/2015/06/17/us/as-stress-drives-off-drone-operators-air-force-must-cut-flights.html [nytimes.com]
Re: Drones (Score:2)
Israel, China, and Pakistan already have military drones. Please, do try to keep up. https://en.m.wikipedia.org/wik... [wikipedia.org]
Re: (Score:3)
Targeted killings are nothing new and have been going on pretty much for as long as there have been governments. The US didn't start it, whether the US considers it OK is irrelevant, and nothing we can do will cause other nations to stop. There are usually much simpler and more effective ways of doing it than drones.
The only thing that would constitute "American exceptionalism" is if we unilaterally stopped doing
May as well ban rain (Score:5, Informative)
I just don't see the point. These will be developed, and no amount of banning them will stop it or even slow it down.
Re: (Score:2)
Re:May as well ban rain (Score:5, Insightful)
The moral high ground is a killzone, should hostilities break out.
Re: May as well ban rain (Score:2)
There is a very good reason for a ban. It will obviously fail to prevent development and manufacture. What a ban would realistically prevent is tactical deployment against a less-than-existential threat.
Re: (Score:2)
Earlier campaigns (Score:2)
Richard Vaughan's "No Evil Robots" [cs.sfu.ca]
Stop Killer Robots [stopkillerrobots.org]
We know one thing for sure (Score:2)
Convince the Pentagon (Score:3)
The only people who follow laws... (Score:2)
...are people who aren't likely to be the ones who trigger some form of global genocide.
Does anyone really expect governments will obey these laws? Would there be a way for more than a handful of tightly controlled people to even know until it's too late? The pieces are very separable; they can be assembled by a relatively small number of people. It's not at all like a nuclear bomb, which always will look like a nuclear bomb, and quite a few people have to know they're designing and testing a device capable
Re: (Score:2)
The problem is not governments. Governments can be toppled, heads of states assassinated (if need be) or jailed.
The problem is when you get non-state entities that acquire an army of killerbots......
What stops a 15 year old script kiddie that hacks a battalion of these things and sends them on a rampage?
Re: (Score:2)
> What stops a 15 year old script kiddie that hacks a battalion of these things and sends them on a rampage?
I wanted to make a joke, because this is a great setup to one, but it frankly won't be funny in a few years.
The parts of the world that seem automated are already exploited, no exceptions. The biggest and most glaring sign of this is SWATting. Hackers identify that the government is responding with overwhelming terror force to phone calls. Hackers "hack" this system by simply spoofing a dangerous sc
Re: (Score:2)
You cannot legislate science...... (Score:2)
You cannot legislate knowledge away. At best, it will simply delay the development of autonomous weapons by a few years.
At worst, it will allow rogue nations and terrorist organizations to leap ahead, which in itself can have disastrous consequences....
cyborg union just protecting their turf (Score:2)
It's a little late folks.... (Score:5, Insightful)
That ship has sailed a LONG time ago... We've been making such weapons for decades.
What's a mine? What's a cruise missile? Proximity-fused ground-to-air shells? Homing torpedoes? What's all that "fire and forget" stuff we've been building?
I'm afraid the cows are ALREADY out of the barn on this....
Re: (Score:2)
The thing is, no one is combining the mine's logic of "acquire target near me, that is enemy" with the cruise missile's "fly to externally chosen target", and thinking that's a good idea. With the tech becoming just good enough now, the fact that these are being smooshed together is the new thing. They'll be sold as "thing that identifies enemies and attacks them", when in fact they will be "thing that can pick out humans and kill them at distance"- a mine that isn't content to chill for a few decades befo
Re: (Score:2)
Computers Are Getting Better Than Humans at Facial Recognition [theatlantic.com] If computers are better at identifying enemy targets than humans, why wouldn't you want a computer to pick the target? Isn't one of the selling points of autonomous cars that they'll lead to fewer accidents compared with human drivers?
Re: (Score:3)
The heck they aren't.... Anti-ship missiles have done EXACTLY that for years: fly to waypoint, engage any target you see, and go boom. Cruise missiles likely have the same capacity: fly to waypoint, search for and engage a certain kind of target (say, a mobile ground-to-air radar) in the area, go boom when you find it. Both are aimed in the general location of an enemy and turned loose to find their own targets.
Mines are selective in the targets they engage. Antipersonnel mines differ from antitank mines in ho
Re: (Score:2)
A modern naval mine, for instance, is deployed and waits for an activation to autonomously engage targets. Does that meet your criteria?
While there is some room to nitpick his examples, they're largely relevant despite your dismissal of them -- and that's part of the problem. For example, an autonomous homing artillery shell might not fit your definition as it requires human interaction to initially deploy it, but once deployed, it chooses its own targets. The same is true for many other potential uses of a
Re: (Score:2)
Out standing in his field perhaps?
Close the barn door... Or be prepared to wait until he comes home...
Already here (Score:2)
The US army and navy already have automated anti-aircraft and anti-missile systems deployed and in-use, as have many other countries.
Do these count in a ban of "robot" weapons systems?
Re: (Score:3)
I think the point is to ban autonomous weapon systems, not automatic ones. What's the difference? An automatic weapon system can destroy targets you choose; an autonomous weapon system can destroy targets it chooses.
useless (Score:4, Insightful)
You can't "ban" something that consists of little more than putting together some guns, some standard AI, and some standard robotics platforms. There is no way to detect violations of this ban. It's like trying to ban the use of electric motors in offensive weapons. Good luck with that.
The main purpose of such a ban is to make a bunch of people feel good about themselves and to let them demonstrate to the world what wonderful and important humanitarians they are.
Next up, perhaps (Score:5, Interesting)
Prominent world politicians urge adoption of new changes to the C++ standard concerning private inheritance and templates.
What this is trying to do is imply that because they have technical expertise in how dangerous AI-controlled weapons are, that technical expertise makes them experts about political decisions concerning weapons. It doesn't, and there is no more reason to pay attention to them than to the average guy in the street (who understands that some weapons are dangerous, and may have opinions on their use, but certainly doesn't get a national press release about it).
Think like a soldier in the next war for a moment (Score:5, Insightful)
... Okay... so... you have an option to use a kill bot against the enemy that wants to kill you... and if you go out there... you could be killed.
Or... you send in your terminator bot and worst case they scrag the robot.
What are you going to prefer here?
A lot of people offering opinions here are not speaking from that perspective. They're speaking, as often as not, from the perspective of some civilian ideologue who knows they're not going to go to war.
I know that if I go to war... I am going to want the best weapons my society can make for me along with the best defenses the best training and ideally leaders that are not complete fuckwits.
That means I want the robots. I want them and I want them to be fucking vicious.
Go on youtube and you'll see US soldiers cheering when air support shows up and blows the fuck out of someone shooting at them.
https://youtu.be/1IcvjD4VVjY?t... [youtu.be]
Now... if you are a country that has the ability to build kill bots... and you might be on the firing line... do you or do you not want to use killer robots to kill your enemies?
You have to put your brain into war mode to understand the question.
My vote... is yes.
https://www.youtube.com/watch?... [youtube.com]
When I go to war... I go to WAR.
Re: (Score:2)
https://www.youtube.com/watch?... [youtube.com]
https://www.youtube.com/watch?... [youtube.com]
Have fun with that idea.
Your old garbage will be trashed before you even know you're under attack.
Relying on retrograde cold war tech for your front line is a mistake.
Re: (Score:2)
I think you're being paranoid by assuming that our own robots will turn against us or be a general hazard to our soldiers.
At worst, they'll be like land mines... sure... dangerous... if you walk through the minefield. Stay back a bit and you're fine.
The military isn't going to use weapons that put our own soldiers at risk.
Re: (Score:2)
login and I'll debate the point.
Re: (Score:3)
You're implying that we just killed a quarter million people with no context, reason, and that we did so intentionally.
No I didn't, I was doing it in the context of the Iraq war where they're understood to be excess deaths.
You're also attributing all deaths to our actions when the responsibility has to be spread around to include the taliban, various terrorist sponsors, and natural forces like famine etc that kill people without any direct human volition.
I glibly dismiss the question because it isn't intellectually valid.
If you want to talk about death tolls in war zones we can do that. But laying all the deaths at our feet like we intentionally killed all those people, had no reason to, and are solely responsible is invalid.
It's a standard methodology: over the period, X deaths would normally be expected but instead Y occur; Z = Y - X is roughly the number of excess deaths attributable to your actions. You're not nearly as guilty as someone who pulled the trigger, but in a debate about whether an act contributed to the greater good, the fact that Z lives were lost due to that act is completely relevant.
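As a minimal sketch of that arithmetic (the numbers below are made up purely for illustration, not real casualty figures):

```python
def excess_deaths(expected: int, observed: int) -> int:
    """Excess deaths Z = Y - X: deaths observed over the period
    minus the deaths that would normally have been expected."""
    return observed - expected

# Hypothetical illustration only: 100 expected, 150 observed.
print(excess_deaths(expected=100, observed=150))  # prints 50
```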
Wrong, we tried to actually rehabilitate Russia. We would not have funded their space program or made so many diplomatic gestures if we wanted to treat them like an enemy.
There was even serious talk about inviting them into NATO.
As to surrounding them with enemies... all we wanted to do was secure the self determination of past victims of their aggression. Our intention was not to threaten Russia but to give other nations a chance at freedom, modernity, and prosperity.
NATO was an alliance formed
Re: (Score:3)
quote tags would be so much easier to read.
""No I didn't, I was doing it in the context of the Iraq war where they're understood to be excess deaths.""
excess deaths?
First, we're not talking about Iraq. I told you that.
I wasn't talking about Iraq, I was referring back to a previous statement I'd used that happened to include Iraq.
Second, we're talking about Afghanistan.
Third, "excess deaths" what does that mean?
Exactly what it sounds like, the additional deaths that occurred because of the conflict.
Fourth, your cited kill number did not include context. It did not separate out people who would have died if there were no war; the actual casualty figures are highly estimated and no one really knows what they are. You conflated people killed by the enemy with people killed by the US, soldier deaths with civilian deaths, deaths from famine or disease with deaths from weapons, etc.
It's approximate, which is why there are large ranges given in the estimates (I've actually chosen conservative ones), but wars can certainly cause famine and disease and those deaths matter.
Anyway, we've come to the part of the discussion where I have to start looking things up.
In regards to the Afghan war, wikipedia puts the number at:
26 thousand.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Alright lets look at that source:
During the war in Afghanistan (2001–p
Re: (Score:3)
I don't know how many times I have to make it clear to you that I'm not going to talk about Iraq, because it's a fucking whine and I don't find it useful on the issue of general US foreign policy or geopolitics.
I wasn't talking about Iraq!!
Yes I mentioned it at the end of my last post but the vast majority was about other topics.
Your 250k number was mostly talking about Iraq so far as I know and the methodology on that number is a fucking joke as well. But you know what... I'm not talking about it.
Hmm, what did I say again?
"Afghanistan there's possibly in the range of 250K deaths, and a lot of the country is still under Taliban rule."
So that number is specifically about Afghanistan, your own source that contradicted you was about Afghanistan, the only time I said Iraq was in explaining that I'd already talked about a similar number for that war.
As to excess deaths... I don't find this a useful statistic because it conflates all deaths into one number. It's more complicated than that, and I don't appreciate oversimplifications.
But taking only direct conflict casual
Everyone is overlooking a key point (Score:2)
Once autonomous weapons become commonplace, the 2nd Amendment will guarantee that any US Citizen should be able to own and (not) operate one for fun and self-defense. It will, if I read the NRA talking points correctly, make for the absolute safest place in the entire universe.
Re: (Score:2)
If you knew anything about the NRA you'd know that they only promote responsible use of firearms. Yes, you can neckbeard that that's an oxymoron, but at least half the voting population here still believes there are very important reasons why the right to bear arms needs to exist.
Re: (Score:2)
Actually, no. They lobby to eliminate restrictions on the acquisition of firearms of pretty much all types by nearly everyone. Their justification is typically that more guns = a safer society, principally because criminals will fear to prey on people who might have a weapon. It's a powerful message that speaks to the fear of every human that they could be preyed upon and/or find themselves helpless in a violent situation. It's been extended to protecting others as part of patriotism.
Promoting the responsible
Cyberdyne Systems opposes this (Score:2)
The 3 Laws are Hard :( (Score:2)
Re: (Score:3)
Even if it did, it still would not stop its development.
How do you make it so that nobody in the world can program an AI capable of pulling a trigger? What about a baseball bat? How do you stop AI written in some 15-year-old's basement from firing a gun that does not have a traditional trigger?
Short answer: you cannot... the genie was let out of the bottle many, many years ago... We cannot stop it at this point unless we give up electricity... worldwide... and manage to destroy all generators, all solar, wind
Re: (Score:2)
No AI is required; simple control laws (very simple)
Re: (Score:2)
Re: (Score:2)
Aside from the improbable business side of it, with respect to the technology, both, but particularly Tesla, are pioneers in their respective fields. Current day G