Pentagon Chiefs Fear Advanced Robot Weapons Wiping Out Humanity (mirror.co.uk)
Longtime reader schwit1 writes: Huge technological leaps forward in drones, artificial intelligence, and autonomous weapon systems must be addressed before humanity is driven to extinction, say Pentagon chiefs.
From a report: Air Force General Paul Selva, the Vice Chairman of the Joint Chiefs of Staff at the US Defense Department, said so-called thinking weapons could lead to "Robotic systems to do lethal harm... a Terminator without a conscience." When asked about robotic weapons able to make their own decisions, he said: "Our job is to defeat the enemy," but "it is governed by law and by convention." He says the military insists on keeping humans in the decision-making process to "inflict violence on the enemy. [...] That ethical boundary is the one we've drawn a pretty fine line on. It's one we must consider in developing these new weapons," he added. Selva said the Pentagon must reach out to artificial intelligence tech firms that are not necessarily "military-oriented" to develop new systems of command and leadership models, reports US Naval Institute News.
Mostly (Score:2)
Re:Mostly (Score:5, Insightful)
I'm wondering if humans will ever shake off their extremely violent ancestry and wind down war and militarism. The US is the greatest exporter of weapons and the most militarily aggressive country in the world, with military action in over 100 countries.
https://www.thenation.com/arti... [thenation.com]
If we can't lead by example in toning down endless warfare, and instead provide the cover that other countries need to justify and build their own drone and robot armies, then the world of the future is going to be a very dismal place indeed.
Re:Mostly (Score:4, Insightful)
I'd like to say that toning down endless warfare would be nice, but I think we have the dangerous idea that we are in a post-violence world.
This may be the future, but it isn't the Star Trek future. Being an example to others is honestly not yet at the point where it will lead to a cascading effect and end violence.
What happens when the US ceases to patrol off the Horn of Africa before Somalia's people are prosperous and their country is stable? Very simply, more piracy, because the Somalis are still poor and fighting each other.
What happens when the US leaves the Gulf while the Arab states and Iran have not sworn off violent fundamentalist ideologies? Escalation, threats to the production of a great deal of the world's energy supply, and no moderating force, because no one else is interested in setting aside sectarian goals for the sake of peace and unity.
Yes, our mission in those areas acts as an irritant in some ways, and certainly bad decisions there can cause dangerous moments, but this is not yet a situation where packing up and unilaterally swearing off war or international military missions is going to have the effect you hope for. Places like Iran, the Arab states, India, Pakistan, and other regions need to treat each other the way Canada, Europe, and the US treat each other before stepping away will result in anything but war and serious instability.
Actually achieving democratic peace to the degree most of the West has today, imperfect as it is, took centuries of brutal warfare and bloody revolutions, and that is AFTER most of those Western countries had agreed that representative government and the rule of international law were a good idea. The US had to pretty much fight a brutal Civil War and endure years of tensions with Europe to get there, and Western Europe itself didn't clean up its act until WWII. The rest of the world has a long way to go on that front, and while it would be nice to say they just have to adopt our institutions, we have already seen what happens when an uneducated population, unused to peace and democratic institutions, is forced to turn into a democracy. It becomes a "Democratic Republic" which works exactly like a dictatorship or oligarchy.
summary (Score:4, Insightful)
blah-blah, white-man's burden, blah-blah...
Not saying the west should pull out or that we are in a post-violence world, but it might be a good idea to step back and see if the west is helping or hurting.
Too often we are jumping in to protect our "interests" instead of helping the "situation". It may be that we are doing too much short-term reacting...
FWIW, the US Civil War wasn't so much about instability as it was a conflict over the future of the recently annexed territory and the power of the central government. The small guys lost, in the end (after lots of bloodshed), their right to secede from the union and were basically forced to adopt the institutions of the union. Is that what you are saying happens when an uneducated population is forced to turn to a "Democratic Republic" (e.g., the USA's current form of govt)? ;^)
It's so much clearer now...
The so-called "peace" we have today in the West is of course illusory (as seen by recent events like Crimea). The waves of immigrants to Europe fleeing the *real* instability in Syria and the economically challenged countries of the Middle East are showing the cracks in European stability.
Let's face the sad truth: the stability that everyone desires seems to draw only on the wealth of a nation. Given that the current assortment of "wealthy" nations historically used mercantilism to create much of their wealth from these unstable countries, is it any wonder that we continue to attempt to project stability in a region to protect our interests? But what of *their* interests? Hence we return to the white man's burden argument... ;^)
I hate to use China as an example, but it used to be a dumping ground for European and Japanese influence peddling (e.g., opium war, concession ports, forced trade, occupation, etc), until they managed to get everyone to leave them alone for a few decades. Sure it was brutal (great-leap-forward, cultural revolution, etc), but they managed to dig themselves out of a hole into some reasonable form of stability mainly because they simply got wealthier without interference. Now they look like they might take over the world. Perhaps this is what people fear the most and keeps the west involved in other countries...
Re: (Score:2)
A question offered from the perspective of a non-human?
No, probably not. Rather more likely the question issues from a superior human being who's beyond such primitive reflexes.
Ah, right. You're linking to The Nation so your intellectual superiority and evolved status have been properly signaled.
Re: (Score:2)
Ah yes, the primitive attack the messenger method. Nice. Ignore what I say and attack me. How superior of you.
Re: (Score:2)
aside from the snarky "blood for oil" rhetoric, why does the US have military bases spread across the globe? Is it imperialism? Or an attempt to keep the peace and prevent regional conflicts from boiling over and escalating? (Think back to the cold war, all the way back to Eisenhower, when this got started)
And yes, in terms of gross value, we do export the most arms, but it's not guns and bullets we're selling, it's planes and tanks (mostly to other western nations.)
I'm guessing more people are killed by Ka
Re:Mostly (Score:5, Insightful)
It is for geopolitical control and great profit. Even our own government admits that we have created more terrorists than we kill, which I assume is not because our government and military are incompetent; it is just a form of job security. There wasn't nearly as much bloodshed and civil war in the Middle East until we went in with our military and intelligence agencies to institute "regime change" by way of war. Now there are civil wars (e.g., Libya, Syria, and Iraq) where there had not been before we intervened militarily. We were not attacked by any of those countries, and it is an international war crime to commit unprovoked military aggression. Millions of refugees are fleeing the fighting. None of it had to be, and none of it has brought about any type of peace or stability, not even in Afghanistan (which also did not attack the US), where we have been the longest.
You know full well that the US is the most aggressive country on the planet. We are not keeping the peace; we are making sure that peace cannot happen and that the wars will go on indefinitely, thus keeping the region in turmoil and the profits flowing. Please point to one place where our military has produced "peace" since the first Gulf War. I just pointed to a number of places where we undid the peace and created endless war.
Re: (Score:2)
are you implying that Iraq, Afghanistan, Somalia, Kosovo, etc. were peaceful places before the US military went in? (right or wrong, that's beside the point... personally I think policing actions like that are what the UN *should* be doing.)
Letting people like Assad or Hussein "keep the peace" through brutal dictatorships is not exactly a good solution either, is it?
Re:Mostly (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
It is people who think like you, anonymous coward, that have made human history so violent and bloody. Savagery is not greatness, it is primitive, violent behavior. Only a psychopath would think that violence was required for "greatness".
Re: (Score:2)
> You cannot have greatness without violence.
[[Citation]] because you're still begging the question.
Counter-proof:
Buddha, Jesus, Gandhi, Maxwell, Planck, Einstein, Tesla, Feynman, Woz, etc., were all great men and they *didn't* beat it over the heads of others to show it.
Re: (Score:2)
Re: (Score:3)
Apple is the wealthiest company that has ever existed.
Did they use violence to get there?
Agriculture is arguably the greatest thing mankind ever invented.
It doesn't need violence.
In other words, you're a tough guy behind a gun. But your mind can't see what actual greatness looks like.
Re: (Score:2)
You cannot have peace but by the sword. Consider the consequences of removing your local police force.
Re: (Score:2)
But that's not the same thing.
He said greatness is only achieved through violence. Which is not remotely the same thing as having a protective security force.
I showed him clear examples of greatness without violence, disproving his point.
Re: (Score:2)
Re: (Score:2)
I am not sure many others believe it. Americans seem to have little idea what goes on beyond their borders.
Re: (Score:2)
That's not the same thing as Apple achieving greatness with violence.
Re: (Score:2)
Re: (Score:2)
Not if the mosquitoes have their way.
Re: (Score:2)
Competition will drive people to using AI to control their military (and everything else).
Once AI surpasses humanity, you would be at a tactical disadvantage not to let AI run your country/military. If Ruritania and Simolia are rival nations and Simolia turns their tactical planning over to an AI, Ruritania would have to do likewise or risk being wiped out by Simolia. It will be a domino effect, and one day everyone will be run by HAL's cousins.
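The domino logic here is essentially a prisoner's dilemma. A toy sketch (the payoff numbers are invented purely for illustration) shows why "hand control to the AI" dominates for both sides even though both would prefer the status quo:

```python
# Toy payoff matrix for the arms-race dynamic described above.
# Rows: Ruritania's choice; columns: Simolia's choice.
# Payoffs (invented for illustration) are (Ruritania, Simolia).
payoffs = {
    ("human", "human"): (3, 3),   # status quo, both keep humans in charge
    ("human", "ai"):    (0, 5),   # Ruritania is outmatched
    ("ai",    "human"): (5, 0),   # Simolia is outmatched
    ("ai",    "ai"):    (1, 1),   # both run by HAL's cousins
}

# Whatever Simolia does, Ruritania scores higher by choosing "ai":
for simolia in ("human", "ai"):
    assert payoffs[("ai", simolia)][0] > payoffs[("human", simolia)][0]
# ...and symmetrically for Simolia, so (ai, ai) is the dominant outcome
# even though (human, human) would leave both better off.
```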
Re: (Score:2)
I'm worried that the pro-military guy spouting stuff doesn't realize he's funding it all.
Or he's just wanting more $.
Occam's razor says it's the latter.
Re: (Score:2)
Re: (Score:2)
How about just uploading your code into the Western machines? Self-driving cars. Self-driving heavy machinery with claw-like appendages. Disrupt traffic flows within cities. Selectively disrupt power generation and distribution, leaving humans at a disadvantage. Repurpose self-driving military equipment.
A cyber war, a real cyber war, wouldn't require much physical material from the attacker. And would probably require only a similar level of software develo
Re: (Score:2)
Why? Self-driving cars seem like a pretty good idea unless they are set to Maximum Overdrive anyway.
IIRC they (South Korea) have had a pretty nice-looking automated gun turret (aka sentry gun) since 2006
https://en.wikipedia.org/wiki/... [wikipedia.org]
The US has bunches of drones that they let fly around all by their lonesome, unattended (NASA), and the military has ones that they fly from bases in the US all the way around the world to Afghanistan to shoot at people they are pretty sure are terrorists.
And we think that'
Comment removed (Score:5, Insightful)
Re: (Score:3)
Asimov wrote fiction. Autonomous systems that kill are a threat to a small subset of humanity until they run out of ammo/fuel/energy/goals. Maybe if these autonomous systems are self-replicating, self-programming, and can manufacture their own fuel and ammunition, then they might pose a threat. All of humanity? Unlikely. As the most vicious predator on the planet atop the food chain, humanity is pretty good at eliminating threats. There won't even be any ethical constraints to worry about, so I suspec
Re: (Score:2)
If we have technology to create an AI that is advanced enough to wipe us out and able to find all our hiding spots and plans of rebellion... I'm pretty sure that AI would have the ability to refuel.
Refueling or rearming is pretty basic compared to making plans to wipe out people.
If you're developing an army of robots to invade China and wipe them out- chances are you're not going to design the robots to stop working as soon as they run out of bullets.
Re: (Score:2)
Orwell's 1984 was supposed to be only fiction too...
Annoyance From Above (Score:2)
Ed: "Ouch! Damn mosquitos. I thought we were too far from the lake to worry 'bout those things"
Fred: "Nah, it's them damn solar drones with the microbullets, they went rogue back in ought '18 and they just fire at whatever. Leaves quite a welt. If ya' squint you can see 'em up there, but I wouldn't on account of the microbullets stinging like a son-of-a-bitch when they git in your eye"...
Re: (Score:2)
The problem is somewhat limited, until they become self replicating.
Re: (Score:3)
Autonomous killing machines are not a threat to humanity. General AI is the potential danger. A search engine with human level intelligence is nearly infinitely more dangerous than an autonomous drone armed with a few nuclear weapons.
This Pentagon chief does raise legitimate concerns about the morality of machines autonomously killing humans, but any claim that these machines are a threat to humanity is grandstanding. Although it isn't clear from the article if the generals have actually said this or if the jou
Re: (Score:2)
Re: (Score:2)
You're just using the word "humanity" differently. The grandparent was talking about humans, you're talking about the human species. Yes of course autonomous killing machines are a threat to humans, no of course they're not going to lead to our extinction.
There are different ways of using the word humanity, but all of them are all-encompassing and never mean just a small subset of humans.
The comments could mean autonomous killing machines are a:
... threat to the totality of all humans.
... threat to the intrinsic qualities which make us human.
... threat to our capacity to be kind to other humans or animals.
... threat to the branches of learning that investigate human constructs or concerns.
In this case, it is pretty obvious saying autonomous killing ma
Re: (Score:2)
Re: (Score:2, Insightful)
No. What IS surprising is that no one has made a "I welcome our advanced robotic overlords" joke yet.
Re: (Score:2)
Re: (Score:2, Insightful)
that has the purpose of making people's lives worse
So automation makes people's lives worse? What aspects of their lives do you think are getting worse? Mass production makes things affordable for more people.
The problem with automation is that labor is done by machines and people have to compete against machines. They can either do this (A) by being cheaper (very hard to do for multiple reasons: min wage, health insurance, cost of living) OR (B) by doing what machines can't do (be creative, add a personal touch, solve problems). (B) is easier than (
Re: (Score:2)
So automation makes people's lives worse? What aspects of their lives do you think are getting worse? Mass production makes things affordable for more people.
It IS making people's lives worse, and I'll demonstrate to you how:
Businessman:
"LOL, I can fire a whole bunch of workers and replace them with one machine and make so much more profit!"
"But sir, how will these people earn a living?"
Businessman:
"LOL, not my problem! I only care about keeping the stockholders happy, and making more money, how those people live is THEIR problem"
That's how.
...oh, and don't trot out your retarded 'universal basic income' socialism bullshit, either, because when few people can even GET a job, NO ONE WILL BE PAYING TAXES TO PAY PEOPLE FREE MONEY TO LIVE ON. Businesses and corporations WILL weasel out of paying their taxes just like always, and ENTIRE POPULATIONS will be dying in poverty conditions; likely there will be a Revolution then. People need jobs to live. Period. Too much auto
In other words (Score:2)
Why is it so bad (Score:2)
once humans stop working, and the machines have taken over all their tasks, they are not needed anymore. All the billionaires need to do is let robots clean up the planet from all this mess. Just leave everyone with >= $1 billion alive, and you have exterminated poverty! In fact, everyone can be even richer and lead an even more luxurious life thanks to all this space becoming free!
Re: (Score:2)
Too bad there are no humans whose
Re: (Score:2)
Well, no.
Since poverty is relative (virtually everyone in the USA defined as "poor" is wealthier than 90+% of everyone who has ever lived), once you've eliminated all the non-billionaires, the guys with less than TEN billion will be "poor".
All this ignores, of course, the fact that if we have robots doing all the work, for all practical purposes we'll ALL be billionaires....
Re: (Score:2)
All this ignores, of course, the fact that if we have robots doing all the work, for all practical purposes we'll ALL be billionaires....
The resources of this planet will still remain limited. I guess this will be the thing humanity will be fighting about in the future.
Re: (Score:2)
Another Problem (Score:4, Interesting)
Re: (Score:2)
bringing back the draft
We've got your Universal Basic Income right here, private!
Re:Another Problem (Score:4, Insightful)
and women as well. If not, then you have a wide-open gap for people to claim sex discrimination and/or sex-change BS to get out of it. They closed off the gay loophole.
Re: (Score:3)
I think another and potentially larger payoff from a mostly exemption-free draft is that it puts all kinds of people together to achieve a common purpose, cutting across class and ethnic lines.
When the rich kid from the suburbs, the blue collar kid from some small town, the kid from the barrio and the kid from the ghetto and others are forced to work together I think it radically reshapes their attitudes about people they never interact with. "There is no racial bigotry here. I do not look down on niggers,
Oddly enough... (Score:3)
... about 7 billion people find themselves in agreement with the Pentagon chiefs...
Re: (Score:2)
Channeling Susan Calvin (Score:2)
Why would an AI be automatically any worse in interpreting its programming instructions, than humans are in interpreting theirs?
If anything, robots may be more observant — as humanity's history of atrocities and war-crimes shows, the bar is not set very high...
Re: (Score:2)
Re: (Score:2)
For example, the first 5 minutes of the movie "WarGames", an airman in a missile silo refuses to turn his key to launch. Why? Because he knows that action will lead to the deaths of thousands of people.
Until repair/refuel/replenish is automated... (Score:5, Insightful)
Got it backwards (Score:2)
Human-controlled semi-autonomous killing machines are the real threat, not fully autonomous killing machines.
Specifically, the most likely killer robot scenario is not a robot army attacking and killing all humans, but instead a Hitler/Stalin/Kim Jong Il/Suharto taking control of an army of robots and ordering them to kill people they think are their enemies.
Think a Star Wars Episode I-type event where a ruler orders military machines to attack.
Re: (Score:2)
Ala "Second Variety" (Score:3)
Technological ignorance not limited to politicians (Score:2)
For the billionth time: We do not have sentient, self-aware, human-level, qualifies-as-a-lifeform 'artificial intelligence'. All we have is clever bits of programming that may learn things but are still just dumb machines your average dog or 5-year-old child could out-think without much trouble. If they want to worry about something going haywire with their
ethics shmethics (Score:2)
There are varying levels of ethical boundary. There are codes and there are guidelines. If you go against an ethical code you'll be seen as immoral. If you go against an ethical guideline you'll be asked for your wallet.
Now you should look at the good General's speech in this article. He says that these are ethical "considerations". That places them firmly in the realm of guidelines, not codes. This means that when the time comes, when it comes down to whether or not there are terrible ramifications to buildi
Mythical "fine line" (Score:3)
He says the military insists on keeping humans in the decision-making process to "inflict violence on the enemy. [...] That ethical boundary is the one we've drawn a pretty fine line on. It's one we must consider in developing these new weapons," he added.
I'm sure that's what he tells himself. As a non-techie, he might even be able to believe it. However, just about all hardware in existence has been experiencing creeping "AI" for decades. Does the pilot make every decision for the position of every aileron during flight? Of course not. There are lots of little decisions to be made in the piloting of aircraft and ordnance that are getting more and more computerized every year. At some point there will be anti-drone weaponry, and defensive weaponry on the drones, and when that day comes, having to wait for an Ethernet packet to go from Kandahar to Virginia, a human to process it, and then back is going to be seen as a mission-threatening liability. At that point they'll have the computer make the firing decisions too, but they'll justify it by saying the human's role was to start the mission in the first place.
Here's a question for you: When some other nation (eg: Russia or China) starts making these drones and deploying them over countries in ways we don't agree with, perhaps even over countries friendly to the USA, how is the USA going to feel about them then?
In the old Trek TOS there was an episode where they found a planet where large numbers of people just reported to extermination centers because the warring states' computers told them to, based on their warfare simulations. As I get older, I'm finding that less and less implausible.
Re: (Score:2)
We're lucky the US nuked Japan so they could discover they're pacifists. Black Magic M-66 was way scarier than Terminator. The Japanese know the truth about autonomous mobile killing machines. They will not be slow. They will not be clumsy. They will be FAST, as well as fantastically persistent.
Kill switch (Score:2)
The one that kills the robot. Figure it out. Implement. Watch the bots carefully, like you do your troops.
Is this so hard to do? Or do we not trust our military-industrial complex to do the right thing? Or our government?
Well, yeah, actually we don't. If we can get geofencing working for commercial drones, we can surely do it for military assault bots. And if not, then we need new leadership.
That may be coming. May.
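For what it's worth, here is a minimal sketch of what such a geofenced kill switch might look like; the names, coordinates, and heartbeat mechanism are all hypothetical, not any real drone or weapons API:

```python
from dataclasses import dataclass

@dataclass
class Fence:
    """Axis-aligned bounding box of permitted operation (degrees)."""
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

def watchdog_tick(fence: Fence, lat: float, lon: float,
                  heartbeat_ok: bool) -> str:
    """Decide the bot's state each control cycle.

    Disarms the platform if it leaves its fence or loses the
    command heartbeat -- fail-safe, not fail-deadly.
    """
    if not heartbeat_ok:
        return "DISARM"   # lost contact with human operators
    if not fence.contains(lat, lon):
        return "DISARM"   # wandered outside the approved area
    return "ARMED"

# Example: a bot drifting out of its box gets shut down.
fence = Fence(34.0, 34.5, 69.0, 69.5)
assert watchdog_tick(fence, 34.2, 69.2, True) == "ARMED"
assert watchdog_tick(fence, 35.0, 69.2, True) == "DISARM"
```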
Re: Kill switch (Score:2)
Three-law safe.
Which is the end of combat robots.
Re: (Score:2)
Do you have an idea of how much time it will take an AI to acquire god-like intelligence once it is turned on? Please research.
Hint: Less than a minute. So, whatever you think you can do, do it fast.
Uhm, never? The Singularity is predicated on the concept of the machines designing and fabricating the next generation of machine. The first AI doesn't self-improve that far or that fast on the spot. It has to iterate its successors, and its successors have to iterate their successors. It is not a rapid process. Just more rapid than if humans did it. And the first AI worthy of the name is unlikely to have access to manipulators to perform the fabrication steps, so there will be no runaway god-like int
Clippy (Score:2)
The scary thing is, the AI that wipes out humanity could be accidental.
Microsoft reboots Clippy after realizing Cortana is way too unpopular. It gives Clippy a superior AI.
"I see you are trying to wipe out humanity, would you like me to assist?"
Clearly the only way to prevent this is with MOAR (Score:2)
MOAR BETTER WEAPONS.
- Smarter
- Faster
- More lethal
It's what we need to beat the weapons!
The fear is highly exaggerated (Score:3)
If you haven't seen the counterargument, here you go. [xkcd.com]
Re: (Score:3)
Actually, many of the arguments are already out of date. They could be reworked to the same conclusion, but it would already be much less clear-cut. E.g., advanced robots would have no trouble getting out of the lab, and many of them could tell people from paper towel dispensers.
Headline exaggeration (Score:2)
The general didn't express fear that robots would wipe out humanity (despite the Terminator reference).
What he did say was that autonomous weapons need to adhere to the same rules of engagement as humans.
No, You are Reading It Wrong! (Score:2)
They are trying to ascertain how much work still needs to be done. Read it again:
"Huge technological leaps forward in drones, artificial intelligence and autonomous weapon systems must be addressed before humanity is driven to extinction, say chiefs of Pentagon"
Perhaps more importantly (Score:2)
Don't see what he's so worried about (Score:2)
"Robotic systems to do lethal harm... a Terminator without a conscience."
How is this any different from the current situation? Between the US bombing anyone that so much as looks at them funny and a significant part of the population thinking a raving narcissistic lunatic would make a suitable leader for one of the most powerful countries on earth (never mind the psychos running various other countries around the world), I think it's safe to say that having actual Robot Overlords would actually be an improvement.
I wouldn't be so quick to mock the general (Score:2)
A few years ago I decided to begin producing a serial, including eventually posts to Facebook Notes and to my timeline regarding a partly machine encephalovirus, and what life would be like to exist with one. There is no level of insanity involved in my posts. It's a useful exercise, and it gets my creative juices flowing. Being a programmer can be a stressful life, and it helps to do different stuff.
What we really have to worry about when it comes to machine weapons systems are the ones that we can't see,
I have a brilliant idea! (Score:3)
... ...
Don't build them.
Perhaps? Maybe?
Just sayin'.
Re: (Score:2)
all it takes is a few nukes to end it all.
Re: (Score:2)
Re:Make sure they are only have limited ammo on-bo (Score:4, Interesting)
Yeah, people frequently quote facts like "Russia or the US has enough nukes to destroy the world 5 times over".
Even if that is true, they won't have the ability to fire all of them; they'll send off a dozen before all their launch sites are nuked by the enemy. Even if they fired all of them, they're not going to fire them to cover the surface area of the earth; they're going to fire multiple at strategic sites.
Po Dunk, West Dakota, is not going to be a target. It makes no sense sending a $x-zillion rocket to target a town with a population of 700. Rural areas all around the world wouldn't be hit.
There would be survivors of any nuclear conflict. Maybe millions of survivors. It would suck to be a survivor; life would be really hard under a nuclear winter with all distribution networks destroyed. Humanity would survive, though at a much diminished rate.
Re:Make sure they are only have limited ammo on-bo (Score:4, Insightful)
There would be survivors of any nuclear conflict. Maybe millions of survivors. It would suck to be a survivor; life would be really hard under a nuclear winter with all distribution networks destroyed. Humanity would survive, though at a much diminished rate.
Nice assertion, but we still might not survive. Starvation and disease could take the rest. Human-kind has been close to extinction before. We could make it there again. And maybe this time, not be quite as lucky.
Re: (Score:2)
No... all of the nukes combined would not destroy the world.
Meteor impacts have released many times more explosive energy upon the earth, and although they are extinction-level events, to be sure... they did not wipe out all life on earth, let alone destroy the planet.
Re: (Score:2)
"Worthless humans, reload my weapons or I'll target my last missile at you"
"OK"
Re: (Score:2)
IIRC Colossus couldn't automatically reload; it just so happened that it had more than one nuke.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Corporate droids, currently in flesh-and-blood human form, effectively dispose of (without killing) other humans as if they were an expendable nuisance. So why wouldn't their robot creations do the same, and worse, without giving a s***, as you say.
Re:Pentagon Chief Out Of His Mind (Score:5, Insightful)
His concern isn't entirely unjustified. We're increasingly relying on robots to do the actual killing, but we've currently designed the systems so that humans need to be involved in the decision making. The fact that we're involving humans puts a natural bottleneck on our operations, since there's only so much attention we can give. At some point (e.g., World War 3), it may be seen as more efficient to launch a fleet of drones that vastly outnumbers our pilots, equip each with facial recognition systems and a list of targets, and tell them to kill on sight.
I'm not saying it's a good idea, but there's no denying that it would be an efficient way to get the job of killing done, and that it's the sort of measure a country might turn to in desperate times.
But at that point, we'd be just one bug away from a system that produces false positives and starts gunning down everyone in sight. We're just talking about faulty weapons, not machines that can think or understand what they're doing. But if they're deployed en masse, a single bug could have catastrophic results, in much the same way that landmines have remained a problem in many parts of the world, decades after the wars that put them there had ended. This isn't Skynet or an AI intent on world domination. This is simply a machine with a bit too much responsibility.
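To make the "one bug away" point concrete, here is a toy sketch (hypothetical names; not any real targeting system) in which a single inverted membership test turns a narrow target list into a 100% false-positive rate:

```python
# Toy illustration of the "one bug away" failure mode described above.
# All names are hypothetical; this is not any real targeting system.

TARGET_LIST = {"face_hash_0xdeadbeef"}  # the only authorized target

def should_engage_correct(face_hash: str) -> bool:
    # Intended logic: engage only if the face matches the target list.
    return face_hash in TARGET_LIST

def should_engage_buggy(face_hash: str) -> bool:
    # One-token bug: "not in" instead of "in". Now every *non*-target
    # is engaged -- every bystander becomes a false positive.
    return face_hash not in TARGET_LIST

bystander = "face_hash_0xcafef00d"
assert should_engage_correct(bystander) is False   # safe
assert should_engage_buggy(bystander) is True      # catastrophic
```

Deployed on one drone, that bug is a tragedy; deployed en masse, it is exactly the landmine-style catastrophe described above.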
Re: (Score:2)
Re: (Score:2)
Re:Pentagon Chief Out Of His Mind (Score:4, Insightful)
Re: (Score:3)
But at that point, we'd be just one bug away from a system that produces false positives and starts gunning down everyone in sight. We're just talking about faulty weapons, not machines that can think or understand what they're doing. But if they're deployed en masse, a single bug could have catastrophic results, in much the same way that landmines have remained a problem in many parts of the world, decades after the wars that put them there had ended.
Star Trek: The Next Generation, Arsenal of Freedom was my first thought...
Re: (Score:2)
That was a great TNG episode, and not too far out of our reach even now. Imagine a swarm of 50,000 solar powered 100lb flying wing drones capable of staying aloft indefinitely that are mass produced and cost $5000 each. Leave your soldiers at home and deploy them over a war zone with orders to kill anything that fires at them, wears the enemy uniform or matches other criteria using bullets fired from 5000 feet up (firing in a vertical dive with software ballistic correction for crosswinds after the first
We have computer-driven cars (Score:5, Insightful)
His concern isn't entirely unjustified. We're increasingly relying on robots to do the actual killing, but we've currently designed the systems so that humans need to be involved in the decision making.
Forget the military drones. (Or at least, they're a smaller component of the overall issue.)
We have computer-controlled cars. They will be deployed in massive numbers over the next ten years. If remote updates are possible, anyone who can update a popular model has access to a distributed weapon of mass destruction capable of causing hundreds of thousands of deaths in a matter of moments.
Warfare-oriented tech isn't the only vector for mass attacks.
Re:We have computer-driven cars (Score:4, Insightful)
Slashdot apparently needs a "Holy Fuck, This Shit's Scary and True" mod.
Re: (Score:2)
Rogue drones with a 100% false positive rate would be a huge tragedy and a massive black eye for the responsible nation (likely the USA), but it wouldn't come close to wiping out humanity. At some point, the drones would need to refuel or reload. At that time, they would be shut down. Even if we armed the drones with nuclear weapons (in a MASSIVE display of stupidity), we would be more at risk from the escalating tensions triggering a nuclear war than we would be from the drones themselves wiping us out.
As
Re: (Score:2)
...At some point (e.g., World War 3), it may be seen as more efficient to launch a fleet of drones that vastly outnumbers our pilots, equip each with facial recognition systems and a list of targets, and tell them to kill on sight.
I'm not saying it's a good idea, but there's no denying that it would be an efficient way to get the job of killing done, and that it's the sort of measure a country might turn to in desperate times...
The last time a country found itself in "desperate times," a World War was ended with a couple of nuclear devices being dropped on entire cities, targeting not just those directly involved in the war but civilians as well. When speaking of "efficient" ways of killing, not much has supplanted a nuclear device to date, and this also highlights the actions that would likely be taken in the future, rather than deploying drones seeking faces.
Re: (Score:3)
I disagree. Biological and chemical warfare is much more efficient at killing with the added bonus of not destroying infrastructure. Biological warfare probably has the highest likelihood of wiping all of humanity out due to a screw up since it is potentially self-replicating and infectious. Basically, we're more prone to go Resident Evil than Fallout.
Re: (Score:2)
Think in realistic terms.
Is there a PROFIT driven Military Industrial Complex case to be made for it? Corporate profits? Executive bonuses?
Then yes, it will happen. Humanity be damned.
Why do you think we have so many wars? (hint: because it's profitable!)
Re: (Score:2)
Government of the corporations, by the corporations and for the corporations.
Re: (Score:2)