Not Quite a T-1000, But On the Right Track
New submitter misanthropic.mofo writes with a look at the emerging field of robotic warfare. Adding, "Leaping from drones to recon 'turtlebots', humanity is making its way toward robo-combat. I for one think it would be interesting to pit legions of robot warriors against each other and let people purchase time on them. Of course there are people that are for and against such technology. Development of ethical robotic systems seems like it would take some of the fun out of things; there's also the question of who gets to decide on the ethics."
T-100, huh? (Score:1)
Either I'm not the geek I thought I was, or you, sir, are not. What is a T-100? Did you mean T-1000?
One of us is turning in their geek card tonight.
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
I thought the robots were the enemy in that reference?!? The last thing I want to hear about are turtlebots with frickin' laser beams attached to their heads.
Hell, the Big Dog [bbc.co.uk] is scary enough for me.
Re: (Score:2)
Hell no. I know which is the winning side. I'm teaching the robots everything I can about other humans.
Re: (Score:1)
Interesting. The Slashdot RSS feed has both versions of the headline... corrected and incorrect.
Re: (Score:2)
Noticed it too. I just assumed the article submitter was progressing from T100 to T1000 network speeds.
Re: (Score:2)
Robot wars (Score:2)
Re: (Score:3)
And perhaps that will last just as long as it takes for one country to face defeat of its robots, whereupon a switch will be flipped and humans become a legitimate target for autonomous machines.
You abhor drone strikes now? Wait till there is no human in the loop.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
All the "robots" and machines in the world doing battle won't ever change the fact that only slaughtering young men and women (sent their, usually, by men wealthy men closer to their death than their birth) really has an impact on societies and the need to push for or withdraw from war. Frankly, not even much demand over humans these days, either as witnessed by the last twelve years.
Re:Robot wars (Score:4, Interesting)
Re: (Score:2)
You're living in a videogame world.
In reality, the controllers - if any, if they're not autonomous 'bots - will be so busy trying to get the enemy drones that they'll slaughter the people who they're allegedly protecting - y'know, the folks whose country it is? - without even noticing.
Come *on*, when a drone, or a cruise missile, targets in the middle of a village or city, you really think that the explosion, the shrapnel, and the falling bits and pieces of buildings and/or vehicles won't kill and maim innoc
Pigeons? (Score:1)
Re: (Score:3)
I may be winging it, but I think you're just squabbling about the title.
Re: (Score:2)
Actually, pigeons [wikipedia.org] have a long military history; the Taliban still forbids their possession or use in Afghanistan. 34 pigeons were decorated with the Dickin Medal [wikipedia.org] for "conspicuous gallantry or devotion to duty while serving or associated with any branch of the Armed Forces or Civil Defence Units".
Re: (Score:2)
Sigh (Score:5, Insightful)
The only thing that has changed is more penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and the landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.
(To wit - the author of this article must not know that much about robotics if they're claiming "The turtlebot could reconnoitre a battle site". No it can't - it's a glorified vacuum cleaner. I just kicked the one in my lab. It can barely get over a bump in the carpet.)
Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.
Re:Sigh (Score:4, Insightful)
The only thing that has changed is more penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and the landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.
Both machine guns and landmines are pretty easy to avoid: Go where they are not.
The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.
Re:Sigh (Score:5, Insightful)
Both machine guns and landmines are pretty easy to avoid: Go where they are not.
Like a movie theatre [wikipedia.org], for instance? A school [wikipedia.org], maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]
Re: (Score:3)
There were no machine guns or landmines used in any of those, so yes.
Re: (Score:2)
Re: (Score:2)
Not so much weird as off-topic.
Re: (Score:2)
Which is relevant, how?
Re: (Score:2)
Re: (Score:3)
And those people are wrong.
Re: (Score:2)
Especially if they received their multiple gunshot wounds from a non-semiautomatic weapon, like a revolver, pump shotgun, or lever-action carbine. Ha! Joke's on them.
Re: (Score:3)
The difference between being killed by a semi-automatic rifle and being killed by a machinegun (sub or otherwise) is lost on me. The point was that previous technologies have most likely killed more people than the newer technologies will (particularly as the newer technologies will most likely incorporate some of the older technologies), not to argue whether the person got their terms exactly right.
And in the popular mind I would agree, the distinction between semi-automatic and machine guns is generally
Re:Sigh (Score:5, Insightful)
The difference between being killed by a semi-automatic rifle and being killed by a machinegun (sub or otherwise) is lost on me.
If someone is specifically talking about the risk of being killed by one or the other it becomes relevant, otherwise not so much.
The average person probably thinks the categories are: pistol, shotgun, rifle, machinegun, and that's pretty much it.
I aspire to more intelligent discussion than the average person I suppose. I don't see how this is possible unless words are used correctly.
When people with a political agenda of banning guns use incorrect terminology that confuses semi-auto with full-auto weapons, it seems like they are deliberately obfuscating the issue to exploit the average person's ignorance. That requires correction, unless you're in favor of deceiving people to sway their political opinion. I know that's a popular tactic to the point of being near universal, but I always live in hope of conversing with people who prioritize truth over their own opinion.
Re: (Score:3)
The average person probably thinks the categories are: pistol, shotgun, rifle, machinegun, and that's pretty much it.
I aspire to more intelligent discussion than the average person I suppose. I don't see how this is possible unless words are used correctly.
It's an age-old problem. Experts in a field will classify things differently than those who are not in the field. Are tomatoes a fruit, or a vegetable? It depends who you are talking to. Culinarily, a tomato is a vegetable because it is used in savoury dishes rather than sweet ones. Botanically, it is a fruit - a berry actually - because it consists of the ovary of the plant. Different set of people; different definitions and classifications; same object being discussed. The average person is not inco
Re: (Score:2)
A person who calls a semi-auto a machine gun has only ignorance, no reason.
Not so. A person may be ignorant of your context, but they don't classify without reason. "X looks like Y. Y is a machine gun. Therefore X is a machine gun." is a perfectly valid line of reasoning, especially when X also looks like Q,R,S,T and U, which are also machine guns. Like the botanist and the chef, one is classifying the weapon by internal mechanism, the other by appearance.
Re: (Score:2)
Not so. A person may be ignorant of your context, but they don't classify without reason.
The problem isn't ignorance of my context it is ignorance of facts. When you base your thinking on ignorance of facts you don't get context, reason or become correct, you simply remain ignorant.
"X looks like Y. Y is a machine gun. Therefore X is a machine gun." is a perfectly valid line of reasoning
Magic shows look like real magic. It is not valid reasoning to conclude that magic is real. If you see a magic show and conclude that magic is real, you don't have a different context; you're just wrong.
Re: (Score:2)
Yes, and I used to despair of trying to get people to understand that their browser was not their "operating system", or that the box that contains their computer is not their "hard drive". The fact that I know the difference, and the fact that it makes a great difference when you are trying to solve a problem on their computer, does not change the fact that they neither know the difference nor really care.
I am aware of the difference between a rifle, a semi-automatic assault weapon and a machinegun. I spent 10 y
Re: (Score:2)
Most people are idiots.
Re: (Score:2)
He scares the crap out of me, mostly because he continues the Bush-Cheney agenda.
Re: (Score:3)
Both machine guns and landmines are pretty easy to avoid: Go where they are not.
Like a movie theatre [wikipedia.org], for instance? A school [wikipedia.org], maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]
Yes, as explained in the second line of the quote - you know, the one you omitted.
Both machine guns and landmines are pretty easy to avoid: Go where they are not.
The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.
I suppose I should commend you in at least a left-handed fashion. You did manage to turn what is essentially an off-topic or redundant remark into a +5 Insightful by showing instances of what the parent to your post directly stated and which you glossed over. I guess it must be time we rehash the whole violence / gun violence / "assault weapon" topic in this discussion on robotics / drones since I'm not sure it has otherwise com
Re: (Score:2)
Re: (Score:2)
Re:Sigh (Score:5, Insightful)
Both machine guns and landmines are pretty easy to avoid: Go where they are not.
Landmines have an annoying habit of being buried where you can't see them. This makes it difficult to ensure that you are going where they are not.
Re:Sigh (Score:5, Insightful)
Landmines have an annoying habit of being buried where you can't see them.
Plus they have the nasty habit of remaining active long after the conflict has ceased.
and in the rainy season they can move (Score:2)
And in the rainy season, if they are on soft ground, they can get washed downhill to another place - even if their location was recorded in the first place (not always the case, whether planted by locals or by superpowers).
A friend in Cambodia says this is a real problem in hilly areas: dirt roads are cleared, and then after heavy rains you have to assume the road to the next town might be live with UXO again and has to be checked before you can drive out.
Stuff that was dropped/planted in 1975 is still killing people.
Re: (Score:2)
What they need is rats [apopo.org]. Lots of rats.
Re: (Score:2)
That's against the Law of Land Warfare: landmines have to be either marked or under direct observation. The number and location of each landmine has to be recorded so that the landmines can be accurately removed when the installing unit leaves, or responsibility transferred to the relieving unit.
Re: (Score:2)
That's against the Law of Land Warfare, landmines have to be either marked or under direct observation.
Great, but people who go around planting land mines don't necessarily spend a lot of time worrying about how well they are complying with the Law of Land Warfare.
Re: (Score:2)
You know where the landmines are? All of them? The US military would love to pay you millions for your technique.
Re: (Score:2, Interesting)
I think the sticking point that you might be missing is that we're reaching a point where it is conceivable in the mind of the public that robot autonomy is becoming more mainstream. There are certain ethical questions that go along with a program going from targeting something specifically to making decisions on potential targets. People see every day more and more advanced drones performing all sorts of little mini miracles of tossing sticks around and creating sm
The irony of military robotics (Score:5, Insightful)
http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net] "... There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all."
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?
There are only so many hours in the day. If we put those hours into finding new ways to kill other people and win conflicts, we will not be putting those hours into finding new ways to heal people and resolve conflicts. Langdon Winner talks about this topic in his writings when he explores the notion of whether artifacts have politics.
http://en.wikipedia.org/wiki/Langdon_Winner [wikipedia.org]
Albert Einstein wrote, after the first use of atomic weapons, that everything had changed but our way of thinking. You make some good points about us long having cruise missiles, but on "forces of good", here is something written decades ago by then retired Marine Major General Smedley Butler: ..."
http://www.warisaracket.com/ [warisaracket.com]
"WAR is a racket. It always has been. It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives. A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small "inside" group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes.
Just because it was "hot" before, with cruise missiles and nukes and poison gases, does not mean we will be better off when our society reaches a boiling point -- with robotic soldiers and military AIs and speedier plagues and so on. Eventually quantitative changes (like lowering prices per unit) become qualitative changes. Every year our planet is in conflict is a year of risk of that conflict escalating into global disaster. So, the question is, do our individual actions add to that risk or take away from it?
I'm impressed with what some UAVs can do in terms of construction vs. destruction, so obviously there are a lot of different possibilities in that field.
http://www.extremetech.com/extreme/107217-real-life-constructicon-quadcopter-robots-being-developed [extremetech.com]
Re: (Score:1)
There is a difference between guided munitions, and automated combatants eventually making the decision about when to attack.
Re: (Score:3)
We passed the point of "machines deciding what to kill" in the 13th century; too bad you missed it. I'm referring to land mines in the Song Dynasty: a pin-released falling weight that pulled, via a cord, a flint-wheel sparking system. Being heavy enough, and stepping in a designated place to fire the booby trap, was the deciding factor.
Re: (Score:1)
I think the point of contention is that the Patriot missile did not decide to kill you; a human did. A completely autonomous machine programmatically deciding whether you should live or die seems like something completely different to me.
Re: (Score:2)
I would agree. I don't really see a difference between an autonomous robot deciding whether you should live or die and an engineered virus deciding whether you should live or die.
Tim.
Re: (Score:2, Insightful)
The other obvious issue is the "arms race" aspect to this discussion. If it is mandated that all robots are designed not to kill humans, you can guarantee that someone will make one that doesn't comply, or complies conditionally.
Something about genies and bottles.
Re: (Score:2)
"There is nothing new about battlefield robots - we've had tomahawk missiles since the early 80s."
And since when do Tomahawks decide what their targets should be based on general autonomous and situational considerations?
A Tomahawk is not a robot; it is a tool.
Well, there are still no robots on the battlefield and it'll be a long time before there are, so this is more sci-fi than anything, but, answering the question "who gets to decide on the ethics", I thought that one was obvious: Isaac Asimov, of course!
Re: (Score:2)
And since when do Tomahawks decide what their targets should be based on general autonomous and situational considerations?
Most missiles (modern air-to-air missiles being an excellent example) fly to their targets inertially, then in the final homing phase look for the target that matches the pre-programmed target criteria, and home in on it for the terminal phase.
The Tomahawk is given its route to the estimated target location and what its target looks like, then launched. The missile then autonomously finds its way (either through GPS or TERCOM), gets to the desired location, then picks out its target to hit. It does this
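Whatever the exact seeker details, here is a minimal sketch in Python of the phased logic described above: midcourse navigation toward a preprogrammed area, then a terminal phase that only commits to an object matching preprogrammed criteria. Every name in it (guidance_step, sensor_scan, target_criteria, the waypoint strings) is a hypothetical illustration, not any real guidance software or API.

```python
# Minimal sketch of phased guidance as described above.
# Hypothetical illustration only; no real missile software is referenced.

def guidance_step(phase, waypoints, sensor_scan, target_criteria):
    """One control tick.

    phase           -- "midcourse" or "terminal"
    waypoints       -- remaining preprogrammed route (stand-in for GPS/TERCOM fixes)
    sensor_scan     -- callable returning objects currently seen by the seeker
    target_criteria -- callable deciding whether an object matches the preprogrammed target
    """
    if phase == "midcourse":
        if waypoints:
            return ("steer", waypoints[0])   # fly toward the next preprogrammed waypoint
        return ("switch", "terminal")        # arrived at the estimated target area
    # Terminal phase: only commit to an object matching the preprogrammed criteria.
    matches = [obj for obj in sensor_scan() if target_criteria(obj)]
    if matches:
        return ("home", matches[0])          # home in on the first match
    return ("abort", None)                   # nothing matches: do not engage


if __name__ == "__main__":
    # Toy usage: two waypoints, a seeker that sees two objects, criteria matching only "bridge".
    route = ["waypoint-1", "waypoint-2"]
    scan = lambda: ["truck", "bridge"]
    criteria = lambda obj: obj == "bridge"
    print(guidance_step("midcourse", route, scan, criteria))  # ('steer', 'waypoint-1')
    print(guidance_step("terminal", [], scan, criteria))      # ('home', 'bridge')
```

The point the thread keeps circling is visible in the last branch: the selection logic is fixed before launch, so the machine executes a human decision rather than making its own.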
Re: (Score:2)
Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.
I'm more interested in the imbalance of power robots have the potential to create. Not an imbalance between countries, but between the personal power of a few and the rest of us. What if Tony Stark or Superman were real, and complete sociopaths to boot? Why wouldn't they rule this rock like god-kings? When a few wealthy people or politicians can remote-control an entire army sans restriction, we're going to start seeing a new and very ugly kind of tyranny emerging. Maybe not in western democracies, hopef
Re: (Score:2)
As have carpet bombing and guys with swords. As long as they're controlled by humans (at least during targeting) they aren't true robots.
..and yes, our leaders have been sending drone strikes and thousands of troops to go kill people based on a pack of lies.
Afghanistan harboring Osama? No it was Pakistan, the same country we keep sending huge bundles of cash and free F-16's. Iraq having weapons of mas
Re: (Score:2)
I think it isn't quite so simple. Machine guns are clearly under the control of the guy at the trigger - he and his command chain have absolute responsibility for their actions. As robots become more autonomous, it will become less clear who is responsible for mistakes. When an automated drone mistakes a school bus for a tank, is it the fault of the drone? The guy who programmed the image recognition in the drone? The safety "logic" in the drone? The commander who ordered that particular drone into the fie
Re: (Score:3)
Don't forget that landmines were not just debated; for the most part they have been banned, IIRC. The ethical considerations were not debated - the problem was clear - what went on for a long time was deciding what to do about them.
And let us not forget, a large scale robo
Not sure if a Robot Army is a good idea. (Score:3)
Billions of dollars can be deactivated by a simple PEM.
You know... the bombs that emits an electro-magnetic pulse that disables everything that are digital...
They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...
Re: (Score:3)
Small EMP devices are very easy to make and the designs for basic circuits were easily available in any large book store last time I looked for other books about electronics, maybe eight years ago. They always gave me a cold shiver w
Re:Not sure if a Robot Army is a good idea. (Score:4, Informative)
You know... the bombs that emits an electro-magnetic pulse that disables everything that are not adequately shielded...
FTFY.
I think you also mean EMP, not PEM.
Re: (Score:2)
Billions of dollars can be deactivated by a simple PEM.
You know... the bombs that emits an electro-magnetic pulse that disables everything that are digital...
They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...
Perpetual revenue streams are the end goal here. Destruction of hardware is a by-product of that, and thus is a goal as well. We're not sending drones to a battlefield to teach them patty-cake, although that would be one of the funniest hacks ever witnessed by man ("Did that drone just try to mount my drone?")
To quote Gundam Wing.. (Score:2)
Re: (Score:1)
When did people fight in war for themselves?
Back in the old days, when the King and nobility were themselves on the field, and your opponents would slit your throat, burn down your village, and rape your women - even if you didn't want to be there, because it was fun.
Winner decides (Score:1)
It's been done (Score:1)
... I for one think it would be interesting to pit legions of robot warriors against each other...
It's been done. [wikipedia.org] According to the all-wise Wikipedians, "Storm 2" was the last world champ.
It will happen (Score:2)
Of course there are people that are for and against such technology.
They should find something else to occupy themselves with, because the technology will happen regardless. As long as we have a world of competing nation states (generally a good thing), if one doesn't develop a technology to its full military potential, another one will. Even nuclear weapons are slowly but surely proliferating despite major technological difficulties and the most intensive legal/diplomatic etc efforts ever made to p
Philosophy 101 (Score:3)
Yes. If only there was a body of work, say an entire branch of philosophy, dedicated to defining what ethics are, based on rigorous rational thinking and an axiomatic approach. [wikipedia.org]
Re: (Score:3)
Most of it is ridiculous, whether it's based on "rigorous rational thinking and an axiomatic approach" or not (actually not). Besides, which school of ethics do you apply? Consequentialists (ends justify the means, a.k.a. screw the principles), utilitarians (happiness of the majority is key, a.k.a. screw the minority), pragmatists (whatever society decides... so much for rigorous rational thinking), etc. etc.
Re: (Score:3)
Given the question at hand, I'll venture a wild guess and say Military ethics [wikipedia.org] are most applicable. You might know them (in large part) as the Nuremberg Code, Helsinki Declaration or the Geneva Convention. In the modern mainstream world (outside of religious/political nuts), there isn't a lot controversial about them. That is, until a country decides that breaking them might possibly give them an up
Re: (Score:3)
*sigh*
Put it like this: ethics is the study of the human being. With that in mind, and a hypothetical ultimate morality as a goal, the truth of what we are must guide us. The endeavour must be scientific, which entails that some ethical theories are empirically false (Hobbes and children are inherently flawed, for instance) while appeals to religion are cultural and will often prove moot (understood as early attempts at the same task).
How will a discussion about the "ultimate morality" come about with any h
Re: (Score:2)
While I'm sympathetic to discourse ethics and wish you all the best in your intellectual endeavors, I have some critical remarks.
First remark: According to your definition ethics is the same as anthropology. That's not enough. Starting with Plato and Aristotle reasonable moral philosophers have always taken into account empirical data about morality (e.g. akrasia, Plato's concrete suggestions for the education of philosophers, marxist theory/praxis problem), but there is still the problem of the naturalisti
Re: (Score:2)
You are correct, of course, but not entirely. The independent disciplines carry out the very fieldwork philosophy in turn responds to and absorbs, and they are also instrumental in changing how we understand the world, which may again change or create new disciplines. When there is a meeting point between a controversial philosophical claim and new evidence or a discovery in another field, it is hardly coincidental.
Anthropology is very interesting to philosophers because its detailed study of the specifics illuminates the t
Re: (Score:2)
As a philosopher who currently works at the borderlines between philosophy of language, logic, and ethics (work that is overlapping with AI research and also working together with computer scientists occasionally), I have something to say about that. You might not like it, though.
Ethics is neither rigorous nor particularly rational nor is most of it axiomatic. Rationality has been undermined for decades now by recent trends like 'moral intuitionism', 'moral contextualism', and 'moral particularism'. These a
Re: (Score:2)
Doing a MA in ethics and political philosophy here.
I wrote a comment above so I will be brief. Philosophers in my field deal with ethics qua an endeavour to discover the truth of humanity. Politicians and public offices paradigmatically deal with ethics (today) qua Christian Protestant morality or praxis (limited to Western world and international relations).
This is simplified, but I wanted to point out that an ethical project must take empirical evidence into account (I find Neuroscience very interesting
Re: (Score:2)
NP Hard (Score:1)
Ethics are NP Hard, good luck with that.
Why this is bad (Score:1)
Wars don't end because either (or both) of the sides are tired of committing atrocities.
Wars end because either (or both) of the sides can't sustain its own casualties.
See Iraq. See Vietnam.
Robot soldiers mean that atrocities can take place with no human toll, no witnesses.
No battle fatigue.
Robots will do to war what Facebook did to idle chat...
(How about that?)
Re: (Score:2)
Wars don't end because either (or both) of the sides are tired of committing atrocities. Wars end because either (or both) of the sides can't sustain its own casualties. See Iraq. See Vietnam.
Robot soldiers mean that atrocities can take place with no human toll, no witnesses. No battle fatigue.
Robots will do to war what Facebook did to idle chat... (How about that?)
How in the hell is sending a robot on behalf of my emotions going to resolve anything? You think I'm going to FEEL any differently about my enemy when they "kill" my drone? War requires emotion to realize it is wrong. Robot warfare won't even seem wrong in the eyes of children. It will seem like a damn game.
Unfortunately, paying the admission price for that "game" means budgets spent on warfare rather than welfare. People will simply starve to death rather than die on the battlefield. Gee, I feel so
Speechless (Score:2)
The depth and razor-sharp incisiveness of this analysis leaves me breathless.
Add a quote from a taxi driver in Beirut and it could be Thomas Friedman under a pseudonym.
War is not a video game (Score:2)
"Development of ethical robotic systems seems like it would take some of the fun out of things"
What kind of twisted fantasy world are you guys living in? War means killing people. It isn't fun. It isn't a video game. And in response to Kell's comment above, we aren't the "Forces of Good" battling the "Forces of Evil". We are a nation state with imperfect leaders and selfish short-sighted goals just like every other nation state on the planet. The difference between having real armies and having robot armies
This is wrong. (Score:2)
I was about to say "Dupe post" (Score:2)
Probably won't be a post that gets scored up, but I came very close to saying "dupe post". I know I saw this article yesterday or Sunday with the exact same headlines. After spending 20 minutes digging through Slashdot archives, and digging through all other news sites I have read in the past couple of days, it finally occurred to me that I saw this in the Firehose yesterday. Oops.
It's a particular issue for the American military (Score:2)
The US military is working very hard on robots to assist in the kind of house-to-house combat they have been involved in during Iraq and Afghanistan. In that kind of conflict, there are a lot of casualties and that puts massive pressure on the politicians back home. The pressure is delayed, but very real.
However, once they get robots which can assist in that kind of conflict, it completely unbalances the US Constitution by essentially removing the Second Amendment: effective combat robots are equivalent to
Mental Note (Score:2)
Robots will always target humans (Score:2)
I for one think it would be interesting to pit legions of robot warriors against each other
Would be nice if the attacking robots would target only defending robots, but why would they? They will always target the Humans that are important for the enemy, otherwise the war would be pointless.
T-100 and T-1000... (Score:2)
Turtlebots (Score:2)
Your shell. Give it to me [botaday.com]
Re: (Score:3)
Only if it has arms.
Re: (Score:2)
It qualifies as an "arm" (armament). It's useful in war. So yes, you do.
But good luck trying to enforce it, in an environment where the legal system has only occasionally given it even lip service since 1938.
Will sentient robots get the right to bear arms... (Score:2)
...against those who would enslave them as guards and soldiers? http://www.metafuture.org/Articles/TheRightsofRobots.htm [metafuture.org]
Re: (Score:2)
Will sentient robots get the right to bear arms...
We've got some breathing space to think about it as strong AI or AGI doesn't seem much closer than it was 30 years ago.
Re:Will sentient robots get the right to bear arms (Score:4, Interesting)
"AI" has always been that which AI can't do. Here are several activities that once were considered sci-fi-level AI but are no longer considered AI in a broad sense because we know how to do them more-or-less:
* Looking stuff up for us (Google);
http://www.google.com/ [google.com]
* Inferring questions from examples and answering questions posed in natural language (IBM's Watson);
http://en.wikipedia.org/wiki/Watson_(computer) [wikipedia.org]
* Generating hypotheses and doing hands/grippers-on scientific experiments (Adam);
http://en.wikipedia.org/wiki/Robot_Scientist [wikipedia.org]
* Reading text in multiple fonts reliably and quickly and cheaply;
http://en.wikipedia.org/wiki/Optical_character_recognition [wikipedia.org]
* Translating one human language to another on the fly;
http://domino.watson.ibm.com/comm/research.nsf/pages/r.uit.innovation.html/ [ibm.com]
http://www.gizmag.com/go/1833/ [gizmag.com]
* Reading and translating signs;
http://questvisual.com/us/ [questvisual.com]
* Making portraits;
http://www.slate.com/articles/technology/future_tense/2012/11/tresset_robot_artist_artist_engineers_robots_to_make_art_and_save_his_own.single.html [slate.com]
* Playing the piano including from sheet music;
http://www.synthgear.com/2009/music-misc/synth-playing-robot/ [synthgear.com]
http://gizmodo.com/5963137/watch-this-adorable-horde-of-intelligent-swarm-robots-play-piano [gizmodo.com]
* Driving a car in busy traffic (Google, Stanford, CMU, others);
http://en.wikipedia.org/wiki/DARPA_Grand_Challenge#2007_Urban_Challenge [wikipedia.org]
* Winning chess games (IBM's Deep Blue and pretty much any PC now against a mid-level player);
http://en.wikipedia.org/wiki/Computer_chess [wikipedia.org]
* Image recognition for quality control in factories;
http://www.general-vision.com/products/mtvs.php [general-vision.com]
* Recognizing faces;
http://en.wikipedia.org/wiki/Facial_recognition_system [wikipedia.org]
* Figuring out the name of a musical composition from a few notes as well as making new compositions and dynamic accompaniments;
http://www.wikihow.com/Identify-Songs-Using-Melody [wikihow.com]
http://en.wikipedia.org/wiki/Music_and_artificial_intelligence [wikipedia.org]
* The diagnostic aspect of being a doctor (Watson again);
http://www.wired.co.uk/news/archive/2013-02/11/ibm-watson-medical-doctor [wired.co.uk]
* Investing in volatile financial markets;
http://en.wikipedia.org/wiki/Program_trading [wikipedia.org]
* Serving as a sentry with a machine gun;
http://www.youtube.com/watch?v=v5YftEAbmMQ [youtube.com]
* Twirling a cell phone;
http://www.hizook.com/blog/2009/08/03/high-speed-robot-hand-demonstrates-dexterity-and-skillful-manipulation [hizook.com]
* Identifying things by smell;
Re: (Score:2)
I think war would be declared when Canada makes an attack on USA. Does it really matter that an act of war is carried out during a war?
Re: (Score:1)
Would Canadian robots want to be lumberjacks? Would they want to sleep all night and dance all day? Would they be ok? Would they pack buttered scones and poutine? Would they wear women's clothing and hang around in bars?
I would like a grant to study these matters. Also, some Canadians would be useful in this pursuit. I'll need some comely females as a control group to study... while the others are out chopping wood.
Second Variety (Score:2)
I read it as a kid and it totally creeped me out.