Social Robots May Gain Legal Rights, Says MIT Researcher
dcblogs writes "Social robots — machines with the ability to do grocery shopping, fix dinner and discuss the day's news — may gain limited rights, similar to those granted to pets. Kate Darling, a research specialist at the MIT Media Lab, looks at this broad issue in a recent paper, 'Extending Legal Rights to Social Robots.' 'The Kantian philosophical argument for preventing cruelty to animals is that our actions towards non-humans reflect our morality — if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions. Granting them protection may encourage us and our children to behave in a way that we generally regard as morally correct, or at least in a way that makes our cohabitation more agreeable or efficient.' If a company can make a robot that leaves the factory with rights, the marketing potential, as Darling notes, may be significant."
nonsense like this (Score:2, Insightful)
makes me want to damage social robots to prove a point
pandering to morons.....
Re: (Score:2)
Only bicentennial robots will get that privilege.
Re: (Score:2)
Re: (Score:2, Insightful)
I don't get why we should be moral to robots, which can be replaced without any loss, while at the same time we keep great apes, which are in all ways similar to humans, in cages.
They use tools, they speak, they can learn to read and write, they understand abstract concepts, they have memories, they mourn their dead, they cry, they can be angry, happy and sad.
Yet we are allowed to limit their freedoms, take their homes and use them for medical trials. Why?
Because of anthropocentric arrogance. There is
Re: (Score:2)
I can see the bumper sticker now: "I'm a gorilla... and I vote!"
Re: (Score:2)
So they should have rights because they're similar to us, yet you accuse others of anthropocentric arrogance?
Re:nonsense like this (Score:4, Insightful)
No, he accused us of double standards. His comment was that because higher primates demonstrate higher thinking, sentience, rationality, communication and abstract thought, it should be morally reprehensible to use them in an inhumane fashion. The fact that people have used, and do use, one another that way sort of puts a finer point on his argument.
We are a moral species that commits immoral acts. In an attempt to broaden moral practices, do we include other species and even machines as surrogates for human beings, to make certain we've bred moral behavior into the populace? This is a valid and important question. If we become emotionally attached to our machines (you guys know who you are... beware metal fever), does it behoove us to treat them with respect, up to and including rights, to ensure that we treat one another with that same level of dignity? Seems like a long way to go just to assure that we behave like higher life forms, but I'd consider it if it improved human behavior. In the end, I don't know what it will take to get people to behave, but I'm open to ideas.
Re:nonsense like this (Score:5, Interesting)
As long as it's your robot, you can part it out to your heart's content.
But if I send my robot down the street to get groceries, I don't want someone yanking its memory modules or salvaging its servos just because it was running around loose and had no feelings.
We really don't have many laws that cover a device that runs around in public spaces doing errands and perhaps spending money (digital or otherwise).
Yes, it's property, and my property rights may still apply, but I'm not sure that's enough to prevent someone from declaring it abandoned property and parting it out on the spot.
There are more imminent questions that need to be answered:
Are they licensed like cars to be in public spaces? Can they carry and spend money? Carry weapons? Plug in and recharge when they need to? Be searched by police at will? Will police disable and memory-strip my Asimo [honda.com] just because it might have recorded a police beatdown while passing a dimly lit alley?
Re: (Score:2)
Existing law already covers the situation where someone's property is vandalized or stolen. "Rights" is not a relevant question at all; it's just a machine. It can't feel pain or terror as even a small animal can.
Re: (Score:2)
Vandalized or stolen doesn't come close to covering it.
Can it be searched? It's in plain view. It has no expectation of privacy. Will police "taps" be allowed?
Is a warrant needed?
Can I clone its memory cards if it walks across my property? Is it mine if it walks across my property?
How does anyone know it's not a walking bomb?
Can shopkeepers refuse it access, or does it have implied "rights" of access like service animals in service to their owners?
Re: (Score:2)
The cops will say 'no' but the Constitution says 'yes',
yes, no,
take it apart,
yes, easily remedied with a HERF pulse.
Yup, that 'bout covers it...
Re: (Score:3)
Can you point out where in the constitution it says they need a warrant to search my robot that was walking around in public?
Fourth Amendment [wikipedia.org] you say?
Maybe you should read up a little bit. If they can stop and frisk YOU for nothing more than walking while black, they can certainly detain and drain your robot.
Re: (Score:2)
Define pain.
Re: (Score:2)
We already have laws for that. What you complain about is covered by property rights. Such machines 'might' require licensing (though I'm not a fan of this, because license fees quickly become expensive taxes), but otherwise, law enforcement stripping your 'robot' is no different than stripping your laptop or any other digital device. This would only be yet another example of a much larger problem that exists right now in reality-land.
Bottom line: We don't need idiotic 'machine rights' to soothe people who
Get it in now (Score:2)
http://www.youtube.com/watch?v=g_hF_RhD-xE [youtube.com]
You may not be able to do this in 10 years...
Pets have rights? (Score:2)
Re: (Score:2)
Re: (Score:2)
Animals don't have rights.
There are laws that forbid cruel and unusual treatment of animals and there are various property rights that an owner has regarding pets or other animals that he owns.
Re: (Score:3)
What's the difference between a law that protects animals from cruel and unusual treatments, and a right of not being subjected to such treatments?
Re: (Score:3)
Re: (Score:2)
Violence is okay if you call it an educational slap. Make sure to give a few to your wife, too. Educational slaps all around!
Re: (Score:2)
oh please.. give me a break. a spanking is not a beating.
Re: (Score:3)
knock it off with your disingenuous rubbish. Your pantywaist bullshit has created, now, several generations of overly sensitive cowards who can't handle direct confrontation. lack of this basic skill is what's causing americans to lose their liberties for the sake of $GROUP_X's feelings.
Kant's argument (Score:4, Insightful)
Kant's argument is pretty unfashionable these days, since it rejects the idea that animals have rights for their own sake. It's still the best one, IMO, but good luck selling this to university ethics departments.
Re: (Score:2)
Kant's argument ignores the fact that neurologically pain is essentially the same process in any mammal. We don't protect animals because of some selfless altruism towards the inhuman. We protect animals because we recognize that they are like us.
Robots are people, my friend. (Score:5, Funny)
... says the Roboplican nominee.
Oh yes (Score:2)
Re: (Score:2)
Aperture (Score:2)
Rest assured that all lethal military androids have been given a copy of the Laws of Robotics. To Share.
Give robots rights, but when lethal military androids are built, I'd rather have them bound to the Laws of Robotics.
Re: (Score:2)
Yeah, like that's going to work. Imagine a robot making a morality judgment. We can't seem to stop killing each other, even with cool automation.
The programmers should assume the liability. Robot shoots person. Robot goes to jail? Is that threat going to stop a robot? Instead, put the programmer that wrote the code to jail, for he/she is as complicit as a human in its place.
Imagine: Windows Robots. Debian Robots. With Guns.
Re: (Score:2)
Why the programmer? Why not the general who commands them? If you punish the programmer, what about the engineer who designed the 'bots, or the manufacturer? This would be the same as a gun: you don't sue Winchester for shootings, you prosecute the shooter.
Re: (Score:2)
In war, the general is responsible. He "pulled the trigger". If it's automated, then the individual endowing logic "pulls the trigger". If there's a hell, a special place is reserved for weapons makers.
In the case of Winchester, they're enablers, but under the law, aren't liable. But if Winchester builds a bot that shoots a harmless person, then it's manslaughter. If it targets, for instance, a person with a mustache, then it's murder. If it defends itself, then it's murder, because "itself" isn't a human.
Re: (Score:2)
Why the programmer? Why not the general who commands them? If you punish the programmer, what about the engineer who designed the 'bots, or the manufacturer? This would be the same as a gun: you don't sue Winchester for shootings, you prosecute the shooter.
The general pushing the button is doing the programming. Sure, his program might consist of a single "kill all" command, but it's still a program the robot is following.
Re: (Score:2)
What kind of morality judgement would be needed? "Do not harm humans" and "Do what humans say unless you would have to harm one" sound pretty simple to me.
Re: (Score:2)
Then there's that Zero Law: 0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
Our computers already do this. The box is open. Name me a weapons system that doesn't have a microprocessor inside.
Do robots ask if the general's judgment is correct? Is this war justified?
My preference: robots aren't human, and deserve no protections. They're not sentient. They are metal and polymers and guts driven by programs written by humans and executed by individuals.
Those that wrote the code
Re: (Score:2)
Do robots ask if the general's judgment is correct? Is this war justified?
For the love of $DIETY, let's hope not. [wikipedia.org]
Yeah. That movie was terrible. Ask a silly question, get a silly movie.
Re: (Score:2)
The fact that "Do not harm humans" is, in fact, a surprisingly non-simple command (and that's not even considering the "or through inaction allow a human to come to harm" subclause, which is a whole 'nother can of worms) is the whole driving force for like 95% of Asimov's Robot short stories...
Meanwhile, yes, until we actually do have robots of sufficient, nondeterministic programming complexity that you might actually imagine they're really intelligent and self-aware, which is unlikely to be an
Someone read I, robot? (Score:3, Interesting)
OR WHAT THE FUCK?
it's scifi nonsense better left for fiction for now.
"Patrick Thibodeau is a senior editor at Computerworld covering the intersection of public policy and globalization and its impact on IT careers. He also writes about high performance computing, data centers including cloud, and enterprise management. In a distant life, he was a weather observer in the Navy, a daily newspaper reporter, and author of a book about the history of New Britain, Conn." He also likes to write bullshit articles and somehow tie Apple into them. Who am I kidding, it's Computerworld - it's nothing but bullshit.
first make the goddamn cognitive robot that can feel pain, then we'll talk. can your car feel pain because there's a bit counter for faults in it? it can't. once the robots can make a compelling argument that they're cognitive then we're living sci-fi future and can look at the issue again. doesn't this jackass understand the huge leap from simple algorithms in siri to true AI ? why the fuck would you make your robot cognitive to the point that it matters if it has rights even if you could - for sadistic reasons? in which case you certainly wouldn't give it any rights.
next up the movement for rights of rocks - because rocks might have feelings too you know..
Re: (Score:3)
next up the movement for rights of rocks - because rocks might have feelings too you know..
I'm an animist you insensitive clod!
Re:Someone read I, robot? (Score:4, Insightful)
next up the movement for rights of rocks - because rocks might have feelings too you know..
I'm an animist you insensitive clod!
Ironically, calling somebody an insensitive clod is offensive and mineralist. Why can't rocks, clods, and earthy lumps of all shapes and colors just get along?
don't get touchy feely because it acts human (Score:5, Insightful)
Anthropomorphizing a machine because it mimics human behavior and then using that to justify giving it rights is a poor idea.
At some point in the distant future, when we arrive at the 'blade runner' level of replicant, then the issue can be picked up again. But don't put the cart before the horse.
Re: (Score:2)
The replicants were grown and assembled from parts, but still biological and presumably with much human DNA, so replicant rights I see as a different issue.
So I'll be buying non-social robots (Score:2)
I don't want to have to respect the rights and feelings of my vacuum cleaner, trash disposal, meal preparer, or grocery shopper. If these devices are designed and built for a purpose, they should make my life easier.
If I specifically want a butler type robot that caters to my needs and needs higher level functions, maybe I'll be ok with social robotics, so long as he keeps the secret that I'm batman.
What about the robot you keep around that sits on your couch and loses at madden/halo/callofduty to make you fe
Click-bait (Score:4, Insightful)
This looks like click-bait, but I just can't help myself.
In our capitalist society, robots already have limited rights by virtue of the fact that they're private property and they're still going to be expensive (for a little while at least). That fact alone gives more protection to robots than most dogs, from outsiders who may want to harm our pets, or damage our robots.
And I don't see a law protecting a robot from its own owner anytime soon. Cruelty to a robot is not even going to be considered an issue. Now, if we're talking about a visually impaired person having his prosthetic camera-eye forcibly ripped out of his head, then yes, that would be hellishly cruel, but cruel to the visually impaired person, not necessarily cruel to the tool.
Re: (Score:2)
robots already have limited rights by virtue of the fact that they're private property
Those "rights" are the owners', not the robots'.
Kant (Score:2)
"The Kantian philosophical argument for preventing cruelty to animals is that our actions towards non-humans reflect our morality — if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions."
Is Kant's argument actually the basis for why our society recognizes some rights of animals? Probably not. Thank you for overlooking the far more compelling arguments of Descartes, Locke, Rousseau, Bentham, Martin, Schopenhauer, Darwin, Cobbe, K
Too pedantic (Score:5, Insightful)
The only way I see a robot needing some legal rights is if some system becomes self-aware and wants to walk around inside a robot body.
Re: (Score:2)
Sounds to me more like a reason to kill it.
Why would I have to bend for the robots? (Score:2)
I don't want robots that I can't disassemble completely without emotional distress. And if such robots ever do come, the solution is not fixing the laws but fixing the machines.
Uh... (Score:2)
if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions.
Except the logic of that first sentence is wrong. Inhumane people treat animals inhumanely. The treatment does not CAUSE the inhumane person (yes, inhumane treatment OF a person often produces an inhumane person, but you know what I mean). Yes, we can make laws to stop people from ACTING inhumanely, but they will still be inhumane people, and once they think/know they can do inhumane things without getting caught, they will do so.
Really I think the best we can hope for is that these inhumane people do t
We All Know How This One Ends (Score:2)
It's a machine... (Score:3)
If I buy a car, I can take it home and legally pound on it with a sledge hammer, cut it up with a blow torch, use it for target practice, etc. I could not legally do this with a pet because of animal cruelty laws.
Why should a robot be different than any other machine?
Rule 34 Stross and laws and CP (Score:2)
Nobody mentioned Rule 34 by Stross and the product the toymaker was making/selling?
We're going to have laws long before we have rights, and the laws are going to be things like banning virtual/simulated CP. Long after it's all "ruled" and "regulated" and "lawed" up, maybe we'll begin to debate rights.
Have to agree with the No F'n Way crowd (Score:2)
LBGT people (Score:4, Insightful)
Social is not the issue. Autonomous mobile is. (Score:5, Interesting)
The cited article is rather lame. But there's a real issue here that we're going to reach soon. What rights do mobile robots, like self-driving cars have?
As a practical matter, this first came up with some autonomous delivery carts used in hospitals. Originally, they were programmed to be totally submissive about making people get out of their way. They could be stalled indefinitely by people standing and talking in a corridor, or simply by a crowd. They had to be given somewhat more aggressive behaviors to get anything done. There's a serious paper on this: "Go Ahead, Make My Day: Robot conflict resolution by aggressive competition (2000) " [psu.edu]
Autonomous vehicles will face this problem in heavy traffic. They will have to deal with harassment. The level of aggressive behavior that will be necessary for, and tolerated from, robot cars has to be worked out. If they're too wimpy, they'll get stuck at on-ramps and when making left turns. If they're too aggressive (which, having faster-than-human reflexes, they might successfully pull off), they'll be hated. So they'll need social feedback on how annoyed people are with them to calibrate their machine learning systems.
I don't know if the Google people have gotten this far yet. The Stanford automatic driving people hadn't, last time I checked.
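The calibration idea in the parent comment (push assertiveness up when the robot keeps getting stuck, push it down when humans get annoyed) can be sketched as a toy feedback loop. Everything below, the function name, the feedback signals, and the constants, is purely illustrative; it is not from the cited paper, nor from Google's or Stanford's actual systems.

```python
# Toy sketch: a robot car tunes a single "assertiveness" knob from two
# feedback signals: how often it stalls (too timid) and how annoyed
# people report being (too aggressive). All names/numbers are made up.

def update_assertiveness(level, stuck_rate, annoyance, lr=0.1):
    """Nudge assertiveness up when the robot stalls too often,
    down when humans complain, clamped to [0, 1]."""
    level += lr * (stuck_rate - annoyance)
    return max(0.0, min(1.0, level))

# Simulated calibration: the robot starts timid, stalls a lot, and
# grows more assertive until rising complaints balance it out.
level = 0.1
for _ in range(50):
    stuck_rate = 1.0 - level   # timid robots stall more (toy model)
    annoyance = level ** 2     # aggressive robots annoy more (toy model)
    level = update_assertiveness(level, stuck_rate, annoyance)

print(round(level, 2))  # settles at an intermediate equilibrium
```

The point of the sketch is only that the equilibrium comes from the social feedback, not from a hand-picked "politeness" constant, which is roughly the calibration problem the comment describes.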
Experiments (Score:2)
So does this mean I'll need to stop experimenting on robots?
Re: (Score:3)
Re: (Score:2)
And then there is also corporations established by the social robots. I wonder how soon these would get special rights too...
Special? They can probably already donate to elections .. what's your definition of catastrophe?
Re:No. No. Fuck no. (Score:4, Interesting)
Re: (Score:3)
too late for A.I. Gore but maybe in time for Mitt Robney!!
PS for those about to be pissed off... its a joke
Comment removed (Score:5, Insightful)
Re:No. No. Fuck no. (Score:5, Insightful)
Nice post you wrote there. I think the same.
It's really idiotic that our productivity has never been so high, yet employed people are being forced to work more and more hours but getting paid less and less. Meanwhile there is a growing mass of unemployed and the social benefits are being cut really hard throughout all the developed world. It just doesn't make sense and it can't end well. It's unsustainable.
It all comes down to the ownership of the means of production. If they're privately owned, the robots will work to make a few people very, very rich while the rest of the populace lives in a Mad Max wasteland. If the ownership is socialised, the benefits of the increased productivity will be shared by everyone. People can work fewer hours, or not work at all, and still be able to make a living.
And before people start ranting about "nobody will want to do anything, then", I call your attention to some examples: In Ancient Greece, the slaves did all the work. That's when unoccupied citizens created Democracy and made great advancements in human knowledge, like Mathematics and Philosophy. Most of the great scientists and philosophers up until the 20th century were rich heirs who had nothing to do and didn't want to manage their family businesses. They ended up advancing human knowledge. Sure, when machines do all the work, lots of people will choose to indulge in sex and drugs, but lots of them will work hard in whatever field they like, be it philosophy, politics, history, science and technology, medicine, arts, music, whatever makes us humans and not just animals. And they will compete among each other, not for a piece of bread, but for glory, fame, babes, whatever. Merit will not suddenly vanish.
Re: (Score:3)
If we are gonna avoid major wars and upheaval we are just gonna have to accept the fact that many individuals being born now, and I would argue quite a few living right now, will have to be paid to not work for the rest of their lives. Not because they are lazy or don't want to work
As someone who is lazy and doesn't want to "work" this can't come quick enough for me, but sadly I don't think it's going to be before I retire in twenty years or so.
The problem is, of course, that it will require a move towards some form of communism/socialism and I can't see that happening in places like the US without a serious revolution.
It would just be nice if we could learn from the lessons of history and prepare for the future in a more orderly and planned way, but this is almost certainly wis
Re: (Score:3)
Exactly, until such time as Robots have consciousness and feel pain from abuse, there is nothing inhumane about damaging a robot.
Now, you might have violated someone else's property rights by doing so, but if you own the robot, then there's nothing morally reprehensible about robot death matches.
Re:No. No. Fuck no. (Score:5, Interesting)
The problem is defining 'consciousness' and 'pain'. There's already a robot that can sense damage to its body [cornell.edu]. Is that pain? If not, why not?
Re: (Score:3, Interesting)
If I can reprogram it to "forget" that it ever happened, did it really happen?
Re:No. No. Fuck no. (Score:5, Insightful)
If a person suffers from late stage Alzheimer's, is it OK to beat them up?
Re: (Score:2)
They do remember it, just not well. You can never reliably restore them to the state before you beat them up (I am talking mentally, not physically). For my robot, I can reliably restore it to the last known snapshot.
Re: (Score:3)
So if we could reliably restore their memories, it would be OK?
Re: (Score:3)
If a person suffers from late stage Alzheimer's, is it OK to beat them up?
If you're a dominatrix and they are your paying customer, then I would say, yes, it is OK to beat them up. Depending on your religion/moral/social beliefs you may disagree with this example, but it's nonetheless accurate.
Context is everything in a discussion like this. The question that needs to be asked is: does the entity, human or robotic, want to be beaten up? Or rather, in the case of its not making the decision, would beating the entity up violate the entity's desires? ie, that it doesn't want to be
Re: (Score:2, Insightful)
Ok, I'll bite.
Humans, unlike corporations, are people. They live, they die, they experience fear, and repeated pain has crippling side effects that we all bear the cost of.
My Volt has detected my damaged trunk latch for thousands of miles. It's just a reading from a sensor that registers a different color for me to see and changes the car's behavior (it won't autolock when the trunk is unlatched). I will fix it at the next payday. No one who observes this behavior will "feel empathy for the car's pain"
Re: (Score:2, Informative)
You haven't "bitten", since you didn't answer my question.
But in any case, just because it doesn't work for your car doesn't mean it doesn't work for anything. TFA is about social robots, not dumb cars.
Take the Tamagotchi; even for a crude device like it was, back then there were many kids (and some not-so-kids) who felt a real emotional connection to the machine. In fact, during v1 - when the thing couldn't be paused - there were actually people making money by babysitting them.
Now, you may think that's ju
Re: (Score:2)
Anthropomorphizing machines will not enable you to create better policy than any other fantasy belief system.
Besides, they really hate when you do that!
Re: (Score:2)
Re: (Score:2)
Well, pain is a pretty effective evolutionary development, overwhelming immediate aversion to damage, followed by a lingering reminder that a part was damaged.
Re: (Score:2)
See subject.
"Bite my shiny metal Like!"
why not? (Score:2)
Re: (Score:2)
Its certainly damaged my respect for MIT.
Re: (Score:3)
"if we treat animals in inhumane ways, we become inhumane persons. This logically extends to the treatment of robotic companions"
Non sequitur, surely.
Re: (Score:2)
Only those who fight for their rights have rights. Ever try to put a cat in a cage it didn't want to go into? That cat has rights. Robots that are not programmed to know about rights, and therefore cannot fight to protect them, do not. If cows were to start fighting for their rights, then we'd need to find another food source. :)
Re: (Score:2)
Re: (Score:3)
Only those who fight for their rights have rights.
Not really. Babies and mentally or physically impaired people also have rights.
Robots that are not programmed to know about rights, and therefore cannot fight to protect them, do not.
So if someone programs a robot to lecture you on rights and get real nasty if you don't agree with it, it does have rights? Sounds like just a crazy machine to me and an inhumane disconnection might be in order.
Also, there are many examples of groups of people who are fighting for rights and not getting anything. In the least robot rights should be considered only when human rights are properly figured out.
Re: (Score:2)
Re:Do beef cows have rights? (Score:5, Insightful)
That's an interesting view, but not shared by society; otherwise small children wouldn't have any rights.
Re: (Score:2)
children, mentally/physically impaired people, etc. inherit the rights that we "humans" have fought for. They are after all human as well. "Pets" have collectively found some rights - mostly via compassion (that's a Good Thing!) and some by the fact that they fight to avoid doing things they don't want to do.
As for whether society agrees with this, ask women's rights activists, gay rights activists, the slaves, etc. In the end, they all had to stand up and fight for their rights in one way or another (ph
Re: (Score:2)
Why the hell is this scored 0? It is a damn good question!
Re: (Score:3)
Because many Slashdotters have a kneejerk reaction against anything that sounds remotely vegetarian or vegan, even if it's pertinent.
Re:Do beef cows have rights? (Score:5, Insightful)
They certainly don't have as many rights as horses, or house cats, or puppies, do they?
And that is exactly the point made in the article: that robotic "companions" may eventually be granted this additional protection, not because they are fundamentally different from a Roomba or a toaster oven, but because WE attach to them in a fundamentally different way than we would a Roomba or a toaster oven - we anthropomorphize them and project emotional and mental states onto them; we grow attached to them, and in some way, extending legal protections to them is a concession to OUR OWN emotions FOR the other thing, more than any inherent quality of the thing itself.
Re: (Score:2)
They certainly don't have as many rights as horses, or house cats, or puppies, do they?
You have obviously never yourself stolen/killed/injured/otherwise damaged a cow you did not own, nor known another person who has.
Re: (Score:2)
Robots don't have the capacity to suffer? Neither do corporations, but they're "people, my friend." I wouldn't put it past SCOTUS or some other branch of the US government to grant rights like this. Stranger things have happened.
Re: (Score:2)
Re:what if i wanna take it apart? (Score:5, Funny)
Re: (Score:3)
Re: (Score:2)
Oh no, he got it square in the front bumper... ran right over that puppy.
Re: (Score:2)
That's where the EULA kicks in. That shiny new self-mobile computer you just bought? Well, it has rights, but you only have a (very) limited license to use it. Try to do something outside of what's specified in the license, and off to the corporate prison complex with you. You lose your rights, and (what you thought was) your property gets licensed out to the next John to continue generating income for its real owners.
Re:what if i wanna take it apart? (Score:4, Interesting)
Generally the execution of animals is totally acceptable, it's primarily torture and torturous environments that are not, and even then, mostly if people can see it.
Putting a pet to sleep (even with a home brew method) is pretty much completely legal (in the US). Certain types of competitive breeders cull well over 90% of their stock.
Re: (Score:2)
"but your honor, it was an S&M bot. It was screaming "don't stop! don't stop! ..."
Re: (Score:2)
Thanks, now I'm imagining "Molon labe!" being said in robot voice.
Re: (Score:2)
Johnny5 is finally safe from the evil sadists who would say he isn't alive!
Amazing that I had to scroll to the bottom of the page to get to the inevitable Short Circuit reference...