Ethical Questions For The Age Of Robots 330
balancedi writes "Should robots eat? Should they excrete? Should robots be like us? Should we be like robots? Should we care? Jordan Pollack, a prof in Comp Sci at Brandeis, raises some unusual but good questions in an article in Wired called 'Ethics for the Robot Age.'"
Ethical Questions (Score:4, Insightful)
In fact, a lot of the potential problems he alludes to seem to stem from human fears about things humans can or have done to each other in the past. I think that what we really need to be concerned about is creating a new form of "life" that is too much like us without the knowledge we've gained so far.
Think about it. We build this system that can do the thinking of 5,000 human years in a day, but it doesn't have the KNOWLEDGE to necessarily back it up. What then? We've got a brand-new self-interested lifeform that just evolved 1.5 million years in thirty seconds. I mean, Mr. Roboto may come to the logical conclusion that xyz group needs to be euthanized because it's interfering with abc group without, it would appear, any benefit. For example, if you have all these people in Southeast Asia who might get dangerously ill and spread disease to otherwise healthy people, isn't the most logical conclusion to either quarantine them and let them die, or to euthanize them so they don't suffer?
Well.. sort of, but that doesn't go well with human motivations and desires, something the robot may not have taken into consideration because it lacks the knowledge of human history that's shaped us to this point and caused us to come to the conclusion that it's best to HELP them, not rid the world of them.
I think machines ought to be barred from rapid critical human thinking until we have stepped through the process with them. The problem might become that the computer can outthink humans by so many orders of magnitude that we can't error check the process in development because there's too much data coming out for humans to walk through.
All that said, perhaps the future lies in alleviating some of the bottlenecks to human thinking and expanding our capabilities in new ways by merging with machines. In that way, the human can throttle the computer, and the computer can tap the human's experiences and knowledge in order to come up with a wider range of "logical" conclusions than might otherwise be possible within the limited scope of programming directives.
Re:Ethical Questions (Score:2)
Re:Ethical Questions (Score:2)
Personally, I take that to mean that we leave robots doing manual tasks as remote-controlled devices with no real intelligence, and we won
Re:Ethical Questions (Score:3, Interesting)
Giving something true AI is going to be kind of difficult - not impossible - but difficult. It has to have the ability to adapt and to learn (the new SONY robot, while advanced, is not that advanced - it just responds to variables).
Once we give robots true AI, let's hope we instill some sort of values in them - otherwise we might have some naughty children who can kick our butts.
Re:Ethical Questions (Score:5, Funny)
Okay, I'm going to leave now...
Undimensional Ethical Systems (Score:3, Insightful)
The answer is in having a multidimensional ethical system. One such previously published system suggested these dimensions (paraphrased)
This list is incomplete. Feel free to add others as desired. Working out the formulas for balancing the parameters and vectors in order to achieve the highest overall and most pos
Thinking it through (Score:3, Insightful)
Lord knows we've done the opposite with computers -- making it up as we go along, screwing each other with IP, DRM, shoddy software and locked-into architecture for the maximized benefit (profit) of a few.
How does any rational person see us proceeding with robots/cyborgs any differently?
I foresee patents, robots running on Windows (you'll know, because they have to be rebooted frequ
Best? For whom? (Score:2)
"Best? For whom?"
- Your Robot
Ethical questions about what's "best" between two species only get answered by the fitter of the species.
There's increasing evidence that we're the dominant lifef
Re:Best? For whom? (Score:4, Insightful)
It is, if you happen to be a Homo sapiens.
Re:Best? For whom? (Score:3)
The question is, did our ancestors outcompete the neanderthals or did we wage war?
Re:Best? For whom? (Score:3, Interesting)
Taken together this would indicate that we outcompeted them for resources. H.Sap. was using thrown spears when H.Neand. was using thrusting spears (because that's what their bodies were designed to do well). This meant that H.Sap. would be able to get more animals from a given area than H.Nean
Re:Best? For whom? (Score:4, Informative)
Max
Re:Best? For whom? (Score:4, Interesting)
What the hell??? Neanderthals were specifically adapted to the cold-weather climate of Europe, and it's a fact they made and used furs as clothes, fashioned jewelry and spears, and so forth. There is no evidence whatsoever that they were any less intelligent than homo sapiens. Not a single smidgeon, regardless of the re-revisionism back to the thinking of the early 1900's that seems to be in vogue.
The only rational explanation I've seen for why homo sapiens won out is a) Neanderthals probably didn't breed as fast or as frequently as homo sapiens did (given the smaller number of skeletons of children found as compared to their human cousins), and b) there's little evidence that Neanderthals warred with one another, and a great deal of evidence that homo sapiens did. This makes sense; social conflict that devolves to violence among humans can be non-deadly, but among Neanderthals - who were much, much stronger than any human, even Arnie - a single violent act could easily lead to death. One punch to the face by a Neanderthal and you don't just have a broken nose; you have a crushed skull and your brains oozing out all over the ground.
Relative levels of intelligence most likely had nothing to do with the demise of Neanderthals. It's more likely that low breeding rates and a lack of will to commit organized, regular genocide were the culprits. Homo sapiens weren't brighter; they just bred like rabbits and were more violent.
Max
Re:Ethical Questions (Score:3, Insightful)
The author talks about robots manning call centers. This, IMHO, is an absurd use of humanoid robots. It would be infinitely more practical to make an "intelligent" telephone or EPABX than it is to employ a humanoid robot to answer phones all day long. The same holds true for most other cases. Even if you take a hazardous job such as mining, I'm sure that specialized machines with specific domain intelligence in mining
Re:Ethical Questions (Score:3, Interesting)
Re:Ethical Questions (Score:3, Insightful)
I'm bored, so here are my answers... (Score:5, Interesting)
Should robots eat?
If they must eat, they should eat. I'm not sure I would like our food supply to be in competition with a bunch of robots. I would rather they simply sunbathe to sustain their daily energy requirements. I mean... let's try to perfect the human condition, not worsen it. Imagine a billion hungry robots. They aren't going to sit around and take it like poor starving nations seem to do. They will revolt and imprison us! They'll take what they need. If they do not, they'll at the very least be competing with humanity for survival. Who do you think would win that battle?
Should they excrete?
If they must. Otherwise, wouldn't it be better if they recycled the energy?
Should robots be like us?
What, like depressed and self-destructive? Not sure I would want a bunch of those competing with the already self-destructive people who exist in the world. Don't we have enough war? Don't we have enough excesses? Do we need robots to be this way? Who knows... maybe there could be a good reason for it, but like Treebeard, I'm going to have to pretend that because I don't understand it, it could be correct.
Should we be like robots?
If the programming is good, then yes, we could stand to be more like well-programmed robots who obey their masters. But what about the arts? What about creative expression and free will? These are highly valued archetypes and many human beings would fight to the death to preserve them. Maybe it would be cool to have implants that augment human development positively. But I think it should be up to the person. No matter how large your data storage capacity is, or how fast you can process data -- wisdom will always be the true litmus test.
Should we care?
If we should, we won't. I think we should care about people and society and protecting freedom, but because I feel this way, it makes it very tempting for someone to try to deprive me of this in order to gain something I have. So if I don't care, then it doesn't matter and I am more free. I care about evolution, since evolution toward a more robotic existence is the most likely direction for humanity, but I do not have the level of intelligence to know what is the right direction of evolution. Not even a God has that level of intelligence (which is likely why we have free will, if you believe in religion and God). We are able to evolve, as we always have, through necessity.
However, Einstein said that humanity would have to be able to augment our physical forms with robotics in order to pioneer deep space. He said there would be no other way to handle the forces of nature out that way. So I guess the question is... do we want to die off on this rock, or do we want to live?
If you want to live, then support robotics and the direction of humanity towards that paradigm.
Re: (Score:2, Funny)
Re:I'm bored, so here are my answers... (Score:4, Funny)
Should robots be like us?
What like depressed and self destructive?
Marvin, is that you?
Re: (Score:2)
Re:I'm bored, so here are my answers... (Score:2)
I'll tell you what - if you want to "obey your masters", go right ahead. I think I'll pass on that particular option for human happiness.
Re:I'm bored, so here are my answers... (Score:2, Interesting)
If they must eat, they should eat...
I think it more likely that rich people would feed their robots before they fed poor people.
Should robots be like us?
What like depressed and self destructive?
If Aqua Teen Hunger Force has taught us anything it should be that depressed and self destructive non humans
Re:I'm bored, so here are my answers... (Score:2)
Why would we willingly build into robots the limitations of humans? Think about how much of our schedules are based around our needs to eat and use the bathroom. Things like "Will I be back in time for dinner?" or "Should we have a restroom break at 10 or 10:30 in the meeting?" Every rest stop on a highway is based upon the physical limitations of a human's need to eat and pee every so often. Rather than trying to imbue these qualities in robots, I wish they'
Re:I'm bored, so here are my answers... (Score:3, Funny)
Comment removed (Score:4, Funny)
Re:Will Smith (Score:2)
Good question (Score:5, Funny)
More important question is "Who cleans it up?"
See ya later (Score:2)
Re:Good question (Score:2, Funny)
Re:Good question (Score:2)
Tribble-bots (Score:2)
Re:Good question (Score:2)
Should they excrete?
It'd be better if they'd extrude.
Eric - JavaScript is not Java [ericgiguere.com]
Rochester, MN airport (Score:3, Funny)
I have always wanted a picture of those doors!
PS: It could be gone now (it's been 15 years)
huh? (Score:5, Funny)
Excrete what?
Re:huh? (Score:2)
Well, second generation Terminators sweat and have bad breath. Personally, no thanks; but there's probably at least a small niche fetish market for them.
The real questions (Score:3, Insightful)
The real questions we should be asking are: is it ethical to make people believe they need to work harder than their parents to get less when physical products are easier than ever to produce? Is it ethical for both parents to work so much that they never see their kids?
Re:The real questions (Score:2)
Actually, I had this guy as a professor. I found his work to be quite interesting... he's not an entirely out-there type as academics go... and I assure you that there are no ivory towers on Brandeis's campus.
I understand your feelings about it, but just because there are practical real-world questions that need to be asked about society doesn't mean we shouldn't be looking forward.
Besides... do you really want a CS professor working on what are really social policy questions... he'd probably just video confe
Re:The real questions (Score:2)
Excellent mod parent way up (Score:3, Interesting)
In this book, Watts goes into great detail about robotics and the social implications of them, and how we live in a time that could easily make life totally fun and easy for everyone, regardless of nation/race/culture/creed. He says that the development of robotics will achieve this someday and that the ramifications of doing so could only be positive if applied corre
Re:The real questions (Score:5, Insightful)
Sorry, but he seems more like a wannabe academic-wanker who wishes he were in an ivory tower. Believe me, I've known some academic wankers in ivory towers, and he's not qualified.
Considering "should robots eat?" as some sort of a deep or important ethical question is absurd. Why on earth *would* they eat? "Should they excrete?"?! Excrete what?! Why even speculate about the possible byproducts of 'robots' which don't exist yet?
How are these issues of ethics, rather than an engineering issue? And should 'robots' be given patents? WTF?!
It sounds like this guy is a little out of his element here. Ethics is a complicated subject. So is engineering. Predicting how the introduction of technology will impact the environment and political climate on a global scale is no easy matter, but apparently some CS professor from Brandeis thinks he's got a real handle on it.
The whole article sounds like a 10 year old talking about, "In the future, we might create giant robots who would fly and shoot people, but if we did this, we can only assume they would poop a previously-unknown and highly toxic material. So, we might want to be careful about making flying super-robots." Great. Glad he's on the case.
Play it safe (Score:3, Funny)
ethics vs good manners (Score:2)
Sorry but these are questions of social mannerisms, not ethics. And I hope the second one is NOT used socially.
Re:ethics vs good manners (Score:3, Insightful)
Re:ethics vs good manners (Score:2)
Humans do not enslave fellow humans because they are functionally, physically, intellectually, and emotionally equivalent to themselves. All arguments for the abolishment of slavery are based on the concept of human rights.
A robot, designed by man, to serve man, does not have rights. The legal system would treat it in much the same way that it treats animals. Animals are the undisputed property of their owner, and the law only intervenes in cases of cruelty or when one person's property d
Re:ethics vs good manners (Score:2)
Sorry but these are questions of social mannerisms, not ethics. And I hope the second one is NOT used socially.
Agreed, these aren't really ethical questions. The question of whether we *should* design robots that mimic the human form is an ethical question if you believe in the divinity of humans. I suppose that one could make an ethical argument out of all the questions the author asks if you're a fundamentalist.
Even the robot carrying a weapon is not an ethica
Re:ethics vs good manners (Score:2, Informative)
Kyle's mom is a big fat ugly bitch, but Cartman's mom is a dirty slut with a scheisse fetish
screw robots (Score:2)
Whether we should fear robots? (Score:2)
In most media representations, machines eventually become a clear and present danger (whether we mistreat them, they find us nonessential, etc.; take your pick).
But to me, the flaw in that is: why would we create something that would hate us? Why would we create something and hate it? Sure, there's fear of the unknown, but why is that a real danger? Is it that to truly allow robots to grow, we need to loose
Re:Whether we should fear robots? (Score:2)
Re:Whether we should fear robots? (Score:2)
Ethical Question? (Score:3, Informative)
Market (Score:2, Insightful)
Now considering past market characteristics, that is either a good thing or a bad thing dependent on your point of view.
But the most important question remains... (Score:2)
Should a hammer have ethics (Score:3, Insightful)
A robot is a tool. Any attempt to insist that they should have ethics is anthropomorphizing them far beyond what they are or will ever be. Asking if a robot should have ethics is like asking if a hammer should have ethics.
Wrong, Tim Taylor (Score:2)
There's a big difference here between something which is being designed specifically to act somewhat like a human, and a lump of metal with no decision-making abilities of any kind (let alone moving parts!)
Re:Wrong, Tim Taylor (Score:2)
Re:Wrong, Tim Taylor (Score:3, Interesting)
If the robots were programmed to, they could. Or at a minimum, you have to admit, they can be programmed to look like they make ethical decisions. You can't do that with a hammer. A hammer does not sense its environment and make any sort of decisions based on it, no matter how rudimentary.
Re:Should a hammer have ethics (Score:2, Insightful)
Not that the article was about robots themselves having ethics really though. It was more about how we should apply ethics to creating robots.
missing options, er, poll (Score:2)
shouldn't this have been published as a series of slashdot polls?
Dumb Dumb Dum Dum (Score:4, Insightful)
This is so silly it numbs my mind. If future roboticists use internal combustion engines on their robots, they are morons. Fuel cells, solar cells, rechargeable batteries
what? (Score:2, Insightful)
I hate to tell you Mr. University Professor, but any robot that does something uses energy, and that energy comes from somewhere. Whether my tin-man friend eats its energy via food or gets it from a battery, it's still competing for resources with me.
Give us your flesh (Score:2)
Laws of robotics (Score:5, Funny)
robots (Score:2, Interesting)
The submitter asked: (Score:5, Funny)
Isn't that what Public Schools are for?
The real question . . . (Score:5, Insightful)
This is a nice simple article on some interesting questions, but it barely scratches the surface of all the concerns we're likely to face in the next 50 years. A few alone:
When is someone responsible for a machine that functions independently, but that they configured?
What resources will be affected by robotic production? Do we really NEED these robots?
When a human and a robot work together on something, who gets the blame for failure?
Of course anyone here can come up with more.
The problem is that as technology improves around us, more people aren't asking these questions, and even fewer are coming up with usable answers.
The future is coming. I wish we weren't watching "Who's your Daddy" while it approaches.
Excretions (Score:2)
A robot requires a power source. If it runs on electricity it may not "excrete" directly but the power plant that generates the electricity "excretes" quite a bit.
IIRC, large power plants will produce less pollution than lots of smaller generators that together generate the same power output. So maybe the question is not whether, but where the excretion occurs?
Other power sources may be better (or no worse) if distributed. Perhaps someone can design a home robot that runs on domestic refuse (table scrap
The only important question is... (Score:3, Interesting)
If not: They are a machine/tool/etc. What they are like and what they do depends only on who made them, who owns them, and applicable laws governing the use of similar personal effects as scooters, computers, videocameras, etc.
If yes: They can do and be whatever the hell they want under applicable laws currently governing the humans (that is to say, they should have the same rights and accountability as any of us)
That is all.
Re:The only important question is... (Score:2)
Legal Affairs also considers AI Rights (Score:2, Insightful)
Oblig Futurama reference (Score:2)
Quick answers... (Score:2)
No.
Should they excrete?
No.
Should robots be like us?
No.
Should we be like robots?
No.
Should we care?
No.
C'mon people... aren't we getting a little carried away here?
Re:Quick answers... (Score:2)
No.
Should robots... (Score:3, Insightful)
These questions seem... unnecessary. (Score:2, Interesting)
Speaking of human-robot relations, the fear of robots realizing they're superior to humans and killing us all is interesting. If it turns out they succeed in doing that, then apparently they were super
Re:These questions seem... unnecessary. (Score:2)
So, no, we shouldn't eliminate mosquitos.
3 Laws? (Score:2)
Re:3 Laws? (Score:2)
Should robots pay taxes? (Score:2)
What about the next generation's robots (year 2250) that work in factories instead of humans? The people that used to work in the factories have no jobs and no income and therefore pay no taxes.
Who is going to pay for the streets? Or the pentagon budget?
[funny]
This street was sponsored by Cisco Systems and GM.
This war was sponsored by Boeing, a subsidiary of Microsoft Defense Systems.
[/funny]
Age of robots or a robot's age? (Score:2)
Should robots be preprogrammed to die? Should they be mortal (in the aging sense)?
Are there strange societal issues that might come up when a conscious entity is able to live for 1,000 years and tell very accurate stories and accounts from centuries past?
Re:Age of robots or a robot's age? (Score:2)
Will we have a moral obligation to extend a robot's life if we can do so?
Moore's Law (Score:2)
Why does everybody have to screw this "law" up? Moore observed in 1965 that the number of transistors on an integrated circuit doubled every year; in 1975 he revised the doubling period to every two years. The "18 months" figure and the "per square inch" phrasing are themselves common misquotes.
http://www.webopedia.com/TERM/M/Moores_Law.html [webopedia.com]
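The doubling arithmetic is easy to sanity-check. Here's a minimal Python sketch of the exponential growth behind Moore's observation; the function name and the two-year doubling period are illustrative choices on my part, not anything from the comment above:

```python
# N(t) = N0 * 2**(t / T): exponential growth with doubling period T.
# Using a two-year period per Moore's 1975 revision; figures are illustrative.

def projected_transistors(n0, years, doubling_period_years=2.0):
    """Project a transistor count forward under exponential doubling."""
    return n0 * 2 ** (years / doubling_period_years)

# Starting from ~2,300 transistors (Intel 4004, 1971), a two-year
# doubling period lands on the order of a billion by the late 2000s:
count_2008 = projected_transistors(2300, 2008 - 1971)
```

Swapping the doubling period to 1.0 or 1.5 years shows how drastically the "every year" vs. "every 18 months" misquotes change the projection, which is presumably why the poster is annoyed.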
Answers (Score:3, Funny)
Chicken bones and guts.
Should they excrete?
Mickey D's chicken nuggets.
Should robots be like us?
Eat Mickey D's chicken nuggets? I hope for them that they won't have to.
Should we be like robots?
I wouldn't want to excrete chicken nuggets.
Should we care?
Oh yeah, the bots and us could create an eternal yin yang of bad food.
I see your questions and raise you another. (Score:2)
After reading that, I do ask myself: why should there be only one way to build and program robots? Why not have some robots that eat and others that don't? Why not have robots that are more like people and others that are nothing like them?
And I lied; I raised three questions.
Something I've Had on My Mind (Score:2)
It seems to me that robots are finite state machines (unlike humans, I think) and should have no more rights than a toaster. Of course, if any of them can solve the Halting Problem, I'd be ready to give them voting rights et al.
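The "finite state machine" framing can be made concrete with a small sketch. This is a hypothetical illustration (the class, states, and events are all invented here), showing the poster's point that such a controller's behavior is entirely determined by a fixed transition table, with no deliberation anywhere:

```python
# A robot controller as a deterministic finite state machine:
# the next state is a pure function of (current state, input event).

class RobotFSM:
    """Deterministic FSM driven by a transition table."""

    def __init__(self, start, transitions):
        self.state = start
        # transitions maps (state, event) -> next_state
        self.transitions = transitions

    def step(self, event):
        # Undefined (state, event) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# Example: a vacuum robot that docks when its battery runs low.
controller = RobotFSM("idle", {
    ("idle", "start"): "cleaning",
    ("cleaning", "battery_low"): "docking",
    ("docking", "docked"): "idle",
})

controller.step("start")        # -> "cleaning"
controller.step("battery_low")  # -> "docking"
```

Whether real robots (or humans) are "just" FSMs is exactly the philosophical question the thread is arguing about; the sketch only illustrates the mechanical model the poster has in mind.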
Stupid questions (Score:3, Insightful)
Should robots excrete?
Stupid questions. Unless someone invents a 100% efficient perpetual-motion machine, robots, like any system, will have to consume energy and will produce waste byproducts.
Duh.
They should be banned from.... (Score:2)
Should we care? (Score:2)
If we had the opportunity, would it be beneficial to outsource all of our jobs to some advanced alien race for less than it costs us to do it ourselves?
At what point do we consider the economic consequences of our actions?
I believe robots are good for the economy, but not if most people lose their jobs. And I think it would be great to automate people out of a job, as long as we had the social framew
I'm going to write some ethics essays... (Score:2)
Ethics for the 'Television' Age
Ethics for the 'Microwave' Age
Ethics for the 'Anthropomorphized Labor-Reducing Tool' Age.
What about . . . (Score:2)
Are there inappropriate names for robots?
Robbie? Data? Marvin? Vivian? (Okay, I just don't
like the name Vivian for guys.)
Not even worth asking (Score:3, Insightful)
We, as humans, should stop trying to play god to create sentient beings. Robots as tools are much more useful to us. But, you say, you want something that can independently think and do stuff for us. What you are looking for here are "slaves". Beings that can do their own things but still obey you.
Why do we even bother with all of this? If you don't make a super intelligent robot that can learn and independently think, then you don't need the 3 laws. You don't need to worry about the robots killing all humans and taking over the world. All of these problems that sci-fi say we will be afflicted with because we want to play god and be lazy.
We are doomed.
Yet Another Ethical Question (Score:2)
Re:3 laws (Score:2, Funny)
Re:3 laws (Score:2, Insightful)
Re:3 laws (Score:2)
Basically, having the first law being the absolute one means that robots will always do what the robots conclude is best for the humans.
Ohms Law?? (Score:2)
Re:No. (Score:2)
To have a cheap and effective way of producing bricks. I mean, think about it, instead of having to dig up tons of dirt and doing all sorts of things with it, you could have an army of robots that eat mud, "digest" it, and SHIT BRICKS!
Re:No. (Score:2)
There are few energy storage systems with the energy density of hydrocarbons like petroleum or alcohols. They would "metabolize" the liquid hydrocarbons, and exhale water vapor and CO2, much like their masters.
With enough research, you could develop a "gut" that will process everything from vegetable oil to high-proof whiskey. The concept behind "Bender" the robot from Futurama really isn't that far off.
(Though, when processing any fue
Re:Should robots be allowed to ... (Score:2)
Bender: Clem Johnson? That sack of skin wouldn't have lasted one pitch in the old Robot Leagues. Now, Wireless Joe Jackson, there was a blern-hitting machine.
Leela: Exactly. He was a machine designed to hit blerns. I mean, come on, Wireless Joe was nothing but a programmable bat on wheels.
Bender: Oh, and I suppose Pitch-O-Mat 5000 was just a modified Howitzer?
Leela: Yep.
Re:He raises some interesting points (Score:2)
Fortunately, I keep a spare copy of "I dated a robot!" in the VCR at all times.