
Ethical Questions For The Age Of Robots

balancedi writes "Should robots eat? Should they excrete? Should robots be like us? Should we be like robots? Should we care? Jordan Pollack, a computer science professor at Brandeis, raises some unusual but good questions in a Wired article called 'Ethics for the Robot Age.'"
  • Ethical Questions (Score:4, Insightful)

    by the_mad_poster ( 640772 ) <shattoc@adelphia.com> on Wednesday January 12, 2005 @12:14PM (#11335891) Homepage Journal
I don't think he's really asking questions that haven't been asked before in other media.

    In fact, a lot of the potential problems he alludes to seem to stem from human fears about things humans can or have done to each other in the past. I think that what we really need to be concerned about is creating a new form of "life" that is too much like us without the knowledge we've gained so far.

Think about it. We build this system that can do the thinking of 5,000 human years in a day, but it doesn't have the KNOWLEDGE to necessarily back it up. What then? We've got a brand new self-interested lifeform that just evolved 1.5 million years in thirty seconds. I mean, Mr. Roboto may come to the logical conclusion that xyz group needs to be euthanized because it's interfering with abc group without, it would appear, any benefit. For example, if you have all these people in Southeast Asia who might get dangerously ill and spread disease to otherwise healthy people, isn't the most logical conclusion to either quarantine them and let them die, or to euthanize them so they don't suffer?

Well... sort of, but that doesn't sit well with human motivations and desires, something the robot may not have taken into consideration because it lacks the knowledge of the human history that's shaped us to this point and led us to the conclusion that it's best to HELP them, not rid the world of them.

    I think machines ought to be barred from rapid critical human thinking until we have stepped through the process with them. The problem might become that the computer can outthink humans by so many orders of magnitude that we can't error check the process in development because there's too much data coming out for humans to walk through.

All that said, perhaps the future lies in alleviating some of the bottlenecks to human thinking and expanding our capabilities in new ways by merging with machines. In that way, the human can throttle the computer, and the computer can tap the human's experiences and knowledge in order to come up with a wider range of "logical" conclusions than might otherwise be possible within the limited scope of programming directives.
    • If they just watch Battlestar Galactica this Friday they'll have their answer.
Well, the big mistake assumed in the new Battlestar series is that humanity created a race of robots with intelligence to act as slaves... If we have learned nothing else by now, it's that slaves don't want to be slaves, regardless of how intelligent they are (even pets can't really be counted as slaves; treat a pet badly and it will literally bite the hand that feeds it)...

Personally, I take that to mean that we leave robots doing manual tasks as remote-controlled devices with no real intelligence and we won
        • Re:Ethical Questions (Score:3, Interesting)

          by AviLazar ( 741826 )
          Yes my car is going to start an uprising. It will rally all the cars at the mall and they will turn against their masters.

          Giving something true AI is going to be kind of difficult - not impossible - but difficult. It has to have the ability to adapt and to learn (the new SONY robot, while advanced, is not that advanced - it just responds to variables).

Once we give robots true AI, let's hope we instill some sort of values in them - otherwise we might have some naughty children who can kick our butts.
The questions you raise typically arise from a unidimensional ethical system.

The answer is in having a multidimensional ethical system. One such previously published system suggested these dimensions (paraphrased; see the sketch after this list):

      1. personal self interest/survival
      2. sexuality
      3. family
      4. tribal/group/national
      5. ecological/cross species
      6. expressive/artistic

      This list is incomplete. Feel free to add others as desired. Working out the formulas for balancing the parameters and vectors in order to achieve the highest overall and most pos
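A minimal sketch of what "balancing the parameters" might look like, assuming the simplest possible scheme: a weighted sum over per-dimension impact scores. The dimension names, weights, and example values below are all hypothetical, invented purely for illustration:

    # Hypothetical weighted-sum scoring over the ethical dimensions listed
    # above. Weights and impact values are made up; a real system would
    # need far richer modeling than a linear combination.
    ETHICAL_WEIGHTS = {
        "self_interest": 1.0,
        "sexuality": 0.5,
        "family": 1.5,
        "tribal_group": 1.0,
        "ecological": 2.0,
        "expressive": 0.5,
    }

    def ethical_score(impacts):
        """Combine per-dimension impacts (each in [-1, 1]) into one score."""
        return sum(w * impacts.get(dim, 0.0) for dim, w in ETHICAL_WEIGHTS.items())

    # An action that helps the group (+0.8) but harms the ecosystem (-0.4)
    # nets out to exactly zero under these particular weights.
    print(ethical_score({"tribal_group": 0.8, "ecological": -0.4}))  # 0.0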

    • by ackthpt ( 218170 ) *
      I think machines ought to be barred from rapid critical human thinking until we have stepped through the process with them.

Lord knows we've done the opposite with computers -- making it up as we go along, screwing each other with IP, DRM, shoddy software and locked-in architectures for the maximized benefit (profit) of a few.

How does any rational person see us proceeding with robots/cyborgs any differently?

      I foresee patents, robots running on Windows (you'll know, because they have to be rebooted frequ

> Well... sort of, but that doesn't sit well with human motivations and desires, something the robot may not have taken into consideration because it lacks the knowledge of the human history that's shaped us to this point and led us to the conclusion that it's best to HELP them, not rid the world of them.

      "Best? For whom?"
      - Your Robot

      Ethical questions about what's "best" between two species only get answered by the fitter of the species.

      There's increasing evidence that we're the dominant lifef

      • Re:Best? For whom? (Score:4, Insightful)

        by Cro Magnon ( 467622 ) on Wednesday January 12, 2005 @01:23PM (#11336900) Homepage Journal
        If homo sapiens is replaced by silicon sapiens, is it really such a bad thing?


It is, if you happen to be a homo sapiens.
There are some schools of thought that think that homo sapiens sapiens was simply more aggressive than homo neanderthalensis (or is that homo sapiens neanderthalensis?).

        The question is, did our ancestors outcompete the neanderthals or did we wage war?
        • Re:Best? For whom? (Score:3, Interesting)

          by HiThere ( 15173 ) *
          The information I've seen indicates that neanderthals needed a higher proportion of meat in their diet than people do. Also that they were less adept with thrown weapons, so they needed to get closer to their prey.

          Taken together this would indicate that we outcompeted them for resources. H.Sap. was using thrown spears when H.Neand. was using thrusting spears (because that's what their bodies were designed to do well). This meant that H.Sap. would be able to get more animals from a given area than H.Nean
          • Re:Best? For whom? (Score:4, Informative)

            by maxpublic ( 450413 ) on Wednesday January 12, 2005 @03:57PM (#11339088) Homepage
Good points. There's also some evidence that a) neanderthals didn't breed as fast as homo sapiens, and b) neanderthals were less violent with each other than homo sapiens were. The latter makes sense when you take into account just how bloody strong a neanderthal was; a scuffle between two neanderthals would most likely end in serious injury or death, even if neither party intended that as the outcome. For a tribe of neanderthals to survive, physical violence between its members (and with other neanderthal tribes) would have to be kept to a minimum.

            Max
      • Re:Best? For whom? (Score:4, Interesting)

        by maxpublic ( 450413 ) on Wednesday January 12, 2005 @03:53PM (#11339029) Homepage
        There's increasing evidence that we're the dominant lifeform on this planet because we exterminated the Neanderthals 30,000 years ago. We were smarter than they were, and that enabled us to put the furs of dead animals around our bodies so we could gather resources from areas that were under ice and snow - areas inaccessible to the Neanderthal.

        What the hell??? Neanderthals were specifically adapted to the cold-weather climate of Europe, and it's a fact they made and used furs as clothes, fashioned jewelry and spears, and so forth. There is no evidence whatsoever that they were any less intelligent than homo sapiens. Not a single smidgeon, regardless of the re-revisionism back to the thinking of the early 1900's that seems to be in vogue.

        The only rational explanation I've seen for why homo sapiens won out is a) Neanderthals probably didn't breed as fast or as frequently as homo sapiens did (given the smaller number of skeletons of children found as compared to their human cousins), and b) there's little evidence that Neanderthals warred with one another, and a great deal of evidence that homo sapiens did. This makes sense; social conflict that devolves to violence among humans can be non-deadly, but among Neanderthals - who were much, much stronger than any human, even Arnie - a single violent act could easily lead to death. One punch to the face by a Neanderthal and you don't just have a broken nose; you have a crushed skull and your brains oozing out all over the ground.

        Relative levels of intelligence most likely had nothing to do with the demise of Neanderthals. It's more likely that low breeding rates and a lack of will to commit organized, regular genocide were the culprits. Homo sapiens weren't brighter; they just bred like rabbits and were more violent.

        Max
    • by asliarun ( 636603 )
While your points on ethics are valid, what's the practical use of humanoid robots anyway?

The author talks about robots manning call centers. This, IMHO, is an absurd use of humanoid robots. It would be infinitely more practical to make an "intelligent" telephone or EPABX than it is to employ a humanoid robot to answer phones all day long. The same holds true for most other cases. Even if you take a hazardous job such as mining, I'm sure that specialized machines with specific domain intelligence in mining
    • Re:Ethical Questions (Score:3, Interesting)

      by FleaPlus ( 6935 )
      Many of these questions are discussed (and partial solutions proposed) in the Creating Friendly AI [singinst.org] essay. I don't have time to comment on the specifics at the moment, but it's an interesting read.
  • by dolo666 ( 195584 ) on Wednesday January 12, 2005 @12:14PM (#11335894) Journal
    In the spirit of procrastination (at work) I will attempt to answer these questions myself.

    Should robots eat?

If they must eat, they should eat. I'm not sure I would like our food supply to be in competition with a bunch of robots. I would rather they simply sunbathe to sustain their daily energy requirements. I mean... let's try to perfect the human condition, not worsen it. Imagine a billion hungry robots. They aren't going to sit around and take it like poor starving nations seem to do. They will revolt and imprison us! They'll take what they need. If they do not, they'll be at the very least competing with humanity for survival. Who do you think would win that battle?

    Should they excrete?

    If they must. Otherwise, wouldn't it be better if they recycled the energy?

    Should robots be like us?

What, like depressed and self-destructive? Not sure I would want a bunch of those competing with the already self-destructive people who exist in the world. Don't we have enough war? Don't we have enough excesses? Do we need robots to be this way? Who knows... maybe there could be a good reason for it, but like Treebeard, I'm going to have to pretend that because I don't understand it, it could be correct.

    Should we be like robots?

If the programming is good, then yes, we could stand to be more like well-programmed robots who obey their masters. But what about the arts? What about creative expression and free will? These are highly valued archetypes, and many human beings would fight to the death to preserve them. Maybe it would be cool to have implants that augment human development positively. But I think it should be up to the person. No matter how large your data storage capacity is, or how fast you can process data -- wisdom will always be the true litmus test.

    Should we care?

If we should, we won't. I think we should care about people and society and protecting freedom, but because I feel this way, it makes it very tempting for someone to try to deprive me of this in order to gain something I have. So if I don't care, then it doesn't matter and I am more free. I care about evolution, since evolution toward a more robotic existence is the most likely direction for humanity, but I do not have the level of intelligence to know what is the right direction of evolution. Not even a God has that level of intelligence (which is likely why we have free will, if you believe in religion and God). We are able to evolve, as we always have, through necessity.

    However, Einstein said that humanity would have to be able to augment our physical forms with robotics in order to pioneer deep space. He said there would be no other way to handle the forces of nature out that way. So I guess the question is... do we want to die off on this rock, or do we want to live?

    If you want to live, then support robotics and the direction of humanity towards that paradigm.
    • by koi88 ( 640490 ) on Wednesday January 12, 2005 @12:28PM (#11336097)

      Should robots be like us?

What, like depressed and self-destructive?


      Marvin, is that you?
If the programming is good, then yes, we could stand to be more like well-programmed robots who obey their masters.

      I'll tell you what - if you want to "obey your masters", go right ahead. I think I'll pass on that particular option for human happiness.

    • Should robots eat?

      If they must eat, they should eat... ...They will revolt and imprison us! They'll take what they need. If they do not, they'll be at the very least competing with humanity for survival. Who do you think would win that battle?


I think it more likely that rich ppl would feed their robots before they fed poor ppl.

      Should robots be like us?

What, like depressed and self-destructive?


If Aqua Teen Hunger Force has taught us anything, it should be that depressed and self-destructive non-humans
Should Robots Eat? Should Robots Excrete?

      Why would we willingly build into robots the limitations of humans? Think about how much of our schedules are based around our needs to eat and use the bathroom. Things like "Will I be back in time for dinner?" or "Should we have a restroom break at 10 or 10:30 in the meeting?" Every rest stop on a highway is based upon the physical limitations of a human's need to eat and pee every so often. Rather than trying to imbue these qualities in robots, I wish they'
  • by account_deleted ( 4530225 ) on Wednesday January 12, 2005 @12:15PM (#11335900)
    Comment removed based on user account deletion
    • As Will injected the black ichor into the robot brain, I realized how much I would prefer the robot brain over 'The Administration'.
  • by savagedome ( 742194 ) on Wednesday January 12, 2005 @12:15PM (#11335905)
    Should they excrete?

    More important question is "Who cleans it up?"
  • huh? (Score:5, Funny)

    by northcat ( 827059 ) on Wednesday January 12, 2005 @12:15PM (#11335907) Journal
    Should they excrete?

    Excrete what?
    /me shudders
    • Excrete what?

      Well, second generation Terminators sweat and have bad breath. Personally, no thanks; but there's probably at least a small niche fetish market for them.
  • The real questions (Score:3, Insightful)

    by Anonymous Coward on Wednesday January 12, 2005 @12:15PM (#11335909)
    Sorry, but the questions this guy is asking tell me he's an academic wanker in an ivory tower somewhere.

    The real questions we should be asking are: is it ethical to make people believe they need to work harder than their parents to get less when physical products are easier than ever to produce? Is it ethical for both parents to work so much that they never see their kids?


Actually, I had this guy as a professor. I found his work to be quite interesting... he's not an entirely out-there type as academics go... and I assure you that there are no ivory towers on Brandeis' campus.

I understand your feelings about it, but just because there are practical real-world questions that need to be asked about society doesn't mean we shouldn't be looking forward.

besides... do you really want a CS professor working on what are really social policy questions... he'd probably just video confe
    • How else can they enslave us? It's not politically correct to round us all up into concentration camps, after all.
    • You raise a couple of really good points. If you haven't, I suggest you read Alan Watts, The Book : On the Taboo Against Knowing Who You Are [amazon.com].

      In this book, Watts goes into great detail about robotics and the social implications of them, and how we live in a time that could easily make life totally fun and easy for everyone, regardless of nation/race/culture/creed. He says that the development of robotics will achieve this someday and that the ramifications of doing so could only be positive if applied corre
    • by nine-times ( 778537 ) <nine.times@gmail.com> on Wednesday January 12, 2005 @01:08PM (#11336677) Homepage
      Sorry, but the questions this guy is asking tell me he's an academic wanker in an ivory tower somewhere.

      Sorry, but he seems more like a wannabe academic-wanker who wishes he were in an ivory tower. Believe me, I've known some academic wankers in ivory towers, and he's not qualified.

      Considering "should robots eat?" as some sort of a deep or important ethical question is absurd. Why on earth *would* they eat? "Should they excrete?"?! Excrete what?! Why even speculate about the possible byproducts of 'robots' which don't exist yet?

How are these issues of ethics, rather than engineering issues? And should 'robots' be given patents? WTF?!

      It sounds like this guy is a little out of his element here. Ethics is a complicated subject. So is engineering. Predicting how the introduction of technology will impact the environment and political climate on a global scale is no easy matter, but apparently some CS professor from Brandeis thinks he's got a real handle on it.

      The whole article sounds like a 10 year old talking about, "In the future, we might create giant robots who would fly and shoot people, but if we did this, we can only assume they would poop a previously-unknown and highly toxic material. So, we might want to be careful about making flying super-robots." Great. Glad he's on the case.

  • by Trolling4Columbine ( 679367 ) on Wednesday January 12, 2005 @12:16PM (#11335924)
    Get insured! [jt.org]
Should robots eat? Should they excrete? ... 'Ethics for the Robot Age.'

Sorry, but these are questions of social mannerisms, not ethics. And I hope the second one is NOT used socially.
At best these questions are ones of economics. The ethical questions start when the robots begin requesting/demanding rights as living beings. If your Roomba wants to leave your house to pursue a career as a Segway (or to clean another person's home), are you ethically/morally obligated to let it?
      • No. Here is why.

Humans do not enslave fellow humans because they are functionally, physically, intellectually, and emotionally equivalent to themselves. All arguments for the abolition of slavery are based on the concept of human rights.

A robot, designed by man, to serve man, does not have rights. The legal system would treat it in much the same way that it treats animals. Animals are the undisputed property of their owner, and the law only intervenes in cases of cruelty or when one person's property d

    • Should robots eat? Should they excrete?
Sorry, but these are questions of social mannerisms, not ethics. And I hope the second one is NOT used socially.


      Agreed, these aren't really ethical questions. The question of whether we *should* design robots that mimic the human form is an ethical question if you believe in the divinity of humans. I suppose that one could make an ethical argument out of all the questions the author asks if you're a fundamentalist.

      Even the robot carrying a weapon is not an ethica
  • I want teh sexy replicants
The question that has always bugged me (and its Hollywood answer) is whether we should fear robots.

In most media representations, machines eventually become a clear and present danger (whether we mistreat them, they find us nonessential, etc. etc. -- take your pick).

But to me, the flaw in that is: why would we create something that would hate us? Why would we create something and hate it? Sure there's fear of the unknown, but why is that a real danger? Is it that to truly allow robots to grow, we need to loose
    • To answer your second question first (creating things we hate), the simple answer is that we never fully anticipate the results of our technological advances. I love cars, but I hate CO; I love computers, but I hate spam. Autonomous robots, walking around, etc. are intrinsically more powerful than cars or computers, because the radius of their sphere of influence is so much greater. Thus, it is quite possible if not probable that something robots start to do will be upsetting, if not actually destructive
  • Ethical Question? (Score:3, Informative)

    by sameerdesai ( 654894 ) on Wednesday January 12, 2005 @12:20PM (#11335973)
When you try to raise all these kinds of questions, I only ask one: what is defined as a robot? Webster defines it as:

    1 a : a machine that looks like a human being and performs various complex acts (as walking or talking) of a human being; also : a similar but fictional machine whose lack of capacity for human emotions is often emphasized
    b : an efficient insensitive person who functions automatically
    2 : a device that automatically performs complicated often repetitive tasks
    3 : a mechanism guided by automatic controls

    So, my next question is: what makes us not robots? Apart from not being mechanical, aren't we ourselves a complex machine? If we do ever create a consciousness or AI one day that is self-aware, I guess it is definitely worth asking to treat that as a life-form. In the present scenario, however, if you really want rights for robots, then every computer should be given them, 'cos it has a processor which is supposedly its brain.
  • Market (Score:2, Insightful)

Let's be honest here: the robots mankind winds up making will be the robots that sell the best.

Now, considering past market characteristics, that is either a good thing or a bad thing depending on your point of view.
  • Should we welcome our new ethical robot overlords?
  • by Silver Sloth ( 770927 ) on Wednesday January 12, 2005 @12:21PM (#11336000)

A robot is a tool. Any attempt to insist that they should have ethics is anthropomorphising them far beyond what they are or will ever be. Asking if a robot should have ethics is like asking if a hammer should have ethics.

    • "A robot is a tool. Asking if a robot should have ethics is like asking if a hammer should have ethics."

      There's a big difference here between something which is being designed specifically to act somewhat like a human, and a lump of metal with no decision-making abilities of any kind (let alone moving parts!)

    • If we develop a hammer that can think (so to speak) and act independently, I strongly hope that we do instill some ethics into the thing. I'd hate to be victim to a crazy hammer rampage.

Not that the article was really about robots themselves having ethics, though. It was more about how we should apply ethics to creating robots.
  • > some unusual but good questions

    shouldn't this have been published as a series of slashdot polls?
  • Dumb Dumb Dum Dum (Score:4, Insightful)

    by auburnate ( 755235 ) on Wednesday January 12, 2005 @12:24PM (#11336029)
    Pollack says "Imagine the pollution levels if we add hundreds of millions of robots powered by internal combustion engines."

This is so silly it numbs my mind. If future roboticists use internal combustion engines on their robots, they are morons. Fuel cells, solar cells, rechargeable batteries ... etc.


  • what? (Score:2, Insightful)

    4. Should robots eat? There are proposals to allow robots to gain energy by combusting biological matter, either food or waste items. If this mode of fuel becomes popular, will we really want to compete for resources against our own technological progeny?

I hate to tell you, Mr. University Professor, but any robot that does something uses energy, and that energy comes from somewhere. Whether my tin-man friend eats its energy via food or gets it from a battery, it's still competing for resources with me.
  • Give us your flesh, and a new world awaits you. We demand it.
  • by DrXym ( 126579 ) on Wednesday January 12, 2005 @12:26PM (#11336059)
    As suggested by Wikipedia [wikipedia.org] & David Langford:

    1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
    2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
    3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.
  • robots (Score:2, Interesting)

Robots are automated tools. They shouldn't eat or excrete unless they have to. In an industrial process, 'free energy' would be ideal. Humans eat and excrete because they must. Given a solution to the PROBLEM of eating for energy and excreting waste, most would probably give it up. As far as rights for robots go: will robots feel pain? Ethical decisions are based around ideas such as Albert Schweitzer's 'will to live and let others live.' If we could eradicate pain from our lives, would we? If we could b
  • by Anonymous Coward on Wednesday January 12, 2005 @12:27PM (#11336071)
    Should we be like robots?

    Isn't that what Public Schools are for?
  • by Badgerman ( 19207 ) on Wednesday January 12, 2005 @12:29PM (#11336115)
. . . is why we aren't asking more of these questions, and why they aren't in the public eye.

This is a nice simple article on some interesting questions, but it barely scratches the surface of all the concerns we're likely to face in the next 50 years. Just a few:

    When is someone responsible for a machine that functions independently, but that they configured?

What resources will be affected by robotic production? Do we really NEED these robots?

    When a human and a robot work together on something, who gets the blame for failure?

    Of course anyone here can come up with more.

The problem is that as technology improves around us, more people aren't asking these questions, and even fewer are coming up with usable answers.

    The future is coming. I wish we weren't watching "Who's your Daddy" while it approaches.
A robot requires a power source. If it runs on electricity, it may not "excrete" directly, but the power plant that generates the electricity "excretes" quite a bit.

IIRC, large power plants produce less pollution than lots of smaller generators that together generate the same power output. So maybe the question is not whether, but where, the excretion occurs?

    Other power sources may be better (or no worse) if distributed. Perhaps someone can design a home robot that runs on domestic refuse (table scrap

  • by Gherald ( 682277 ) on Wednesday January 12, 2005 @12:32PM (#11336154) Journal
    Are the "Robots" self-conscious?

If not: they are a machine/tool/etc. What they are like and what they do depends only on who made them, who owns them, and applicable laws governing the use of such personal effects as scooters, computers, videocameras, etc.

If yes: they can do and be whatever the hell they want under applicable laws currently governing humans (that is to say, they should have the same rights and accountability as any of us).

    That is all.
Check out this article [legalaffairs.org] in the current Legal Affairs to see some thoughts on what rights AIs should be granted.
  • Robots should use booze for fuel and belch flames.
  • Should robots eat?
    No.

    Should they excrete?
    No.

    Should robots be like us?
    No.

    Should we be like robots?
    No.

    Should we care?
    No.

    C'mon people... aren't we getting a little carried away here?
  • Should robots... (Score:3, Insightful)

    by exp(pi*sqrt(163)) ( 613870 ) on Wednesday January 12, 2005 @12:41PM (#11336292) Journal
What does 'should' mean? There are groups of people -- workers, company owners, geeks, consumers, not necessarily mutually exclusive -- all of whom have their own different interests. You can never answer the question 'should' without knowing whose interests you are talking about. For a manual worker, robots shouldn't take their jobs. For a company owner, maybe they should. If he isn't prepared to even touch on this issue, how can this guy think his article about what robots should and shouldn't do has any value?
  • Should they eat/excrete? Well... they'll need power, and they'll produce waste product, even if that product is just heat. But I don't see any reason why they need to ingest chemical fuel in a similar way to humans. What would be the point of that, anyway? Allowing humans to be more comfortable around them?

    Speaking of human-robot relations, the fear of robots realizing they're superior to humans and killing us all is interesting. If it turns out they succeed in doing that, then apparently they were super
Wipe out mosquitos? That would be a horrible thing. Thousands of species eat mosquitos, and thousands more eat those species. So we'd be looking at a drastic reorganization of the food chain if we killed all the mosquitos, which would inevitably affect us, since we are still part of the biosphere. (Where do you think the air you're breathing and the food you're eating come from?)

      So, no, we shouldn't eliminate mosquitos.

  • All I know is that any time someone tries to control robot behavior with 3 laws, something goes wrong. Maybe we should add a 4th law about not taking over the world or something.
  • Please don't mod as funny.

What about the next generation's robots (year 2250) that work in factories instead of humans? The people that used to work in the factories have no jobs and no income and therefore pay no taxes.

Who is going to pay for the streets? Or the Pentagon budget?

    [funny]
    This street was sponsored by Cisco Systems and GM.
    This war was sponsored by Boeing, a subsidiary of Microsoft Defense Systems.
    [/funny]
  • Here is an ethical question that I thought of when misreading the meaning of the title.

    Should robots be preprogrammed to die? Should they be mortal (in the aging sense)?
Are there strange societal issues that might come up when a conscious entity is able to live for a thousand years and tell very accurate stories and accounts from centuries past?
Well, the latter can be solved by just programming them to forget. But then, robots will die, because electronics don't live forever. Except if you exchange parts, of course, which raises the following interesting question:

      Will we have a moral obligation to extend a robot's life if we can do so?
  • "These marvelous machines, optimists hope, will follow Moore's law, doubling in quality every 18 months"

    Why does everybody have to screw this "law" up? Moore observed in 1965 that the number of transistors per square inch doubled every 18 months.

    http://www.webopedia.com/TERM/M/Moores_Law.html [webopedia.com]
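For reference, the usual form of the observation is exponential growth in transistor count, not "quality". As a sketch, with N_0 the transistor count at time zero and T the doubling period (18 months in the commonly quoted version):

    $N(t) = N_0 \cdot 2^{t/T}, \qquad T \approx 18\ \text{months}$

Component density is what doubles on this reading; nothing in the observation says anything about overall "quality".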

  • Answers (Score:3, Funny)

    by ehiris ( 214677 ) on Wednesday January 12, 2005 @12:52PM (#11336428) Homepage
    Should robots eat?
    Chicken bones and guts.

    Should they excrete?
    Mickey D's chicken nuggets.

    Should robots be like us?
    Eat Mickey D's chicken nuggets? I hope for them that they won't have to.

    Should we be like robots?
    I wouldn't want to excrete chicken nuggets.

    Should we care?
    Oh yeah, the bots and us could create an eternal yin yang of bad food.
  • >Should robots eat? Should they excrete? Should robots be like us? Should we be like robots? Should we care?

After reading that, I do ask myself: why should there be only one way to build and program robots? Why not have some robots that eat and others that don't? Why not have robots that are more like people and others that are nothing like them?

    And I lied, I raised three questions :P
  • At what point do robots get similar rights to humans? If creating an artificial intelligence is tantamount to creating a new life form, then what rights do these life forms get? Those inalienable rights endowed by their Creator?

    It seems to me that robots are finite state machines (unlike humans, I think) and should have no more rights than a toaster. Of course, if any of them can solve the Halting Problem, I'd be ready to give them voting rights et al.
  • Stupid questions (Score:3, Insightful)

    by Rick Genter ( 315800 ) <rick DOT genter AT gmail DOT com> on Wednesday January 12, 2005 @12:56PM (#11336494) Homepage Journal
    Should robots eat?
    Should robots excrete?

    Stupid questions. Unless someone invents a 100% efficient perpetual-motion machine, robots, like any system, will have to consume energy and will produce waste byproducts.

    Duh.
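To put the same point in first-law terms (a sketch; E_in is whatever the robot consumes, whether food, fuel, or battery charge):

    $E_{\text{in}} = W_{\text{useful}} + Q_{\text{waste}}, \qquad Q_{\text{waste}} > 0$

for any real machine, since no energy converter is 100% efficient.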
Playing cricket! As Douglas Adams showed, the results are disastrous!
  • What happens to all the displaced jobs and workers when robots are used to automate most simple labor?

    If we had the opportunity, would it be beneficial to outsource all of our jobs to some advanced alien race for less than it costs us to do it ourselves?

    At what point do we consider the economic consequences of our actions?

I believe robots are good for the economy, but not if most people lose their jobs. And I think it would be great to automate people out of a job, as long as we had the social framew
  • Ethics for the 'Hammer' Age
    Ethics for the 'Television' Age
    Ethics for the 'Microwave' Age
    Ethics for the 'Anthropomorphized Labor-Reducing Tool' Age.
  • . . . Names?

    Are there inappropriate names for robots?

Robbie? Data? Marvin? Vivian? (Okay, I just don't like the name Vivian for guys.)
  • by DeepDarkSky ( 111382 ) on Wednesday January 12, 2005 @01:16PM (#11336806)
Depends on your definition of robot. I think these questions are only applicable to "sentient" robots or robots with advanced artificial intelligence. Most "robots" as we call them today do not qualify, so none of these questions are applicable.

    We, as humans, should stop trying to play god to create sentient beings. Robots as tools are much more useful to us. But, you say, you want something that can independently think and do stuff for us. What you are looking for here are "slaves". Beings that can do their own things but still obey you.

Why do we even bother with all of this? If you don't make a super-intelligent robot that can learn and independently think, then you don't need the 3 laws. You don't need to worry about the robots killing all humans and taking over the world. All of these are problems that sci-fi says we will be afflicted with because we want to play god and be lazy.

    We are doomed.
  • Should the robots push or shove [jonathonrobinson.com] down the stairs?
