Robotics Medicine Technology

Should a Service Robot Bring an Alcoholic a Drink? 162

An anonymous reader writes: We've come to a point where care robots are being used to assist people with illnesses and mobility problems. They can bring medicine, a glass of water, food, and other items that a person may have trouble getting to on their own. But what limits should we set on these robots? Should they be able to deliver alcoholic beverages? If so, should they refuse to serve them to certain people, like children or alcoholics? The issue is complicated further because these robots may have been purchased by the patient, by the doctor or hospital (which sent it home with the patient to monitor their health), or by a concerned family member who wants to monitor their relative. The latest poll research by the Open Roboethics Initiative looked at people's attitudes about whether a care robot should prioritize its owner's wishes over those of the patient.
This discussion has been archived. No new comments can be posted.

  • by Doug Otto ( 2821601 ) on Wednesday February 25, 2015 @12:05PM (#49127677)
    Yes and make sammiches.
    • As long as you don't demand that one provide you with sex...

      But then, that's a whole other ethical bucket of fish.

      • But then, that's a whole other ethical bucket of fish.

        And/or sausages.

      • I seriously hope that robot sex is more sophisticated than just a bucket of fish, ethical or not.
      • by bill_mcgonigle ( 4333 ) * on Wednesday February 25, 2015 @12:50PM (#49128189) Homepage Journal

        As long as you don't demand that one provide you with sex...

        But then, that's a whole other ethical bucket of fish.

        Nobody thinks there's an ethical problem with me "forcing" my lawnmower to spin its blade and murder the grass, or torturing my refrigerator by chaining it to a wall and making it go "brrrr" all day.

        Machines do what their owners want, end of story - there are no ethical issues unless they affect other people.

      • So, tell me ... what are the ethical issues on, say, a dildo?

        Do you need consent from the dildo? Does the dildo need to be of the age of consent?

        If you wanted to hump your pillow, are there ethical issues?

        Seriously, we're talking about machines. They're tools, they don't have legal rights.

        Until such point as we're talking about things which meet some (as yet undefined) level of sentience ... I think the issue of discussing the 'ethics' of how you use an inanimate object is complete and utter crap.

        • by TWX ( 665546 )
          Good, you listened that it's always, "a dildo," and never, "your dildo."
    • by antdude ( 79039 )

      You forgot to use the sudo command [xkcd.com]!

  • Fridge door handle (Score:5, Insightful)

    by TerryC101 ( 2970783 ) on Wednesday February 25, 2015 @12:06PM (#49127697)
    The robot shouldn't be tasked with this judgment any more than the latch on a fridge door should be asked to keep you on your diet.
    • A latch? Try using a bear trap. It's much more effective.

      The original question is so silly, it leaves me at a complete loss at the moment. What kind of crap is this? Do we have alcohol vending machines now?

      • by Kjella ( 173770 ) on Wednesday February 25, 2015 @12:17PM (#49127829) Homepage

        Do we have alcohol vending machines now?

        Actually, when I studied in Germany, the school cafeteria had beer in the soda vending machine. Good times.

      • Do we have alcohol vending machines now?

        Yes.
        http://espn.go.com/mlb/story/_... [go.com]

      • by JaredOfEuropa ( 526365 ) on Wednesday February 25, 2015 @12:22PM (#49127883) Journal
        They do in Tokyo; I saw a vending machine there dispensing bottles of single malt whisky. The interesting comparison with my own country isn't that they have these and we don't, but that over here such a machine would not make it past the first Friday night before being trashed and robbed.
        • Yeah, well, Japan doesn't count. Over there, they might as well have a vending machine corollary for Rule 34 ("there is a vending machine for it -- no exceptions").

        • by Lennie ( 16154 )

          Vending machines in Japan seem to carry pretty much anything. It's surprising what they can come up with.

          Like vending machines with live crab.

      • Do we have alcohol vending machines now?

        Now? We used to have them in the USAF in the barracks. Beer only, but still, there it was.

        They got rid of them in 1988-89, IIRC.

        • by Cenan ( 1892902 )

          It's not too long ago that they got rid of the light beers in the vending machines at LEGO; too many drunk truck drivers after lunch, I guess. I am unsure if there was ever regular beer in them.

      • What kind of crap is this? Do we have alcohol vending machines now?

        I think Japan has had them for years.

        How prevalent they are and where ... I have no idea.

    • by geekmux ( 1040042 ) on Wednesday February 25, 2015 @12:17PM (#49127831)

      The robot shouldn't be tasked with this judgment any more than the latch on a fridge door should be asked to keep you on your diet.

      Yes, slippery slope indeed. If we humans have a hard time discerning if a person is technically "obese" (as rated by the ever-popular BMI scale), how exactly is the robot supposed to tell the difference and not serve the fat human too much food? What happens when the diabetic is served too much sugar? Who's liable? Far too much sue-the-manufacturer bullshit going on to eliminate that risk altogether. We have a long way to go with liability reform before this ever comes to light, which is sad. Yet again technology stifled by greed and politics.

      • by Cenan ( 1892902 )

        I guess, instead of trying to solve the generic problem, we could solve the specific problem. If the robot is caring for a diabetic, then configure it to not serve sugar. Don't try to make the robot discern whether one or the other action is the preferred one; tell it which is.

      • If we humans have a hard time discerning if a person is technically "obese" (as rated by the ever-popular BMI scale)...

        No, "we humans" don't. Only the kind of idiot doctor who doesn't understand logic.

        BMI started out as an index for use by doctors for calculating dosages. And it was pretty useful for that, though perhaps not ideal. But, they expanded its use to all kinds of different, inappropriate things. Things they should have known were inappropriate. Like obesity.

        Simple example: a body-builder's BMI is off the charts. If one used it to measure obesity, they'd be so far off as to very seriously endanger someone's

    • by DarkOx ( 621550 ) on Wednesday February 25, 2015 @12:18PM (#49127847) Journal

      Is it quite that simple? I think a machine should obey its owner to the limits of its capability to do so. For instance, your laptop should not let me unlock your desktop session, should it? Even if you left it with me in the meeting room while you went to get some water?

      It should however let you unlock it. Maybe if you have so configured it, I should be able to logon as guest and use a web browser but not install software or access your personal files.

      The care bot should be the same way. It ought to do what its owners tell it. If I buy a care bot to look after my elderly mother, I would want to generally program it to obey her instructions, but maybe I would want to put in a deny list and some event triggers: if the request includes "chocolate cake", kindly decline, remind her she is diabetic, and suggest it could whip up some nice meringues dusted with cocoa powder if she really wants chocolate.


      • by itzly ( 3699663 )

        For instance your laptop should not let me unlock your desktop session should it?

        If I told you the password, why not ?
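
      A minimal sketch, in Python with purely hypothetical names, of the deny-list-and-triggers configuration DarkOx describes above: the bot obeys the patient by default and only declines requests that match an owner-supplied trigger phrase, so the judgement lives in the configuration rather than in the robot.

        # Hypothetical sketch, not any real robot API: an owner-configured deny list
        # with a per-item polite refusal. Everything else is served as requested.
        DENY_LIST = {
            "chocolate cake": "Sorry, that's off the menu; remember your diabetes. "
                              "How about meringues dusted with cocoa powder instead?",
            "beer": "Your medication doesn't mix with alcohol. Juice instead?",
        }

        def handle_request(request: str) -> str:
            """Serve the request unless an owner-configured trigger phrase matches."""
            lowered = request.lower()
            for trigger, polite_refusal in DENY_LIST.items():
                if trigger in lowered:
                    return polite_refusal
            return "Fetching {} now.".format(request)

        print(handle_request("a slice of chocolate cake"))  # declined, with a reminder
        print(handle_request("a glass of water"))           # served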

    • The ethical conundrum about robots serving alcoholic drinks. This is a very American issue.
      1. Having a robot serve drinks seems to be a superficial luxury. Just so they don't have to get off their asses and get a beer out of the fridge? Or will they be used to save money at a big expensive party, where the robots will be more than mobile tables? However, will it save money? Probably not, as people would probably like the interaction with the hostesses. Or would it just be a robotic bartender programmed to m

    • by Altus ( 1034 )

      It's not about judgement, it is about programming. We are not asking the robot to make a judgement call; we are asking whose judgement the robot should follow.

      If I buy my elderly grandfather a caretaker robot and I program the robot to bring him juice but not beer (because I know he shouldn't be drinking due to meds), what should it do when he asks for a beer? I would say that the robot should obey the wishes of the owner, not the patient in this case, but it should probably not prevent the patient from gett

    • by AK Marc ( 707885 )
      So robots should not be programmed with the 3 laws? Because this looks to me to violate the first law. Giving an alcoholic alcohol causes harm.
      • So robots should not be programmed with the 3 laws?

        Hell no!!! Did you even read the book? The whole point of just about every book by Asimov was that you should never, ever, under any circumstances create robots that follow the Three Laws of Robotics.

    • by plopez ( 54068 )

      For a very hard-core alcoholic, NOT drinking can be very dangerous. If said alcoholic quits, or even reduces, alcohol intake, seizures can result, causing serious damage. The only way to properly handle it is by proper detox.

  • What about shots to a pregnant person?
    • Surprisingly legal... There's even a US State or two that has a statute prohibiting discrimination by refusing to serve the little missus.

      That's about right, huh ladies? Pregnant and, now, no drinking either!

  • by l0ungeb0y ( 442022 ) on Wednesday February 25, 2015 @12:10PM (#49127727) Homepage Journal
    If the robot is owned and operated by a person or organization other than the patient, then the patient should have no say. I fail to see the point here. If anything, it would be very difficult to create a robot that could determine if someone is drunk, much less is a drunk. So most likely, any nursing robot would refuse to serve booze to any patient since that would be a far easier option to implement.
    • by Rei ( 128717 ) on Wednesday February 25, 2015 @12:42PM (#49128075) Homepage

      The summary did call the person in question the robot's owner.

      I think the robot should obey the owner's wishes and get them the drink. But it should sigh audibly when asked to and mumble under its breath while giving it to them. Maybe occasionally snipe at them in a passive-aggressive manner. "Should I cancel all productive activities that you had scheduled on your calendar for today?" "Would you like vodka in a glass or should I set it up as an IV drip into your arm?" "Would you like me to make a bunch of regrettable drunken Facebook posts for you, or would you rather do it yourself?"

      • by hawkfish ( 8978 )

        The summary did call the person in question the robot's owner.

        I think the robot should obey the owner's wishes and get them the drink. But it should sigh audibly when asked to and mumble under its breath while giving it to them. Maybe occasionally snipe at them in a passive-aggressive manner. "Should I cancel all productive activities that you had scheduled on your calendar for today?" "Would you like vodka in a glass or should I set it up as an IV drip into your arm?" "Would you like me to make a bunch of regrettable drunken Facebook posts for you, or would you rather do it yourself?"

        "Here I am, brain the size of a planet..."

      • Don't they already have an exercise app on the iPhone that gets all passive-aggressive when you don't use it?
    • it would be very difficult to create a robot that could determine if someone is drunk

      That's a solved problem.

      http://gmailblog.blogspot.com/... [blogspot.com]

    • ...So most likely, any nursing robot would refuse to serve booze to any patient...

      The title of this thread speaks about an alcoholic.

      An alcoholic is not, by definition, a patient. So a nursing robot could refuse to serve alcohol to any patient, but that still does not answer the question posed by this thread's title.

    • Something to look forward to in my old age - to be treated like a child by a robot proxy. Where's the anonymous right-wing troll when we need him?

  • by ArcadeMan ( 2766669 ) on Wednesday February 25, 2015 @12:10PM (#49127729)

    Fry: "Why would a robot need to drink?"
    Bender: "I don't need to drink. I can quit anytime I want!" (burp!)

  • ... people will serve robots alcohol. Or the robots will just take it.

    I know it's true because I saw it on a Fox [cc.com] TV channel.

  • by colenski ( 552404 ) on Wednesday February 25, 2015 @12:13PM (#49127763) Homepage
    The incredibly underrated Robot and Frank [imdb.com] explores this theme in a crime caper, wrapped in a buddy movie, wrapped into a science fiction story, wrapped in Asimovian robot philosophy. Well worth the time.
    • Fox's cancelled Almost Human did the same from the cop's point of view. Very Asimov.

      It was decent, but I knew it was doomed from the start due to Fox. Came for Karl Urban but stayed for Michael Ealy. The rest of the supporting cast was effectively flat, though.

  • Shouldn't they always serve the operator? I mean, if it didn't, I would exchange it for one that does. Now, if it is a free robot, well, just don't get caught hacking it.

    • Agreed. Also, with things like a robot dog [theonion.com] there could be other things you could do with them that perhaps a regular service animal shouldn't do in our society.
    • Shouldn't they always serve the operator?

      yes, probably, maybe, sometimes, usually - there can be complex extenuating circumstances.

      Robot & Frank [rottentomatoes.com] is a pretty good SciFi exploration of such issues. I think the writers imagine the likely future trade-offs well.

  • Guys always wanted robots that can fetch a beer from the fridge. The "beer test" is more popular than the Turing Test.

    Removing that feature is like removing flying from a flying car.

  • Oblig. (Score:5, Funny)

    by Megahard ( 1053072 ) on Wednesday February 25, 2015 @12:15PM (#49127791)

    Human: "Get me a drink"

    Robot: "I'm sorry,sir, you've had too many."

    Human: " Sudo get me a drink"

    Robot: "Ok"

    • by davidwr ( 791652 )

      Human: "Get me a drink"

      Robot: "I'm sorry,sir, you've had too many."

      Human: " Sudo get me a drink"

      Robot: "Ok"

      Robot: And stop calling me Sudo.

  • What if I wear a glove and serve alcohol to a minor? Are the gloves responsible? The same with a robot.
  • Yes (Score:5, Insightful)

    by nitehawk214 ( 222219 ) on Wednesday February 25, 2015 @12:16PM (#49127819)

    In my state bartenders are legally obligated to not serve "visibly drunk" patrons. Though only the nicer bars actually follow this rule, and it is more in place so they can easily boot out unruly drunks or bar entry for people that are already wasted before they show up.

    A robot bartender in a commercial environment would either need to be able to follow all the same rules or be operated by someone that does.

    The question is... If you are in your own home, does the robot count as a bartender, or is it an appliance? My guess is the latter, the responsibility belongs to the operator.

    Though it would be amusing to see the door to the refrigerator refuse to open for a drunk person.
    "I'm sorry Dave, I think you have already had enough to drink."
    "Hey buddy, can you come in to my house and open my fridge for me?"

    • by Kjella ( 173770 )

      The question is... If you are in your own home, does the robot count as a bartender, or is it an appliance? My guess is the latter, the responsibility belongs to the operator.

      Liquor licenses apply just to the sale of alcohol; if I'm at a private party and mix a round of drinks, I don't need to follow any regulations except those that generally apply, like not serving alcohol to minors. And if a minor orders it from the robot, I shouldn't be in any more trouble than if they go to my fridge and grab one. I guess they could require "alcohol lockers" the way they do "gun lockers" around here, but we're not there yet.

      • Re:Yes (Score:4, Funny)

        by nitehawk214 ( 222219 ) on Wednesday February 25, 2015 @12:50PM (#49128185)

        Agreed, I also believe a professional bartender serving at a private party does not have these rules either.

        Either way we are years away from a completely automated robot serving in a public location without human supervision.

        Currently we are more likely to end up with this [imgur.com].

    • A bartender is under no obligation to refuse an alcoholic a drink. An alcoholic might arrive in the bar, 15 years sober and ask for a drink. How is the barman to know? Either way, it's not illegal for him to serve the alcoholic.

      Robots should be programmed to obey the law. There is no law against serving alcoholics; there is a law against serving people underage.

      Your point seems to be different to the topic of the article, which isn't about serving people already dangerously drunk.

      • No, robots should be programmed to do their function, within the constraints of who is authorized to operate it.

        If we start having robots which are trying to interpret the law ... we need to immediately destroy them, because we'll have invented robot lawyers, and the world will be ending.

        How about we just don't try building robots which are smarmy assholes?

      • It has nothing to do with being an alcoholic. The law is about someone who visibly appears drunk. And the bar can be held liable. [phillytriallawyer.com]

        Which I think is bullshit; it isn't the bar's fault, especially considering how many people can look perfectly normal but be too drunk to drive. But that is probably just a part of PA desperately clinging to Prohibition.

        • The Article, and thus the original topic of discussion, is about serving an alcoholic, not serving drunk people.

    • by mjwx ( 966435 )

      A robot bartender in a commercial environment would either need to be able to follow all the same rules or be operated by someone that does.

      A robot bartender will be programmed with a set of rules that will look very similar to the rules set out for human staff.

      It will be the same with a robot carer. They will be programmed to the same parameters as a human carer would receive. So someone with chronic liver issues will be denied alcohol, someone who has mobility issues will be able to order the odd she

      • The difference between a human and robot bartender is that when you've had a few too many the robot bartender can not be bargained with, reasoned with, it doesn't feel pity or remorse or fear and it absolutely will not stop until you've been served a diet soda.

        Being served diet soda has never been so frightening before.

        Robots: conquering the human race... with diet soda.

  • Comment removed based on user account deletion
  • by gman003 ( 1693318 ) on Wednesday February 25, 2015 @12:23PM (#49127907)

    Making decisions like this requires consideration of the consequences, which is the very definition of sapience.

    If the robot is non-sapient, but simply has a configured list of users who it may or may not serve alcohol, the decision was made by the person who configured it. This would be an acceptable solution, although cumbersome and inflexible. Probably wouldn't work well enough for public bartending, but a robo-butler could work this way.

    If the robot is sapient, it would be capable of making such decisions on its own. In fact, you might see robots refuse to serve alcohol at all, claiming moral reasons. On the other hand, you might see libertarian robots refuse to *not* serve someone alcohol, if they value people's right to self-determination. This would also be acceptable, but we are nowhere near this level of AI.

    If the robot is non-sapient, but still expected to identify children and alcoholics on its own, problems will result. Detecting children is possible, with some false-positives (it's hard to tell a 20-year-old from a 21-year-old by appearance) and false-negatives (dwarfs/midgets/little people/hobbits/whatever the current PC term is), but how do you detect an alcoholic by their appearance?

    The obvious solution for non-sapient robots requiring more flexibility than simple whitelists/blacklists, since alcohol is already a controlled substance, is to have robots require you to present ID for alcohol, and perhaps add a feature to IDs to show "recovering alcoholic, do not give alcohol" if we decide that's something that's important. Then again, we've not felt the need for that yet, with human bartenders, so maybe this whole debate is over something we've already as a society decided isn't an issue.

    • by qwijibo ( 101731 )
      I couldn't help reading that as "librarian robots" and wondering if libraries are so underutilized these days that they need to become like a Starbucks with alcohol to keep people coming in.
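
    A minimal sketch, in Python with hypothetical names, of the non-sapient allow-list approach gman003 describes: whoever configures the robot makes the decision, and the robot just looks names up (defaulting to refusal for anyone not listed).

      # Hypothetical sketch: the whole alcohol policy is two lists set by the configurer.
      ALCOHOL_ALLOWED = {"alice", "bob"}   # adults the owner has cleared
      ALCOHOL_DENIED = {"carol"}           # e.g. a minor or a recovering alcoholic

      def may_serve_alcohol(user: str) -> bool:
          """The robot exercises no judgement of its own; the lists are the policy."""
          user = user.lower()
          if user in ALCOHOL_DENIED:
              return False
          return user in ALCOHOL_ALLOWED

      for name in ("Alice", "Carol", "Dave"):
          print(name, "->", "serve" if may_serve_alcohol(name) else "refuse")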
  • Should a robot serve alcohol. Yeah, that's right at the top of society's ethical priority list, just a notch below if Robots should look like people.
    • The fact that this is where we are in the discussion of robots makes me even more skeptical of how much of a role robots are going to play in the next 20 years. It would have to be a pretty sophisticated robot to even understand the decision factors. If I ask a robot to bring me alcohol, does it know I intend to consume it? What if I mean to bring the bottle of wine as a gift? If I ask the robot to bring me motor oil, should it assume I might drink it? If I ask it to kill a chicken for dinner will it comply? If I
  • by gurps_npc ( 621217 ) on Wednesday February 25, 2015 @12:31PM (#49127975) Homepage
    Robots that disobey their owner would be dramatically wrong on multiple levels.

    At the same time, their owners should be legally responsible for the orders they give the robot.

    So if the owner can effectively order the robot to selectively serve alcohol only to adults that are not already intoxicated, then the robots should serve alcohol.

    If the robot can not make that determination, then it should not be allowed to serve alcohol.

  • I would be all for them just because they could be rigged to shoot a flaming jet of alcohol at robbers.
  • Detecting drunk people is the tough part.
    Use mass spec for instant breathalyzer?
    Analyze behavior?

    In the long run it's like self-driving cars: eventually it'll be so much better than human-operated that insurance companies will lobby to mandate it.

  • This is kind of a silly question.
    If development has taught me anything, it is that you can't account for every use case, and unless you're releasing something particularly tiny, you can't have a contingency for every foreseeable misuse.
    Easiest solution: The robot provides one drink for one coupon or ticket. It is up to the owner of the robot to control how those tickets are distributed and managed.
    Machine for the automated part, while a human handles the human problems.
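
    A minimal sketch, in Python with hypothetical names, of the ticket idea above: one valid ticket buys exactly one drink, and how tickets get handed out stays a human problem.

      # Hypothetical sketch: the robot only checks and spends tickets.
      class DrinkDispenser:
          def __init__(self):
              self.tickets = {}                  # ticket id -> still unspent?

          def issue_ticket(self, ticket_id: str) -> None:
              """Called by the (human) owner when handing someone a ticket."""
              self.tickets[ticket_id] = True

          def serve(self, ticket_id: str) -> str:
              """One valid, unspent ticket buys exactly one drink."""
              if self.tickets.get(ticket_id):
                  self.tickets[ticket_id] = False   # mark the ticket as spent
                  return "Here is your drink."
              return "Sorry, that ticket is invalid or already used."

      bar = DrinkDispenser()
      bar.issue_ticket("T-001")
      print(bar.serve("T-001"))   # served
      print(bar.serve("T-001"))   # refused the second time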
  • as long as I don't have to tip the robot
  • Is it here because there are robots in the story?

    If the purpose is to pose the question

    If so, should they refuse to serve them to certain people, like children or alcoholics?

    Then the answer is yes; after all, the person could just get the shit themselves, so what difference does it make if a robot gets it for them?

  • Cf. Isaac Asimov and his laws of robotics, for example "I, Robot" (not the unrelated movie of the same title). Whatever I've read so far by Asimov (not THAT much, I admit) centered around such robo-ethical questions and how to circumvent them.

    So, according to the robot laws:

    No, it SHOULD not, as it would endanger a human.
    Thinking this through to the end would mean that a robot should never serve any drug (down to coffee) to a human.
    But yes, it would serve alcohol to an alcoholic as he would be kept from checking

    • Take 5 minutes to read this short essay by Asimov [tufts.edu]; you won't be disappointed. Asimov was more than just the guy who wrote about fictional robot laws; for example, he was also a well-known skeptic. Not the modern anti-science kind, a real skeptic, spelt the old-fashioned way!

      None of it is about robot ethics; it's a metaphor about the folly of thinking that a list of rules, such as the ten commandments, could ever encapsulate all the vagaries of human morality.
      • Take 5 minutes to read this short essay by Asimov [tufts.edu]; you won't be disappointed. Asimov was more than just the guy who wrote about fictional robot laws; for example, he was also a well-known skeptic. Not the modern anti-science kind, a real skeptic, spelt the old-fashioned way!

        None of it is about robot ethics; it's a metaphor about the folly of thinking that a list of rules, such as the ten commandments, could ever encapsulate all the vagaries of human morality.

        Well, Kant with his categorical imperative managed that with even a single rule. (But he kind of cheated, as it was kind of recursive.)

        Besides, thanks for the hint. I know I should have read more Asimov (as I always liked what I read) but somehow never could quite adjust to the style.

        Naturally, the theories we now have might be considered wrong in the simplistic sense of my English Lit correspondent, but in a much truer and subtler sense, they need only be considered incomplete.

        But the important thing is to keep the basic humility and remember that, no matter if your current knowledge is incomplete or plain wrong, one day it will be amended or rectified. And by his own logic, the difference between remembering that and assuming your current theory is "right" in the simplistic way is much, much larger than the difference between "wrong" and "incomplete".

  • What if we extend this line of thinking to driver-less cars? If certain behavioral patterns are detected (such as driving to the liquor store 3x) should the car have a moral imperative to NOT take the person to buy booze?

    (As an aside, I'd feel better with the robot fetching a drink for an alcoholic than with the alcoholic driving themselves (this is before driverless cars, mind you) to the store to buy more.)

  • This is neurotic navel gazing. Take responsibility for your own actions, which includes getting drunk in the first place, because you know before the first drink that this will lead to suspension of judgement. If you choose to use a tool like a robot to get you a drink, that's your decision, even if it kills you. What next - a controlling nanny state that raises the drinking age to 21 or makes it illegal to jaywalk?

    • Agreed, what good is a robot maid that won't fetch a beer? However, if we follow your logic to the extreme, you see a drunk fall overboard on a party cruise and you do nothing because it's his own damned fault?
  • by cheetah_spottycat ( 106624 ) on Wednesday February 25, 2015 @01:07PM (#49128353)
    Apparently it's perfectly fine to send killer robots to murder random unwanted people around the globe at the command of a single person with no parliamentary control, no charges, no sentence, no judges, no jury, no defense, and against all governing international laws. But serving alcohol to its owner is a problem because, oh my god, it might not be healthy? ARE YOU FUCKING KIDDING ME?
  • I don't see a new issue here. Caretaker robots have more potential for risk and harm than a ladder does, but that's it. It's a matter of making sure liability law is up to snuff.

    For me the real problem is the liability generated by acting unsolicited on a patient's behalf. If a robot starts discerning what is good for a patient and acting on that basis, then the provider, the manufacturer, and perhaps the robot itself will become liable for the choices of the robot that turned out to be bad. If the liabi
  • Alcohol withdrawal can be fatal for alcoholics.
    http://en.wikipedia.org/wiki/A... [wikipedia.org]

    If you can make a robot able to understand the moral implications of its actions, then you won't need to program it with rules like this. If you can't, then regard the robot as a tool and regulate its use appropriately. Children are not allowed driver's licenses. Why would a responsible person allow a child to use a robot that could be fatal to the child?

  • Comment removed based on user account deletion
  • "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

    So no.
