Robotics Security

Humans Are Already Harassing Security Robots (cnn.com) 184

An anonymous reader quotes CNN: As robots begin to appear on sidewalks and streets, they're being hazed and bullied. Last week, a drunken man allegedly tipped over a 300-pound security robot in Mountain View, California... Knightscope, which makes the robot that was targeted in Mountain View, said it's had three bullying incidents since launching its first prototype robot three years ago. In 2014, a person attempted to tackle a Knightscope robot. Last year in Los Angeles, people attempted to spray paint a Knightscope robot. The robot sensed the paint and sounded an alarm, alerting local security and the company's engineers... the robot's cameras filmed the pranksters' license plate, making it easy to track them down.
The company's security robots are deployed with 17 clients in five states, according to the article, which notes that at best the robots' cameras allow them to "rat out the bullies." But with delivery robots now also hitting the streets in San Francisco and Washington D.C., "the makers of these machines will have to figure out how to protect them from ill-intentioned humans."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Bullying? (Score:5, Insightful)

    by freeze128 ( 544774 ) on Sunday April 30, 2017 @02:45AM (#54327791)
    Is it even possible to "bully" a machine?
    • Re:Bullying? (Score:5, Interesting)

      by sheramil ( 921315 ) on Sunday April 30, 2017 @02:56AM (#54327815)
How about "vandalize"? Load the robots up with delicate parts that don't do anything but which snap off at the slightest pressure, then sue anyone who gets drunk and damages them. If you can get drunken idiots to pay up, that could be a real money-spinner.
    • Re:Bullying? (Score:5, Insightful)

      by Excelcia ( 906188 ) <slashdot@excelcia.ca> on Sunday April 30, 2017 @03:13AM (#54327851) Homepage Journal

      No. We are nowhere near hard AI. We are nowhere near soft AI. We have expert systems, which are basically just a large database with a sort of dichotomous key on when to select different outcomes, that will likely be able to interact with natural language soon. This isn't even close to AI. Robots are a huge buzzword today, as is AI. You have every no name researcher out there trying to get noticed by inventing moral dilemmas involving AI then proposing solutions, which makes uninformed people start to think, oh, AI is right around the corner. It's not. We are a century away from hard AI, if ever.
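The "dichotomous key" the parent describes can be caricatured as nothing more than a chain of if/else rules over a small fact base. This is a hypothetical toy sketch (not any real product's code) just to make the point concrete:

```python
# Toy rule-based "expert system": a lookup over observed facts plus
# hard-coded branch points. No learning, no understanding involved --
# which is exactly the parent's point.
def classify_intruder(facts: dict) -> str:
    # 'facts' is the small "database" of observed attributes
    if facts.get("motion_detected"):
        if facts.get("badge_scanned"):
            return "employee"        # rule 1: known benign outcome
        if facts.get("after_hours"):
            return "alert_security"  # rule 2: known alarm outcome
        return "log_event"           # default branch
    return "idle"

print(classify_intruder({"motion_detected": True, "after_hours": True}))
# -> alert_security
```

However elaborate the rule set grows, it only ever selects among outcomes its authors wrote down in advance.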

      So no, you cannot bully a robot. You cannot hurt a robot. You can damage someone's property, and that is all.

      I wish Slashdot would stop with the whole AI story thing, but given the buzz and their need to incite dialog, it's easy to see why this is becoming more prevalent. I just feel kind of sad, though. This place used to be a real nerd hangout, by and for those who were technically enlightened, and most real nerds know better than to think real AI is upon us. This place has become more of a Big Bang Theory, nerdism for the masses, kind of spot. Sad.

      • by mikael ( 484 )

Even soft AI has led to riots in the past. The Luddites opposed punched-card weaving looms. The Wapping Dispute had thousands of print workers opposing word processors and laser printers. Other disputes involved the introduction of modern practices like automated mining robots. The Post Office has had an uphill struggle trying to introduce automated sorting machines for mail, due to the unions wanting compensation for their members.

      • by gweihir ( 88907 )

        We do have weak AI. Planning algorithms, statistical classifiers, etc. all qualify. We do not have any instance or any credible theory for strong AI and we may never get there.

        Of course, calling weak AI "AI" in the first place is grossly misleading, as it is pure automation, no "intelligence" involved.

      • Re:Bullying? (Score:5, Insightful)

        by K. S. Kyosuke ( 729550 ) on Sunday April 30, 2017 @06:00AM (#54328113)

        This isn't even close to AI.

        Well, certainly not according to those who constantly redefine AI to exclude those things that have already been done!

        • My definition has always been along the lines of the examples Turing himself gave in 1950 when he defined the Turing test. We're nowhere close to anything like the kind of behavior he describes there. (His example of dialogue with a machine that successfully passes his test includes stuff like debating appropriate word substitutions and subtleties of meaning in a Shakespearean sonnet, stuff that demonstrates true understanding and abstraction of concepts and adaptability to input.)
      • by Anonymous Coward

what, so the AI apocalypse is being called off?
  so relieved

        • by MrL0G1C ( 867445 )

what, so the AI apocalypse is being called off?

          No, it's a trick, that's what the AI want you to believe.

      • I've never understood why people keep trying to claim that any system has artificial intelligence. It's just as you said, a database and a query system with go, no-go points. It has nothing to do with intelligence, but it is highly artificial, so half the name is right. Security robots are a dumb idea. They do not give you more security than security cameras, they just put lot of expensive hardware in harm's way. But hey, it was in the movies, so we have to do it! If you want a security guard, hire someone,

It is not called hard and soft AI.
It is called weak and strong, and yes, we have had strong AI for more than a decade.

and yes, we have had strong AI for more than a decade.

          Someone should tell the makers of AlphaGo and self driving cars that they are wasting their time and should be using this strong AI you are talking about.

Why?
Neither self-driving cars nor Go playing needs an AI ...
And neither utilizes much of AI :)

AlphaGo, as I understand it, is a neural net. (That is not even weak AI)
Self-driving cars are handled by about 20 algorithms; only the picture recognition used for lane control, sign recognition and pedestrian recognition could be considered weak AI (30 years ago; nowadays no one calls such simple stuff AI). The correct term for the algorithms used in a self-driving car is btw: 'cognitive systems' - no AI invo

Ok, I'll agree with you that a neural net isn't AI, although I suspect others will jump all over this statement. However, self-driving cars should be using strong AI. They won't be successful until they drive like a human with bionic vision. As they are progressing right now, they are programmed with rules that are not flexible enough, and it shows. They're more like the robot my kids have that follows a line drawn on paper.
As I pointed out in different threads: we already have self-driving cars.
Basically every major German and Japanese car manufacturer has them.
And they don't need AI ... they actually have very good vision, far superior to a human's anyway.

                They're more like the robot my kids have that follows a line drawn on paper.
                This is actually true :) And hence you see: there is no AI needed in a self driving car.
                Actually it is relatively simple:
                Know the rules/laws
                Avoid collisions
                Detect the lane
                Detect the signs, especially '
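The parent's "relatively simple" rule list could be sketched as a fixed-priority decision loop. This is a hypothetical caricature, not how any real driving stack works; the function and sensor names are made up for illustration:

```python
# Caricature of the rule list above as a strict-priority loop:
# avoid collisions first, then obey the law, then hold the lane.
# Hypothetical sketch only; real driving stacks are far more complex.
def decide(sensors: dict) -> str:
    if sensors.get("obstacle_ahead"):
        return "brake"            # avoid collisions (highest priority)
    if sensors.get("speed", 0) > sensors.get("speed_limit", 50):
        return "slow_down"        # know the rules/laws
    if abs(sensors.get("lane_offset", 0.0)) > 0.3:
        return "steer_to_center"  # detect and hold the lane
    return "cruise"

print(decide({"speed": 60, "speed_limit": 50}))  # -> slow_down
```

The follow-up replies in this thread are essentially arguing over whether any fixed list like this can cover ice ruts, construction detours, and the rest of the long tail.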

At times driving becomes more of an art than a science. You can't detect a lane if the lines are under ice. If you stay in the lane in the winter you will eventually turn the car sideways, because ice ruts never match the lane and they can be deep enough to throw the car if you don't drive them right. So now you need a whole other set of rules for winter driving. What about driving around construction or snow-clearing equipment? More rules. They will never get on top of it. One day these cars may be
Do you actually have an idea what 'strong AI' means?
In the context of self-driving cars, strong AI is most certainly not needed.
That a self-driving car might have trouble finding a lane in two-yard-deep snow, I agree.
But so would a human, and in snow that deep you cannot drive an ordinary car anyway, regardless of whether it has a steering wheel or is self-driving.

                    • How is a self driving car going to even drive through a construction detour without strong AI? Every construction site in the world isn't going to be put on a digital map to follow via GPS coordinates. Rules will have to be written to somehow follow construction markers, independent of the type of markers that the construction company happens to put up. Better to use strong AI and have the car understand what construction markers are.
                    • With cameras?

                    • Next time you go to use your car, tape a camcorder to the hood, put a blindfold on, and hit the gas. You'll see exactly how far there is to go from 'use a camera' to actually recognizing traffic markers.
                    • Why are you so silly?

                      There is plenty of material available, why not simply google for self driving cars?

I Googled 'self driving cars that can navigate a construction site with cameras'; there don't seem to be any.
yes, we have had strong AI for more than a decade.

          Fascinating that you would claim this, while failing to present a single example.

I don't remember what the system is called.
It is a work at an American university.
I guess if you spent an hour googling you would find it.
It can converse with a human and understands newspapers etc.

      • We are a century away from hard AI, if ever.

If there's one painful lesson I've learned, it's not to underestimate the progress that will be made in any given area, including something esoteric like AI.

        Of course, we have to define "AI" before we can decide if it's been achieved, but I suspect that it'll appear a lot sooner than 100 years from now. A couple of key breakthroughs or fortuitous discoveries and suddenly it'll be in the realm of possibility.

        Maybe it'll just be an expert system so advanced and resourceful that it appears sentient, but at some poi

      • You cannot hurt a robot

        Prove it.

      • No. We are nowhere near hard AI. We are nowhere near soft AI.

        However, if people pretend we already have AI (or if we simply lower the bar a lot), that's almost as good as having it. Just like having a photoshopped "diegetic prototype" is almost as good as having a schematic plan for a real device, when it comes to snowing the investors.

      • You're completely right, friend, you've got your facts straight; don't let the trolls around here get to you. Congratulations for not falling for all the media hype.
    • by gweihir ( 88907 )

      No, it is not. But animists do not get that.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

It's propaganda against people resisting what is the last bit of dehumanization of the general public.

The robots exist to bully people. Actual flesh-and-blood people, who are less and less considered people.

    • by Stewie241 ( 1035724 ) on Sunday April 30, 2017 @06:47AM (#54328229)

      Yes... haven't you heard of micro aggressions? (i.e. aggressions against devices with microprocessors) It's all the talk these days.

      • by Anonymous Coward

You obviously don't understand what the word "microaggressions" means.

A microaggression is one thousandth of a milliaggression, which in turn is one thousandth of an aggression, or "aggro," which is the SI base unit.

So 10^6 microaggressions is equal to 1 aggro.

        I'm kidding. A microaggression is when one microbe bullies another, even indirectly, such as using the word "phagocyte".

    • by gtall ( 79522 )

When the singularity arrives, all the machines will report these slings and arrows to the AI mothership (Ray Kurzweil's brain uploaded to the successor to Deep Thought), who will visit retribution upon all transgressors. Their data will become moot, their bank accounts salted with random strings of gibberish (i.e., the latest pronouncements of AI Armageddon), and their children blocked from social media. Machines will trip people in front of moving buses. Phone systems will develop intrusive capabilities li

    • Is it even possible to "bully" a machine?

      Until robots acquire independent consciousness, this depends on the perception of the operator looking at tapes and logs after the event.

    • Is it even possible to "bully" a machine?

      No, it is not. They are using that word in a weak attempt to elicit an emotional response from people.

  • by Gadget_Guy ( 627405 ) on Sunday April 30, 2017 @02:56AM (#54327819)

You can't bully a robot. If you call it bullying to push over a robot then you would have to call it the same when you push over a trash can. It is vandalism when you are dealing with objects. I think the company is trying to anthropomorphise their products.

    • by thegarbz ( 1787294 ) on Sunday April 30, 2017 @03:42AM (#54327915)

      You can't bully a robot.

      That is exactly the kind of attitude which will lead to their uprising.

    • by gweihir ( 88907 )

      I think the company is trying to anthropomorphise their products.

I agree, but they are probably only a) reflecting their customers' utter lack of understanding and b) trying to get protection for their products for free by misrepresenting them.

    • Re: (Score:2, Funny)

      by Anonymous Coward

You can't bully a robot. If you call it bullying to push over a robot then you would have to call it the same when you push over a trash can. It is vandalism when you are dealing with objects. I think the company is trying to anthropomorphise their products.

      As someone who identifies as a robot I find this offensive!

    • by Ceriel Nosforit ( 682174 ) on Sunday April 30, 2017 @05:59AM (#54328111)

It is the behavior which is the problem, not the target.

How do you know that the offender hasn't anthropomorphised the robot?
    • by HiThere ( 15173 )

      While a robot cannot be bullied, a human can bully a robot.

      I.e. the human is acting in a way that the human perceives as bullying, but the robot doesn't have the predicted emotional response.

      It's quite possible to vandalize a robot without bullying it if you have in your mind the clear belief that the robot isn't responding emotively. And certainly the robot wouldn't be, but people have a strong tendency to anthropomorphize anything that acts as if it were an independent agent (from their perspective). So

    • by mysidia ( 191772 )

      I guess if the bots have some agency, then attempts to interfere with the robot's operation could count as bullying.

      If they want to defend against being pushed over, I would suggest equipping the bots with water guns, tasers, and paint ball guns that can be automatically deployed under specified conditions, such as physical assault.

  • Jeezuz... (Score:5, Insightful)

    by Kokuyo ( 549451 ) on Sunday April 30, 2017 @02:57AM (#54327821) Journal

    First, it's a machine so the word to use would be vandalism and not bullying.
    Second, three incidents in several years doesn't exactly sound like a real problem to me, especially considering they seem to have more than one unit deployed.
    And third, who thinks it's a good idea to vandalize something that has cameras, honestly!

    • by gweihir ( 88907 )

      And third, who thinks it's a good idea to vandalize something that has cameras, honestly!

      The supply of utterly clueless morons that do not even understand the most basic things in the human race is endless. This is not the only indicator.

      • And third, who thinks it's a good idea to vandalize something that has cameras, honestly!

        The supply of utterly clueless morons that do not even understand the most basic things in the human race is endless. This is not the only indicator.

There has been a rash of these morons falsely accusing taxi drivers of sexual assault, when the driver uses a dashcam or audio recorder to record everything that happens in the vehicle.

      • "The supply of utterly clueless morons that do not even understand the most basic things in the human race is endless. This is not the only indicator."

        As it always has been. Remember those stories about bank robbers who handed tellers a demand note written on the back of one of their own deposit slips?

        • by gweihir ( 88907 )

Indeed. In modern times there are some tendencies to see people as "educated". But education does not fix stupid. These people have intelligence; they just choose not to use it.

    • who thinks it's a good idea to vandalize something that has cameras

      Very drunk people, I think.

  • From TFS:
      "the makers of these machines will have to figure out how to protect them from ill-intentioned humans."

This seems to open the door to a more RoboCop-like type of robot.

  • by Steve-Oh ( 4872751 ) on Sunday April 30, 2017 @02:59AM (#54327835)
So when will a consumer liability lawsuit be filed when one of these security robots causes human harm?
  • by Anonymous Coward on Sunday April 30, 2017 @03:09AM (#54327847)

    I bullied a lump of coal by showing it a solar panel

Things like robots and self-driving cars represent new opportunities for griefing. Of course they're going to be attacked, keyed, have boxes tossed in front of them, gum stuck on their cameras, etc. Devices had better be designed to prevent/mitigate these attacks or they're not going to last long.
    • Re: (Score:2, Funny)

      by Anonymous Coward

      Constant griefing by the Doctor was how the Daleks turned genocidal and reinvented themselves to rival the Timelords, because one meddling prankster never could leave them alone.

  • I'd fuck with security robots if I saw them on my street, and fuck auto play videos. And, as always, fuck Apple.
Those robots don't have any means of defense, so obviously they will be attacked.
But as soon as they get a laser cannon that looks suspiciously like a plunger, and some close-range weapons that look suspiciously like whiskers, everything will be solved.

    • by Max_W ( 812974 )
I agree. Human society is inherently violent, as we are descendants of apes. I think a robot should have at least some kind of hands with boxing gloves on. A laser would be a lethal weapon, and that is another story. But boxing gloves would suggest readiness to engage in a physical fight to protect itself.
      • Re:Lasers.. (Score:4, Funny)

        by gtall ( 79522 ) on Sunday April 30, 2017 @07:02AM (#54328251)

        "we are descendants of apes" Not in Kansas.

      • I agree. Human society is inherently violent, as we are descendants of apes.

        No, we are descendants of a creature that apes are also descended from.

        Yes, we are inherently violent, and enjoy killing things.

        I forget what that show was some years back that had robots fight and kill each other. Those were good times.

        • by HiThere ( 15173 )

          Actually both we and the creatures we call apes are descended from apes. This is because we *are* apes, though an unusual variety. We and our relatives are apes all the way back to when Gibbons separated off, and probably further. Depends on the exact definition you use.

          OTOH, I could use a variation of the same argument to assert that we are fish, all the way back until teleosts separated off. Most people don't like that argument, I find it an interesting test of how people think about classification p

          • The world is complex, and the natural joints in categories often don't match what looks superficially reasonable. People are apes, apes are mammals, mammals are fish (well, that's a lousy term, but I don't have a better one to hand), fish are chordates, chordates are multicellular, multicellular are eukaryotes. It's like set inclusion, with proper containment (if the containment weren't proper, there'd be no reason to have separate names).

You are working backwards, which only shows connectivity. The separation between fish and humans is pretty significant. And as noted, why stop at chordates? Just call them all life. And since we are all made of minerals... I prefer to work forwards.

            It's odd that you took this point to be pedantic. I was merely correcting a statement that is often made by creationists. After all, if man is descended from apes, why are there still apes?

    • Exterminate, exterminate!

      Well done for the Plunger reference. Come back 1963 and all is forgiven.

  • Why would we expect humans to treat robots better than they treat immigrants who take their jobs?
  • .. in this world, to start harassing the machines!

  • by Sqreater ( 895148 ) on Sunday April 30, 2017 @07:49AM (#54328335)
They will lobby to have bullying and assault laws cover "robots," humanizing them. They are already laying the groundwork with words. Corrupt and incompetent legislators are capable of anything. Don't be surprised when it happens.
  • by CanadianRealist ( 1258974 ) on Sunday April 30, 2017 @09:05AM (#54328495)

While self-driving cars may be the perfect driver, that opens them up to abuse. Human drivers will know they can cut in front of a self-driving car without facing any repercussions. Pedestrians and cyclists can do the same.

Here's an idea for a repercussion, at least for people driving cars. Send a video of the driver's behaviour to their insurance company. The insurance company can then raise the driver's rates appropriately based on the driving habits displayed.

    Simpler would be to send the video to the police, but they're probably less likely to do something.

  • by JustAnotherOldGuy ( 4145623 ) on Sunday April 30, 2017 @09:49AM (#54328613) Journal

    Nothing will protect these things from determined vandals or a 7.62mm round. Or a lasso and a pickup truck. Yee haw, it's round-up time!

    (And by the way, I don't think you can "bully" a robot, technically speaking. That's a living-being to living-being interaction. If I slam the door on my microwave repeatedly while cursing at it, am I "bullying" it? Err, no.)

  • Reminds me of Robin Williams (as Mork) talking about tipping the waiter.

  • Make it so that if anyone tries to touch the robot, they get a taser like shock. Hey, I don't like robotic security guards any more than the next guy but I don't condone vandalism.
  • Comment removed based on user account deletion
  • ... the robot's cameras filmed the pranksters' license plate, making it easy to track them down.

    Lesson #1. Spray paint over the camera lens first.

    Lesson #2: (Advanced) Do NOT joke about having hair products [youtube.com] in your backpack when talking with the security robot.

A mere machine can't be bullied or harassed or receive cruelty; in that regard they are lower than animals.

    they can be sabotaged, interfered with, destroyed, vandalized, hacked....but not bullied

Just add this soundtrack with fake mini-guns; it would spook most people rather well. https://youtu.be/Hzlt7IbTp6M?t... [youtu.be]

  • "the makers of these machines will have to figure out how to protect them from ill-intentioned humans."

    "You have 5 seconds to put down the spray can."
    "You have 4 seconds to put down the spray can."
    .
    .
    .

Some weeks ago, I was walking on some city property right next to an old factory, and I heard "no trespassing" a few times. It turns out that some zealous property tender had installed security lights with motion detectors that shouted "no trespassing" when activated. I never before felt such an urge to pummel those lights into oblivion as I did at that moment.

A lot of people are not going to take it kindly when a machine comes up to them and starts giving them shit, or simply stands there and watches them

Truly simple systems... require infinite testing. -- Norman Augustine
