Self-Driving Shuttle Involved In Crash Two Hours After Debut (www.cbc.ca) 204

New submitter Northern Pike writes: Las Vegas' rollout of a new driverless shuttle was spoiled by human error. It sounds like the shuttle did what it was designed to do, but the human semi driver wasn't as careful. "The shuttle did what it was supposed to do, in that it's (sic) sensors registered the truck and the shuttle stopped to avoid the accident," the city said in a statement. "Unfortunately the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided." The self-driving shuttle can carry up to 12 people and has an attendant and a computer monitor, but no steering wheel and no brake pedal. It relies heavily on GPS, electronic curb sensors and other technology to make its way.
  • by Tulsa_Time ( 2430696 ) on Thursday November 09, 2017 @06:24PM (#55521989)

    I wonder if the shuttle doing the right thing was what the human driver expected.... maybe their algorithms are incompatible.

    • by jtara ( 133429 ) on Thursday November 09, 2017 @06:27PM (#55522009)

      The AI switched from human emulation mode to the Deer in Headlights program...

      • by exodus2 ( 88214 ) on Thursday November 09, 2017 @06:29PM (#55522031) Homepage

        The story says it stopped moving and the truck backed into it. I wonder if there was a horn option in the software.

        • by Motard ( 1553251 )

          From the story...

          NASCAR driver Danica Patrick and magic duo Penn and Teller were among the first passengers.

          Penn Jillette said on his podcast that he wanted to be one of the first to ride on it. It's almost certain he'll be talking about it there on Sunday. The podcast is called Penn's Sunday School.

          • by sexconker ( 1179573 ) on Thursday November 09, 2017 @07:14PM (#55522229)

            From the story...

            NASCAR driver Danica Patrick and magic duo Penn and Teller were among the first passengers.

            Penn Jillette said on his podcast that he wanted to be one of the first to ride on it. It's almost certain he'll be talking about it there on Sunday. The podcast is called Penn's Sunday School.

            Conversely, I doubt Teller will have much to say on the matter.

        • by PolygamousRanchKid ( 1290638 ) on Thursday November 09, 2017 @08:12PM (#55522497)

          I wonder if there was a horn option in the software.

          This is the USA. We have "stand your ground" laws. If another motor vehicle is trying to run you over or back into you, you are permitted to engage with licensed firearms.

          Obviously, the Autonomous Defense systems of the new vehicle are not working correctly, or the Self-Driving Shuttle would have flattened the tires of the truck that was attempting to ram it.

          More field tests, and plenty of ammo are obviously still needed.

      • by DarkOx ( 621550 ) on Thursday November 09, 2017 @06:47PM (#55522119) Journal

        Exactly. It sounds like the shuttle just stopped, when a human might have steered to the edge of the lane or onto the shoulder to avoid being "grazed".

        • Trouble is....I can see it coming....there will be a movement to get human driving of cars made illegal, and then ONLY AI vehicles will be able to run on the public roads.

          A sad day, as I just bought a new FUN driving car yesterday.....I hope that I'll be long dead and gone before the scenario above plays out, but I see it coming.

          [goes and throws on Red Barchetta....]

          • by ShanghaiBill ( 739463 ) on Thursday November 09, 2017 @08:19PM (#55522527)

            Trouble is....I can see it coming....there will be a movement to get human driving of cars made illegal, and then ONLY AI vehicles will be able to run on the public roads.

            This will be a GOOD THING. Once we get the humans off the road, we can make lanes narrower, traffic will flow more smoothly, cars can be made lighter, and traffic lights can be eliminated.

            A sad day, as I just bought a new FUN driving car yesterday.

            Why should my tax dollars subsidize your hobby? If you want to drive, do it on a private track.

            • This will be a GOOD THING. Once we get the humans off the road, we can make lanes narrower, traffic will flow more smoothly, cars can be made lighter, and traffic lights can be eliminated.

              All of this seems to assume that bicycles and pedestrians never have to share the road with cars. Sure, it could be designed that way, but it's prohibitively expensive.

            • Why should my tax dollars subsidize your hobby? If you want to drive, do it on a private track.

              Er, robot cars will still drive on publicly funded roads, won't they? So why should my tax dollars subsidize your hobby?

              • FWIW the biggest subsidisation of road users is for vehicles heavier than a couple of tons.

                Roadbed damage is proportional to roughly the 5th power of axle pressure (weight) and the 2nd power of speed. 18-wheelers pay less than 1% of the maintenance costs they incur.
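
                For scale, a back-of-envelope check under the widely cited AASHO "fourth power law" (the parent cites a higher exponent; empirical estimates range from about 4 to 6 depending on pavement). The axle loads below are illustrative assumptions, not measured figures:

                    # Relative roadbed damage, fourth-power approximation:
                    # damage per pass scales with (axle load)^4.
                    car_axle_kg = 750      # half of a ~1.5 t passenger car
                    truck_axle_kg = 8000   # one loaded 18-wheeler axle

                    ratio = (truck_axle_kg / car_axle_kg) ** 4
                    print(f"one truck axle ~ {ratio:,.0f}x one car axle")
                    # -> roughly 13,000x per axle pass, before counting
                    #    the truck's additional axles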

            • Why should my tax dollars subsidize your hobby? If you want to drive, do it on a private track.

              If a self-driving car costs several thousand dollars more and furthermore requires cheaper cars to be outlawed for full safety functionality, wouldn't it be the self-driving car that is actually being subsidized (through laws that prohibit cheaper cars)?

              • If a self-driving car costs several thousand dollars more

                Very unlikely. Sensors are cheap. Actuators cost less than a steering column. Once you add in the insurance, SDCs will almost certainly be less expensive when they are mass produced.

                • ......once they are mass produced.

                  Do you have visions of Minority Report where all of the autonomous cars are the same?

                  Who is going to manufacture the cars? Are all but one of the manufacturers going to go out of business? Will there be a global mandate to have one template against which all manufacturers will work or will each country/state have their own template?

                  You can impose standards but not personalization.

                  If we do get to a future that looks like Minority Report (which was optimistically set in 2054) the same individualizati

                • If a self-driving car costs several thousand dollars more

                  Very unlikely. Sensors are cheap. Actuators cost less than a steering column. Once you add in the insurance, SDCs will almost certainly be less expensive when they are mass produced.

                  No, Lidar is not cheap. Compute is not cheap. We're talking thousands of dollars that might hopefully come down to around a thousand or so at extreme volumes, but maybe not. Self-driving cars will never be as cheap as the equivalent manual car in an apples-to-apples comparison (similar to how hybrid cars never come close to the price of the equivalent non-hybrid car).

            • by mjwx ( 966435 )

              Trouble is....I can see it coming....there will be a movement to get human driving of cars made illegal, and then ONLY AI vehicles will be able to run on the public roads.

              This will be a GOOD THING. Once we get the humans off the road, we can make lanes narrower, traffic will flow more smoothly, cars can be made lighter, and traffic lights can be eliminated.

              Ooooh and Unicorns. Because out of what you said and Unicorns, Unicorns are more realistic.

              90% of the Highway Code is based on physics, not human response times. Physics won't change; you won't have autonomous cars going bumper to bumper at Eleventy Billion AUs an hour, because physics doesn't change.

              A sad day, as I just bought a new FUN driving car yesterday.

              Why should my tax dollars subsidize your hobby? If you want to drive, do it on a private track.

              You've got that backwards, boy: our fun cars are subsidising your crappy ones. If not for us who drive sports cars

            • Why should my tax dollars subsidize your hobby?

              I think forcing self-driving cars on people is infringing on their right to travel. If I'm not in control of the car - if the company that made it, or the government, or hackers or whatever, can make it take me somewhere other than my intended destination - then you've effectively turned travel into a privilege instead of a right.

          • No need for a law.

            Insurance companies will do it anyway - as soon as self-driving vehicles are shown to be safer drivers, they'll attract discounts - or human drivers will attract higher premiums, which amounts to the same thing.

            Very quickly, only those who can afford the insurance will still be driving themselves.

            There may be exemptions carved into the rules, such as a ridiculously low speed limit for manual driving in order to handle tricky manoeuvring that the computer can't do, but in reality such situat

        • by Dorianny ( 1847922 ) on Thursday November 09, 2017 @07:52PM (#55522411) Journal
          Yep, risking rolling a bus full of people over to avoid a fender-bender is exactly what a panicked human driver might have done
        • So the shuttle had a very predictable response, while the human, on the other hand, could have done any number of things.

          My aunt wishes she'd had this self-driving car in front of her. If both had hit the brakes, there would have been a fender bender. Instead she got someone who didn't know the difference between the brake and the accelerator: when both attempted to stop, the other driver took off, hit 3 parked cars, forced a car in the oncoming lane onto the footpath, and came to rest inside a convenience s

      • There is no 'human emulation' mode because it has no actual capacity for THOUGHT; therefore it cannot in any way, shape or form 'emulate' a human brain. All it can do is be an extremely limited, pale imitation.
      • by slick7 ( 1703596 )

        The AI switched from human emulation mode to the Deer in Headlights program...

        Once again, human stupidity trumps artificial intelligence.
        Can't wait until they make machines as stupid as humans, then humans will truly be obsolete.

    • Re: (Score:3, Insightful)

      by Pentium100 ( 1240090 )

      Also, maybe if the shuttle had a human driver, he would have been more careful near a semi truck and stopped further from it. I assume that visibility from a big truck is quite poor, so I keep my distance.

      • by mark-t ( 151149 )
        Good point... although interestingly, this is also something that should be fairly easy to program, and it ought to be possible for the programmers of this system to make a recurrence less likely.
        • I wouldn't know how easy or difficult it is to program, but when I drive, I assume that the other drivers may make a mistake (just like I sometimes make a mistake) and I use caution, even if I have the priority. Doubly so around big vehicles, like trucks and buses, since they may need more time to stop.

          For example:
          1. I am at an uncontrolled intersection, planning to turn right. I see a car from my left; it has its right turn signal on. That would mean that the other driver will turn to the street I am curren

      • This is Las Vegas; the fact that it went two hours without a fender bender means it's already doing better than human drivers.

    • Re: (Score:3, Insightful)

      by Rob Y. ( 110975 )

      In any case, unless this was a freaky situation, I'm gonna guess a human driver of the shuttle wouldn't have gotten into the accident. So maybe hitting the brakes and stopping isn't enough of an algorithm to let this thing loose in the real world. Calling this human error is giving the algorithm a bit too much benefit of the doubt.

      • by mikael ( 484 )

        Probably a human would have been more aggressive, beeped the horn a few times, rolled down the window, shouted various expletives relating to the future afterlife, the cognitive abilities of the driver, as well as the functional capabilities of his visual system.

        • Motorcyclists learn a useful rule of the road: if it's bigger than you, assume it's trying to kill you. Can't be that hard to program.
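
          As a toy illustration of that rule of thumb (the Vehicle type, masses and thresholds below are invented for the sketch, not taken from any real autonomy stack):

              from dataclasses import dataclass

              @dataclass
              class Vehicle:
                  mass_kg: float

              def caution_margin_m(own_mass_kg: float, other: Vehicle,
                                   base_margin_m: float = 2.0) -> float:
                  """Scale the safety buffer by how much bigger the other
                  vehicle is: bigger than you means assume the worst."""
                  size_ratio = max(other.mass_kg / own_mass_kg, 1.0)
                  return base_margin_m * size_ratio

              semi = Vehicle(mass_kg=20_000)
              print(caution_margin_m(own_mass_kg=3_000, other=semi))
              # ~13.3 m buffer around the semi, vs 2 m around a hatchback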
    • by HornWumpus ( 783565 ) on Thursday November 09, 2017 @06:39PM (#55522081)

      Not what the pictures show.

      The shuttle bus drove right up to the side of a backing semi, then stopped right behind the angled front wheel. You wouldn't have done that, because you could understand the truck driver's plan at a glance (and presumably aren't an asshole). Also because you would understand that the fastest way past was to let the truck finish backing up.

      The trucker should have stopped and waited for the shuttle to back away. But the shuttle shouldn't have insisted on 'my right of way' until it achieved gridlock. A human that did what the shuttle did is an asshole.

      • And yet the truck driver got the traffic ticket. Next phase of AI is realizing that humans routinely ignore the traffic laws.

      • by mysidia ( 191772 ) on Thursday November 09, 2017 @08:28PM (#55522591)

        Based on the picture, the shuttle should have been cited, not the truck driver, for pulling up too close to a vehicle moving in a conflicting direction and causing a crash --- sometimes the officer at the scene gets it wrong.

        You DON'T pull up to obstruct the passage of the FRONT of a vehicle that is backing up, as the driver will clearly be looking at the path behind their vehicle, not at their front tire section, and you will get hit.
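
        In planner terms, that rule amounts to inflating the predicted footprint of a reversing truck and refusing to stop inside it. A minimal sketch, with invented geometry (a circle around the truck's centre standing in for its swept path):

            def reversing_keep_out(truck_x, truck_y, length_m, inflate_m=5.0):
                """Approximate a reversing truck's swept area as a circle,
                inflated so we never stop near the swinging front end."""
                radius = length_m / 2 + inflate_m
                return (truck_x, truck_y, radius)

            def safe_to_stop_at(x, y, keep_out):
                cx, cy, r = keep_out
                return (x - cx) ** 2 + (y - cy) ** 2 > r ** 2

            zone = reversing_keep_out(0.0, 0.0, length_m=18)
            print(safe_to_stop_at(6.0, 2.0, zone))   # False: too close, wait
            print(safe_to_stop_at(20.0, 2.0, zone))  # True: clear of the sweep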

        because you could understand the truck driver's plan at a glance

          In this case, yes. However, not understanding other people's plans is actually what causes a large portion of road accidents.

      • by mjwx ( 966435 )

        Not what the pictures show.

        The shuttle bus drove right up to the side of a backing semi, then stopped right behind the angled front wheel. You wouldn't have done that, because you could understand the truck driver's plan at a glance (and presumably aren't an asshole). Also because you would understand that the fastest way past was to let the truck finish backing up.

        The trucker should have stopped and waited for the shuttle to back away. But the shuttle shouldn't have insisted on 'my right of way' until it achieved gridlock. A human that did what the shuttle did is an asshole.

        This is why autonomous cars are nowhere near ready for Prime Time.

        Technically the autonomous car was in the right, but a human driver would have spotted a reversing truck and waited. That is what we call common courtesy, or road craft. You plan ahead, stay aware of your surroundings and react accordingly. Lorries and articulated trucks have huge blind spots (and our nations depend on fleets of these vehicles to run day to day); sometimes they need to bend the rules to do their jobs.

        OK, some drivers can't plan

    • Not sure why this is rated funny; it should be informative.
      Expected behaviour is a thing. I noticed something just today as I was driving: a woman and child ran quickly across a busy street on a trajectory straight in front of my fast-moving car. I saw them, they saw me, and I knew their intention was to stop on the small median island between lanes, so I carried on at full speed. The event resolved as expected: they stopped on the island and I kept going uninterrupted.
      With AI, how does it know they intend
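
      One common answer is to hypothesise candidate goals (here, the median island versus the far curb) and score each against the pedestrian's current motion; every name and number below is an illustrative assumption:

          import math

          def goal_alignment(ped_pos, ped_vel, goal_pos):
              """Score a candidate goal by how well it aligns with the
              pedestrian's current heading (cosine similarity)."""
              gx, gy = goal_pos[0] - ped_pos[0], goal_pos[1] - ped_pos[1]
              dot = gx * ped_vel[0] + gy * ped_vel[1]
              norm = math.hypot(gx, gy) * math.hypot(*ped_vel)
              return dot / norm if norm else 0.0

          ped, vel = (0.0, 0.0), (1.0, 0.0)           # running toward +x
          island, far_curb = (5.0, 0.5), (12.0, 0.0)  # two candidate stops
          print(goal_alignment(ped, vel, island))     # ~0.995
          print(goal_alignment(ped, vel, far_curb))   # 1.0

      Which is exactly the parent's point: heading alone can't distinguish "stopping on the island" from "crossing fully", so real systems have to add cues like deceleration, gaze and map context.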
    • I'm looking at the "No Brakes" and "No Steering Wheel"; that's what I'd call "A Hell of a Plan."
  • Not ready yet. (Score:5, Insightful)

    by Fly Swatter ( 30498 ) on Thursday November 09, 2017 @06:37PM (#55522071) Homepage

    Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.

    If the shuttle had the same sensing equipment as the truck has, the accident would have been avoided (FTFY). A human would have laid into the horn as the truck got closer, to alert him that he's about to hit someone. A human would also have seen the truck backing in and allowed a larger margin for error. An alert human might also have seen that they could quickly back up a bit before the truck hit them. (Per the article, the trucker was cited for illegal backing.) This isn't ready in my opinion, but it's a nice alpha test.

    • by l20502 ( 4813775 )
      When in doubt use some cameras to detect fear in passengers and what they're looking at
    • Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.

      If the shuttle had the same sensing equipment as the truck has, the accident would have been avoided (FTFY). A human would have laid into the horn as the truck got closer, to alert him that he's about to hit someone. A human would also have seen the truck backing in and allowed a larger margin for error. An alert human might also have seen that they could quickly back up a bit before the truck hit them. (Per the article, the trucker was cited for illegal backing.) This isn't ready in my opinion, but it's a nice alpha test.

      Um, maybe you don't know how to read. It's clearly ready. It just works in only this one highly specific scenario, which requires a complete paradigm shift in how the world works to come true. That means ready.

  • by idontgno ( 624372 ) on Thursday November 09, 2017 @06:42PM (#55522093) Journal

    The victim self-driving shuttle bus didn't try to back away from being run over. According to reports, it couldn't for unspecified reasons. (I speculate that the autonomous logic or arrangement of sensors didn't adequately cover "going into reverse.")

    Someone up-topic asked about sounding a horn. I haven't heard any press reporting that the autonomous vehicle tried.

    Either case (if true) represents a difference between how the self-driving logic reacted and how a human driver probably would have. This tells me that unless an autono-car can do everything a human driver can, at least as well as a human driver (admittedly a low bar), it shouldn't be on the streets. There will always be corner conditions; they have to be handled as well by the robot as they would be by a human.

    • by mikael ( 484 )

      The self-driving AI works by matching the current configuration of external objects against a database of "what should I do here" actions. It's really a case of not being trained correctly.

      • > It's really a case of not being trained correctly.

        Which kind of surprises me, because if I were working on a driverless car, I'd have it generate a map of the surroundings, rating areas by how available they were to the vehicle - even a sidewalk can be a place to drive in an emergency if there are no pedestrians. (The 'where could I go if nothing changes' map, which is different from the presumed 'where is the road' map).

        I'd build a second map layer rating the probability of the vehicle occupying a g
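
        A minimal sketch of the two-layer grid the parent describes (grid size, cell semantics and all the costs are assumptions for illustration):

            import numpy as np

            # Layer 1: how usable each cell is right now (1.0 = open road,
            # 0.3 = empty sidewalk as an emergency option, 0.0 = occupied).
            drivable = np.ones((10, 10))
            drivable[:, 0] = 0.3        # sidewalk: usable only in extremis
            drivable[4:7, 5] = 0.0      # the backing truck

            # Layer 2: probability each cell stays free over the next
            # second, e.g. decayed along the truck's predicted path.
            stays_free = np.ones((10, 10))
            stays_free[4:7, 3:5] = 0.2  # cells the truck is backing into

            escape_value = drivable * stays_free
            y, x = np.unravel_index(np.argmax(escape_value),
                                    escape_value.shape)
            print(f"best escape cell: ({x}, {y})")  # plan toward this cell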

  • by 93 Escort Wagon ( 326346 ) on Thursday November 09, 2017 @06:43PM (#55522095)

    1) Robotic vehicles need a horn - and additional logic to handle when to sound it (a sketch of what that logic might look like follows this list).

    2) Robotic vehicles would benefit from the addition of a mechanical arm with a mechanical middle finger - for these sorts of post-accident situations.
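
    Joking aside, item 1 is a real gap; a minimal sketch of when-to-honk logic, with every threshold an invented assumption:

        def should_honk(closing_speed_mps: float, gap_m: float,
                        we_are_stopped: bool, they_see_us: bool) -> bool:
            """Honk when a conflict is developing that our own braking
            can't resolve: we're already stopped, the other vehicle keeps
            closing, and there's no sign it has noticed us."""
            if closing_speed_mps <= 0:
                return False  # the gap is opening, stay quiet
            time_to_contact_s = gap_m / closing_speed_mps
            return we_are_stopped and not they_see_us and time_to_contact_s < 3.0

        # The Las Vegas scenario: shuttle halted, truck still backing toward it.
        print(should_honk(closing_speed_mps=0.5, gap_m=1.0,
                          we_are_stopped=True, they_see_us=False))  # True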

    • Re:Two takeaways (Score:4, Interesting)

      by Obfuscant ( 592200 ) on Thursday November 09, 2017 @07:45PM (#55522381)

      2) Robotic vehicles would benefit from the addition of a mechanical arm with a mechanical middle finger

      If you are an asshole and pull up to a truck in such a way that he cannot continue the maneuver he was trying to perform, which would have gotten him out of your way, then you deserve the finger, not the truck driver.

      If a large truck is making a right turn and has moved into the left lane so he could accomplish that without running over the curb or other cars, it is an asshole who pulls up as far as he can go in the right lane, preventing the truck from completing the turn and causing a traffic jam, even if the car in the right lane technically has the right of way over the truck. Unfortunately, "asshole" is not a ticketable offense.

      A human would have identified the situation and remained clear. The AI assumed it had the right of way and did not. It doesn't matter in the end whether the AI did or did not have the right of way; proper defensive driving would have prevented the accident altogether. "Being right" isn't always better than "being safe".

      As to the snarky comment by someone else that going a couple of hours in Las Vegas without a fender bender is better than humans can do, I'll just point out that I've driven for hundreds of hours in Las Vegas and have neither run into anyone else nor had anyone run into me.

      • by j-beda ( 85386 )

        If a large truck is making a right turn and has moved into the left lane so he could accomplish that without running over the curb or other cars, it is an asshole who pulls up as far as he can go in the right lane, preventing the truck from completing the turn and causing a traffic jam, even if the car in the right lane technically has the right of way over the truck. Unfortunately, "asshole" is not a ticketable offense.

        A human would have identified the situation and remained clear. The AI assumed it had the right of way and did not. It doesn't matter in the end whether the AI did or did not have the right of way; proper defensive driving would have prevented the accident altogether. "Being right" isn't always better than "being safe".

        I'm not so sure that most drivers I have observed would consistently even identify the truck turning right from the left lane as being something to be aware of. Heck, I'm not confident that I would catch this 100% of the time. I'm not doing a lot of driving in areas of town with big rigs - and I'm getting old and stupid....

  • by fluffernutter ( 1411889 ) on Thursday November 09, 2017 @07:07PM (#55522199)
    How many comments have I seen on Slashdot asking if all edge cases have *really* been tested? Well, it turns out everyone was right in this case. I mean, was this AI tested on real streets at all? It's hard to imagine a car on the road for more than a month wouldn't have had a truck pull out in front of it unexpectedly a couple of times. It doesn't matter how fast the AI brain is; this is a case where anticipation may have helped. I just feel bad for the truck driver. Yes, he was in the wrong, but a lot of times when driving a big truck you have to maneuver this way and rely on other cars working with you a little bit.
    • I can't wait for a self-driven semi to attempt a buttonhook turn at an intersection, only to have both it and the approaching self-driven car freeze to a halt, causing complete gridlock.
  • Bullshit (Score:4, Insightful)

    by Rick Schumann ( 4662797 ) on Thursday November 09, 2017 @07:55PM (#55522427) Journal

    Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.

    If the shuttle had a human driver, the entire incident would never have happened, because the half-assed excuse for 'AI' they keep trotting out can't actually THINK.

    • by TheSync ( 5291 )

      Newsflash: trucks run into human-driven buses all the time, for example [syracuse.com].

      • by bws111 ( 1216812 )

        Newsflash: if it happened 'all the time', it wouldn't be news. The reason it is news is that it happens INFREQUENTLY. What actually happens 'all the time' is that buses and trucks encounter each other AND DON'T CRASH.

        This is the big problem with a lot of self-driving proponents - they focus on the RARE events and say they could be prevented, but completely ignore the common, everyday realities of driving.

        • I'd actually describe the 'thinking' of SDC fanbois as what's referred to as 'magical thinking': they assume that this 'technology' will somehow 'magically' solve all the problems. Of course, it takes a particular kind of mind, disciplined in certain ways, to remember that you also have to think about what can go WRONG with something. Good engineers and programmers know this; the average person doesn't.
  • Now please, inform me of how there is absolutely no need for driverless cars to communicate with each other. Hell, if driverless cars are going to bring the much-promised safety, all human-driven cars will have to communicate with the driverless cars.

    Because if someone was injured in this accident, the no-comms people would immediately shift into No True Scotsman mode.

    So Trump's decision to remove the requirement is a death knell for the autonomous driving initiative.

  • > "The shuttle did what it was supposed to do, in that it's (sic) sensors registered the truck and the shuttle stopped to avoid the accident,"

    Assuming the "goal" with driverless cars is to have a vehicle that can respond as a human might, to avoid the inevitable issues.

    This bus failed miserably on two counts.

    First, if it were programmed to be more human-like, it would have blown its horn to warn the big rig. Failure #1.

    Second, if it were programmed to be more human-like, it would have backed up to avoid the bi

  • Look, we human drivers know how other human drivers are going to either break the law or cut corners in driving, and we are able to anticipate and react correctly to avoid an accident as a result. These machines can't do that. They are developed with perfect-lab-environment thinking, and they will never be human enough to interact with humans in the situations that result from driving in the real world, not a software world. Driving is as much psychological as physical. If the machi
  • by gillbates ( 106458 ) on Friday November 10, 2017 @11:57AM (#55525991) Homepage Journal

    A large part of why I've managed to avoid accidents for so long is that, as a human, I understand how other humans are likely to act and react.

    The problem with AI drivers is that humans only loosely follow the rules of the road; their actions are driven by multiple influences, and understanding what another human is likely to do in any given situation requires being a human being. For example, consider the following:

    • Who, but an AI, would actually drive the speed limit in the hammer lane?
    • In the case of a driver ahead slowing down because of the presence of children, would the AI know to slow down gradually, so as not to create a risk of collision to the cars behind?
    • Would an AI know how to modulate its braking in an emergency stop so that it neither struck the vehicle ahead nor was struck by the vehicle behind? Would it understand that in such a situation, even if it could stop completely, a better course of action might be to pull to the shoulder so that an impending collision behind it could be avoided? (A back-of-envelope version of this trade-off follows the list.)
    • Would an AI know that a driver is likely to stop for the geese crossing the road? (which, btw, is required by law).
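
    A back-of-envelope version of the braking trade-off from the third bullet, assuming constant deceleration (stopping distance = v^2 / 2a); all the numbers are illustrative:

        def pick_decel(gap_ahead_m: float, v_mps: float,
                       max_decel_mps2: float = 8.0) -> float:
            """Gentlest constant deceleration that still stops within the
            gap ahead: a = v^2 / (2 * gap). Braking no harder than needed
            gives the driver behind the best chance to react in time."""
            needed = v_mps ** 2 / (2 * gap_ahead_m)
            return min(max_decel_mps2, needed * 1.1)  # 10% margin

        # 15 m/s (~54 km/h) with 40 m of clear road ahead:
        print(round(pick_decel(gap_ahead_m=40.0, v_mps=15.0), 2))  # ~3.09 m/s^2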

    I'm sure there are dozens of other similar cases, but you get the point. AI might understand, in the nominal sense, how to drive a car. What it can't understand is what other drivers are likely to do.
