
Experts Say Video of Uber's Self-Driving Car Killing a Pedestrian Suggests Its Technology May Have Failed (4brad.com) 325

Ever since the Tempe police released a video of Uber's self-driving car hitting and killing a pedestrian, experts have been racing to analyze the footage and determine what exactly went wrong. (If you haven't watched the video, you can do so here. Warning: it's disturbing, though the actual impact is removed.) In a blog post, software architect and entrepreneur Brad Templeton highlights some of the big issues with the video:
1. On this empty road, the LIDAR is very capable of detecting her. If it was operating, there is no way that it did not detect her 3 to 4 seconds before the impact, if not earlier. She would have come into range just over 5 seconds before impact.
2. On the dash-cam style video, we only see her 1.5 seconds before impact. However, the human eye and quality cameras have a much better dynamic range than this video, and should also have been able to see her even before 5 seconds. From just the dash-cam video, no human could brake in time with just 1.5 seconds warning. The best humans react in just under a second; many take 1.5 to 2.5 seconds.
3. The human safety driver did not see her because she was not looking at the road. She seems to spend most of the time before the accident looking down to her right, in a style that suggests looking at a phone.
4. While a basic radar which filters out objects that are not moving towards the car would not necessarily see her, a more advanced radar should also have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably at least 4 seconds before impact. Braking could trigger 2 seconds before, in theory enough time.

To be clear, while the car had the right-of-way and the victim was clearly unwise to cross there, especially without checking regularly in the direction of traffic, this is a situation where any properly operating robocar following "good practices," let alone "best practices," should have avoided the accident regardless of pedestrian error. That would not be true if the pedestrian were crossing the other way, moving immediately into the right lane from the right sidewalk. In that case no technique could have avoided the event.
The overall consensus among experts is that one or several pieces of the driverless system may have failed, from the LIDAR system to the logic system that's supposed to identify road objects, to the communications channels that are supposed to apply the brakes, or the car's automatic braking system itself. According to the Los Angeles Times, "Driverless car experts from law and academia called on Uber to release technical details of the accident so objective researchers can help figure out what went wrong and relay their findings to other driverless system makers and to the public."
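
As a rough sanity check of the timing figures in the summary above (a back-of-the-envelope sketch only; the ~38 mph speed is the figure discussed elsewhere in the comments, and the ~90 m effective LIDAR detection range is an assumed round number, not a published spec):

    MPH_TO_MS = 0.44704                  # miles per hour -> metres per second

    car_speed = 38 * MPH_TO_MS           # ~17 m/s
    lidar_range = 90.0                   # m, assumed effective detection range
    camera_first_sight = 1.5             # s before impact, per the released video

    print(f"LIDAR warning time: {lidar_range / car_speed:.1f} s")                       # ~5.3 s
    print(f"Distance covered in those 1.5 s: {car_speed * camera_first_sight:.0f} m")   # ~25 m
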
  • Doesn't matter (Score:5, Insightful)

    by Snotnose ( 212196 ) on Thursday March 22, 2018 @07:55PM (#56309899)
    Forget the Lidar, or lack of (they were testing cameras?). Forget the dude (heh, the first 12 hours thought he was a she. That's gotta hurt).

    Had I been driving that car, full alert, I would have killed that chick. I'd have felt bad, even knowing it was her fault. But the fact is, this dumbass walked in front of a fast moving car, at night, when she had no illumination, and the car had headlights. Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.
    • Re:Doesn't matter (Score:5, Insightful)

      by Anonymous Coward on Thursday March 22, 2018 @08:06PM (#56309963)

      You're missing the point. The point of the 'dynamic range' bit of the summary is to say "just because the video didn't show enough light doesn't mean there actually wasn't enough light".

      When driving a car with normal headlights, can you genuinely not see what's the next lane over 150 feet ahead? If not, you need new headlights. Or new eyes.

      • by SuperKendall ( 25149 ) on Thursday March 22, 2018 @09:44PM (#56310339)

        When driving a car with normal headlights, can you genuinely not see what's the next lane over 150 feet ahead?

        Lots of cars fudge the headlights a bit to the right to keep from blinding oncoming cars. Combine that with her stepping out of the very dark shadow of a tree, as well as her dark clothing (jeans and a black jacket), and it's very likely that, because of the dynamic range of the scene (bright headlights making anything the street lights hit look darker by comparison, and her standing in shadow even from the street lights), a driver would have been too blinded by the available light to see the pedestrian.

        She didn't even have reflectors on the wheel of the bike, much less herself - even one might have saved her.
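
        A toy illustration of that dynamic-range point - mapping scene brightness onto an 8-bit pixel with a limited number of captured stops. All of the luminance values below are made-up assumptions, purely to show how a low-dynamic-range dashcam can render a shadowed pedestrian as pure black while a wider-range sensor (or the eye) still resolves her:

            import math

            def to_8bit(luminance, mid_exposure=1.0, stops_captured=8):
                """Map a scene luminance to an 8-bit value, clipping anything more
                than half the captured range above or below the exposure midpoint."""
                stops_from_mid = math.log2(luminance / mid_exposure)
                lo, hi = -stops_captured / 2, stops_captured / 2
                clipped = min(max(stops_from_mid, lo), hi)
                return round(255 * (clipped - lo) / (hi - lo))

            print(to_8bit(4.0))                        # headlight-lit pavement: 191
            print(to_8bit(0.05))                       # pedestrian in shadow: 0 (pure black)
            print(to_8bit(0.05, stops_captured=20))    # wider dynamic range: 72, still visible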

        • I call bullshit (Score:3, Informative)

          by Anonymous Coward

          I've got a forward-facing car camera, and it's very clear that eyes see things far bigger and brighter at the horizon than cameras do. You would see her, even in that light.

          This is why the moon seems bigger and brighter when it's near the horizon, when actually it's dimmer due to the atmosphere and the same size as always.

          *But*, more than that.

          I've also seen the other video of this stretch of road filmed with a normal smartphone camera, and it's clear the camera in the car had terrible dynamic range. There is absolutel

        • Lots of cars fudge the headlights a bit to the right to keep from blinding oncoming cars.

          Uber's fleet is made up of current-model Volvo XC90s. They have very well designed projection headlamps and are bright to boot. People haven't needed to "fudge" headlights since the parabolic dish + lightbulb days, or on the cheapest and nastiest of today's cars.

          You can shine light as far forward as you want without blinding other drivers with nearly every modern car. Hell, my now 12-year-old hatchback has projection headlamps with a wonderfully controlled beam, and my dad's 16-year-old hatchback had electronic h

      • In perfectly dark conditions (new moon, possibly cloud cover, no good street lamps), I can barely see an object 30-40 meters away that has no reflective properties. Try it. With a brighter moon or a more reflective object I can naturally, as you say, see much further away. The question is: what were the conditions here? People say "the human eye sees further" - yes and no. Were there other sources of light? Because if yes, that ruins your night vision, and if there is only shadow and no reflective surface, forget you
        • by Khyber ( 864651 )

          "In perfect dark condition, no new moon possibly cloud coverage no good street lamp, I can barely see 30-40 meter away object which have no reflective properties."

          You'd have a horrible time doing night-time desert rockhounding with me and my buddies if your eyes are that bad. It's usually a 100-meter minimum trek from our vehicles to the actual mountain base, and we still illuminate the area quite fine with our headlights. Of course, we're using nice LED headlights (and one guy has a sick MK-R lightbar) instead

        • The street lights just before where the accident occurred may have blinded the camera too.

          Also the bushes and small trees to the left on the median would have blocked lidar and visual L.O.S. until she entered the road *and* the car was within 50'.

          The road literally widened from 2 to 4 lanes right where the accident occurred.

          https://www.google.com/maps/@3... [google.com]

          I don't see how an approaching car or human could have seen her before she was on the road.

          I regularly (2-3 times a year) drive up on people who I simply

      • by jrumney ( 197329 )
        150ft is still less than 3 seconds at 38mph. On a perfectly dry road, you need about 75 feet to stop from that speed once the brakes are applied, so you better react within the first 1.5s. That is about average (reaction times range from around 0.8 to 2.5 seconds), so she has a 50/50 chance with a human driver (though probably a lot of the 50% who still hit her would reduce their speed enough to only injure her).
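
        A rough back-of-the-envelope version of that calculation (a sketch only; the ~0.7 g dry-asphalt deceleration is an assumed figure, not something from the comment):

            MPH_TO_FPS = 5280 / 3600            # 1 mph = ~1.467 ft/s
            G = 32.174                          # ft/s^2

            def stops_in_time(speed_mph, sight_distance_ft, reaction_s, decel_g=0.7):
                v = speed_mph * MPH_TO_FPS
                reaction_dist = v * reaction_s              # distance covered before braking starts
                braking_dist = v ** 2 / (2 * decel_g * G)   # kinematics: v^2 / (2a), ~69 ft here
                return reaction_dist + braking_dist <= sight_distance_ft

            for rt in (0.8, 1.5, 2.5):
                print(rt, stops_in_time(38, 150, rt))       # True, False (barely), False
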
    • Re:Doesn't matter (Score:5, Informative)

      by haruchai ( 17472 ) on Thursday March 22, 2018 @08:30PM (#56310053)

      Forget the Lidar, or lack of (they were testing cameras?). Forget the dude (heh, the first 12 hours thought he was a she. That's gotta hurt).

      Had I been driving that car, full alert, I would have killed that chick. I'd have felt bad, even knowing it was her fault. But the fact is, this dumbass walked in front of a fast moving car, at night, when she had no illumination, and the car had headlights. Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.

      The released video is very misleading. While the deceased made a stupid decision to cross at that point, it was quite well lit.
      No one with even average vision or reflexes would have hit her.

      https://www.youtube.com/watch?... [youtube.com]

      • Re:Doesn't matter (Score:4, Informative)

        by amicusNYCL ( 1538833 ) on Thursday March 22, 2018 @10:43PM (#56310575)

        That video definitely shows better lighting. The impact happens around the 0:33 mark of the above video; there's a sign on the right side of the road to use as a reference point.

        But that video is a little over-saturated. The light strings on the bridge look like a solid light; they aren't that bright. You can clearly make out the individual lights when you're actually there. As usual, the reality is somewhere between these videos. I've been to the theater on that corner many times, and I remember it as being a poorly-lit street.

        • by haruchai ( 17472 )

          I thought the video and several accompanying photos taken nearby were looking too washed out but thought it might have been too much yellowish sodium lights.
          But even if it were pitch black at the time, the Uber vehicle has a large roof-mounted LiDAR which should have spotted her hundreds of feet sooner

          • too much yellowish sodium lights.

            I think those are the only kind we have in the Phoenix area. Not the best.

            LiDAR which should have spotted her hundreds of feet sooner

            This is true.

    • Why forget the driver? A convicted armed robber: http://www.dailymail.co.uk/new... [dailymail.co.uk]

      No illumination? This is how it looks to the human eye: https://discourse-cdn.freetls.... [fastly.net]

      • Re: (Score:3, Informative)

        by Khyber ( 864651 )

        "Vasquez has felony convictions for attempted armed robbery after plot with Blockbuster video store co-worker to seize their own shop's taking's at gunpoint"

        BLOCKBUSTER. That crime must have been AGES ago.

        "Vasquez was convicted under her original name Rafael but now identifies as a woman"

        Well, that makes me feel less guilty about assuming it was a man driving when I saw the video. XY chromosomal pair is still present.

    • Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.

      That's not entirely true, but that's the whole issue. The simple fact is that this situation - clear weather, dry road, no traffic, no light, obstacle in the road - is exactly the situation where any self-driving car (it doesn't need to be level 5 or whatever) should excel. The article is exactly right - one or more systems had a catastrophic failure. The car should have come to a complete stop if necessary before the driver ever saw the woman crossing the street.

      Personally, I think regulation is require

      • Update quickly = a new car every 2-4 years, since there's no regulation saying updates must be free for at least 9-12 years

      • by dgatwood ( 11270 )

        Personally, I think regulation is required. It's great if Google/Alphabet/Waymo is having success with their cars, or Lyft, or Tesla's experience with autopilot, but if we're going to have these cars on the road they should all be running the same software, it needs to be a collaborative effort. They can compete on human amenities inside the car, the software at a minimum (maybe sensors as well) should be a cooperative process where they share information and develop together. At the end, it gets certified

        • No, having multiple competing technologies is inarguably a good thing... But the emphasis should be on minimum standards, not creating a single, standard set of software. Otherwise bad things will happen.

          Ahhh. Right, buy car X, 20% less likely to run over pedestrians or kill occupants based on our patented algorithm that no one else can use. Well, you said right there "inarguably", so I guess I can't argue with that logic.

          the human population potentially collapses overnight.

          I'm glad we're keeping things in perspective. Surely there isn't a happy medium that humans are capable of creating, right? It's either one extreme or the other. I don't know, maybe we can take some of the lessons we've learned with all of our hardened distros and the countless failur

      • Part of the problem is that programming a car to safely drive autonomously on a limited-access freeway is ENORMOUSLY easier than programming a car to do the same thing on a "normal" road, unless the "normal" road is completely gridlocked & the car is just creeping along a few feet at a time.

        High-speed (but non-limited-access) divided highways with grade crossings and pedestrians are still very much in the "experimental" zone when it comes to autonomous vehicles.

        There are common situations where autonomo

    • Forget the dude (heh, the first 12 hours thought he was a she. That's gotta hurt).

      Ok, after extensive consultation with my wife, and since no one else had addressed this, I feel the need to. I don't remember which article I first read, but I remember the driver being a woman, and thought that until I saw the video, when I thought to myself, "well, she's not all that attractive, but I don't want to be insensitive." Then I showed my wife the video, and she immediately pointed out that wasn't a very attractive woman. I again remained silent, so as not to sound insensitive. But in the ba

    • Had I been driving that car, full alert, I would have killed that chick.

      You routinely hit stationary objects in the middle of the road? Just how poor is your eyesight?

    • by tlhIngan ( 30335 )

      Forget the Lidar, or lack of (they were testing cameras?). Forget the dude (heh, the first 12 hours thought he was a she. That's gotta hurt).

      Had I been driving that car, full alert, I would have killed that chick. I'd have felt bad, even knowing it was her fault. But the fact is, this dumbass walked in front of a fast moving car, at night, when she had no illumination, and the car had headlights. Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.

      I've nearly had

    • That's not really the point here (although it might be valid in other discussions).

      The LIDAR/Radar/Whatever does not care much about lighting conditions, so understanding the cause of the failure is important. If the cause isn't linked to lighting, then this situation (a pedestrian crossing where they shouldn't) could trigger the same lack of reaction in broad daylight if the conditions are right. This is a serious issue.

      It would be better for everyone (except maybe for short-term gains from carmakers) to crowds

    • by sycodon ( 149926 )

      Maybe you are just a very shitty driver.

  • Duh!! Yes! Along with the concept of a safety driver.

    Just my 2 cents ;)
  • by NewtonsLaw ( 409638 ) on Thursday March 22, 2018 @08:07PM (#56309969)

    There were multiple failures all around which caused this death. If any one of those failures had not happened then the pedestrian would likely still be alive today.

    I've summed it up here [aardvark.co.nz] in a column which was written almost 24 hours ago so it's nice to see that others have come to similar conclusions.

    • This is a Shakespearean tragedy: no character does anything correctly.

      1. A woman, homeless, crossing the street, with a bicycle. In her trek across 4 lanes worth of street, she appears to never even glance towards the one thing that could ruin her day.

      2. A "safety driver", who is neither of those things.

      3. A vehicle, controlled by Uber, which completely shits the bed.

  • China is not doing a great job on these. Sad.
  • ...or due to changes made in the wake of the Google-Uber lawsuit settlement. But the questions probably need to be asked, in light of statements made by both companies: 1. A note on our lawsuit against Otto and Uber [medium.com]: "Recently, we uncovered evidence that Otto and Uber have taken and are using key parts of Waymo's self-driving technology." 2. Uber and Waymo Reach Settlement [uber.com]: "We are taking steps with Waymo to ensure our Lidar and software represents just our good work."

  • by SuperKendall ( 25149 ) on Thursday March 22, 2018 @08:30PM (#56310055)

    I agree the LIDAR should have been able to see her before she entered the light.

    However, what we still do not know is - when did she start moving?

    If she was just standing in the left lane waiting to cross, the LIDAR may have seen her and just thought "well that lane is blocked, stick to this one". With the bike she might have looked like a barricade of some kind.

    It could still be she started moving around the time we see her in the video, which means she essentially jumped in front of the car...

    There could be a reason for her to do that - what if the car saw her and slightly slowed out of caution? We know the car was going well under the limit when it hit; that could be a sign the car had slowed down a bit prior.

    The human, seeing the car slow slightly, might have assumed it saw her and was going to stop to let her cross. So it could easily be a case of mixed signals, with the car's cautious actions in the end being a bad thing - when driving, an over-abundance of caution can often have bad consequences.

    I'm still not sure the human safety driver would have seen her though. Even though humans do have better dynamic range than cameras, there are still times when you really can't see outside the headlights, and the woman crossing was all in dark clothing.

    • The camera should have seen her [youtube.com], and so should the backup driver if he/she hadn't been looking at her/his phone at the time.
      • That video is somewhat brighter than what a real driver would see.

        But even so, look at your video at 33 seconds - that's the point where the much wider camera in the Uber catches sight of her shoes in its lane (at about 7 seconds in this video [theguardian.com]). The pedestrian is in shadow, and if she wasn't moving before, it would be very easy not to see her until she started moving. If you go back a bit further in the video, she would be standing right between some bright background lights, making it even more difficult to

        • Look at the video again. As soon as the driver looked up, she saw the pedestrian. Took a while to realize what she was seeing, though.
          • Look at the video again. As soon as the driver looked up, she saw the pedestrian.

            When she looked up and saw the pedestrian, she was well past the bridge, and it was way too late to do anything. She basically looked up as the car hit the person.

            I was talking about the time she looked up BEFORE that (in the driver video she looked up from her phone several times), which, as you can clearly tell by the poles passing by outside her window, is when she is passing under the bridge, and match that with the other b

            • The timeline is thus: She looks up as she is going under the bridge, looks down again, then a few seconds later she looks up basically as the car is hitting the woman.

              Yeah, the driver was totally negligent in any case, looking at her phone more than at the street. It's true a human can't pay perfect attention all the time, but any human can put their phone down.

        • Why the fuck does the pedestrian have to be moving? A pedestrian can walk out into the road after not moving just as easily; the car needs to sense the danger and slow down regardless.
    • by Ramze ( 640788 )

      There are way too many factors for me to judge. I'll wait to see what the Uber techs say.

      As you mentioned, the Uber might have seen her on LIDAR, and hey... she had a bike -- maybe the Uber thought she was riding the bike in her lane and never predicted she might move in front of the Uber. Or maybe the Uber had a glitch (it happens... that's what the driver is there for).

      The driver certainly wasn't paying as much attention to the road as he/she should have. Some states have laws against distracted dri

    • However, what we still do not know is - when did she start moving?

      If you look at the video closely, you can see the pedestrian's feet, allowing you to have better insight onto that question.

    • by AmiMoJo ( 196126 )

      If she was just standing in the left lane waiting to cross, the LIDAR may have seen her and just thought "well that lane is blocked, stick to this one". With the bike she might have looked like a barricade of some kind.

      In which case it should have slowed down. A human driver knows that lanes are not randomly barricaded with no prior warning like signs and cones. It should be a huge red flag, a clear indication that something unusual and potentially dangerous is happening.

    • Why does she have to move for the car to see her? The car should sense a dangerous situation whether there is movement or not.
    • by mjwx ( 966435 )

      I agree the LIDAR should have been able to see her before she entered the light.

      However, what we still do not know is - when did she start moving?

      LIDAR definitely would have seen her. It would be nearly impossible for it not to. LIDAR isn't the problem; the problem is what the software did with the information provided by LIDAR. The software either wrote the pedestrian off as a false positive or failed to take into account her path and how it intersected with the vehicle's path.

      However, what we still do not know is - when did she start moving?

      We can extrapolate a likely answer from the movement speed before they were hit.
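
      A minimal sketch of the kind of check the planning software would have to make here - projecting the pedestrian's track across the lanes and comparing it against the car's arrival time. Every number (lane width, walking pace, car speed) is an illustrative assumption, not a figure from the investigation:

          CAR_SPEED = 17.0      # m/s, roughly 38 mph
          WALK_SPEED = 1.4      # m/s, ordinary walking pace while pushing a bike
          LANE_WIDTH = 3.5      # m

          def should_brake(ped_offset_m, car_distance_m, margin_s=2.0):
              """ped_offset_m: lateral distance from the pedestrian to the near edge
              of the car's lane; car_distance_m: car's distance to the crossing point."""
              t_enter = ped_offset_m / WALK_SPEED                # pedestrian enters the lane
              t_exit = (ped_offset_m + LANE_WIDTH) / WALK_SPEED  # pedestrian clears the lane
              t_car = car_distance_m / CAR_SPEED                 # car reaches the crossing point
              # brake if the car arrives while the pedestrian could still be in the lane
              return (t_enter - margin_s) <= t_car <= (t_exit + margin_s)

          # pedestrian one lane to the left, car ~70 m (about 4 s) away
          print(should_brake(3.5, 70))    # True: plenty of time to slow or stop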

  • by fluffernutter ( 1411889 ) on Thursday March 22, 2018 @08:35PM (#56310065)
    Well there you go. Clearly these cars should be kept on the road so Uber has every opportunity to make their technology better.
  • The main failure will be holding Uber accountable.
  • The car had a single occupant who was not focused on the road. In a test vehicle, regardless of how autonomous the vehicle is, if one person is required to monitor computer systems whilst the car is travelling, they need a second occupant to monitor the road.

    The Lidar system should work with no light at all: it's an infrared laser system, it emits its own light! In fact, if anything, it would probably work better in complete darkness!

    This was obviously a major technology failure; however, it needn't have resulted in someone's death.

    The greatest failure here is not in the technology but in Uber's testing procedures.

    • If it's a driverless car that requires the driver to be attentive to the road and what's going on, it's never gonna work right.
      Either the driver actively drives the car, or the car does... there's no in between. We're humans, not robots...

      This person was looking at his Facebook the whole time; the next one will nod off... etc.
      • When the car is deemed ready and put into production and so on, yes absolutely.

        But this was a *test driving* situation, during which unexpected and possibly dangerous situations may occur. It is irresponsible of Uber to not have two people in the car at all times, one to monitor the road and take evasive action if necessary, and one to monitor the systems. One person cannot do both at the same time.

    • This is not only a failure in Uber's testing procedure. This is the failure of the whole concept of autonomous "driver assist only" systems. There are a lot of people out there who would do the exact same thing in e.g. their Tesla - let their attention slip for a while and let the system do its job, because humans are lazy, plus we are bad at drawing conclusions - nothing bad happened the first two weeks we drove the car, so that means nothing bad is going to happen after that either - right?

      This is also th

  • The idea of "safety driver" might be inherently flawed. When you drive a normal car, your attention is fixed on the traffic and surroundings. When the car does the driving, how are you supposed to sustain your attention for more than a few minutes? Boredom and attention lapses may be inevitable. If the safety driver is told ahead of time that attention lapses have severe penalties, they might struggle to remain alert for a while longer, but it might be a fact of human nature, that avoiding distractions is a
    • by iktos ( 166530 ) *

      Perhaps the car could generate simulated obstacles at random intervals for the driver to react to.
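
      One minimal way to sketch that idea in code - prompting the safety driver at unpredictable intervals and escalating if they don't respond. The callbacks are hypothetical stand-ins, not any real vehicle API:

          import random
          import time

          MEAN_INTERVAL_S = 120       # average gap between attention checks
          RESPONSE_DEADLINE_S = 3.0   # how quickly the driver must acknowledge

          def run_attention_checks(prompt_driver, wait_for_ack, escalate):
              """prompt_driver(), wait_for_ack(timeout_s) -> bool and escalate()
              are injected, hypothetical hooks into the vehicle's HMI."""
              while True:
                  # exponentially distributed gaps keep the next check unpredictable
                  time.sleep(random.expovariate(1.0 / MEAN_INTERVAL_S))
                  prompt_driver()                       # e.g. flash a simulated obstacle
                  if not wait_for_ack(RESPONSE_DEADLINE_S):
                      escalate()                        # e.g. chime, slow the car, pull over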

  • The first time I saw the video, I went, "Man, somebody's going to win a multimillion dollar judgement against Uber!" Truth be known, due to my night blindness, I probably would have hit the pedestrian too under similar circumstances. I constantly see pedestrians or bicyclists in dark clothing on the side of the road only after it would be too late to avoid them if they moved in front of my car.
  • The overall consensus among experts is that one or several pieces of the driverless system may have failed, from the LIDAR system to the logic system that's supposed to identify road objects, to the communications channels that are supposed to apply the brakes, or the car's automatic braking system itself.

    But the most important thing that failed was the human driver, who was supposed to take over control in an emergency when all systems fail to work properly - destroying a long-held "get out of jail free card" that creators of self-driving cars use whenever anyone challenges them on the tech.

  • I'm no expert, but I've seen a lot of news about self-driving cars over the last ten years or so.
    At first it was DARPA, then Google, and they seemed to be testing for years . . .

    Then Tesla, and now Uber?
    I guess Uber suddenly realized that self-driving cars might be a threat to their business in the long term.
    So they have notoriously been caught with stolen code and poaching engineers.
    Seems like they are playing catch-up.
    That usually means skipping testing.

    Maybe the states should require that self-driving

  • Pull up google maps.

    https://www.google.com/maps/@3... [google.com]

    Observe the sign that says not to cross there (one on the other side of the median too btw).

    Observe the road widens from 2 to 4 lanes right before where the accident occurred.

    Observe that trees and bushes on the edge of the median to the left would have blocked Lidar until the car was within 50' of where the accident occurred.

    Confirm the scene against the video with the sign that reads

    "
    Begin right turn lane
    -
    Yield to Bikes.
    "
    Understand that it refers to bik

  • IMNSHO, "autonomous driving" is a huge fad - the idea is good, but it is _way_ ahead of its time.

    The problem is all those situations in traffic that require understanding a situation, processing what is going on, predicting potential risks, and responding appropriately. One example - you are driving and up ahead on the sidewalk there are two children. Do you slow down? What are the children doing? Do they seem to be behaving erratically, or are they engaged in playful behaviour like pushing each other around a

    • IMELPHO*, AI already has reached and even exceeded human levels in some areas. It remains laughably far away in others. But I'll echo others' observation that it does not have to be perfect in order to be a safer alternative to human drivers. It just has to be better than they are. And that's a very low bar. I'm not sure we're there yet, but I think we're close, closer than I'd have thought possible 10 years ago, and I think we'll get there. Sadly, but just as in other human endeavors, there will be m
