Experts Say Video of Uber's Self-Driving Car Killing a Pedestrian Suggests Its Technology May Have Failed (4brad.com)
Ever since the Tempe police released a video of Uber's self-driving car hitting and killing a pedestrian, experts have been racing to analyze the footage and determine what exactly went wrong. (If you haven't watched the video, you can do so here. Warning: it's disturbing, though the actual impact is removed.) In a blog post, software architect and entrepreneur Brad Templeton highlights some of the big issues with the video:
1. On this empty road, the LIDAR is very capable of detecting her. If it was operating, there is no way that it did not detect her 3 to 4 seconds before the impact, if not earlier. She would have come into range just over 5 seconds before impact.
2. On the dash-cam style video, we only see her 1.5 seconds before impact. However, the human eye and quality cameras have a much better dynamic range than this video, and should have also been able to see her even before 5 seconds. From just the dash-cam video, no human could brake in time with just 1.5 seconds warning. The best humans react in just under a second; many take 1.5 to 2.5 seconds.
3. The human safety driver did not see her because she was not looking at the road. She seems to spend most of the time before the accident looking down to her right, in a style that suggests looking at a phone.
4. While a basic radar that filters out objects not moving towards the car would not necessarily see her, a more advanced radar should also have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably at least 4 seconds before impact. Braking could trigger 2 seconds before, in theory enough time.
To be clear, while the car had the right-of-way and the victim was clearly unwise to cross there, especially without checking regularly in the direction of traffic, this is a situation where any properly operating robocar following "good practices," let alone "best practices," should have avoided the accident regardless of pedestrian error. That would not be true if the pedestrian were crossing the other way, moving immediately into the right lane from the right sidewalk. In that case no technique could have avoided the event. The overall consensus among experts is that one or several pieces of the driverless system may have failed, from the LIDAR system to the logic system that's supposed to identify road objects, to the communications channels that are supposed to apply the brakes, or the car's automatic braking system itself. According to the Los Angeles Times, "Driverless car experts from law and academia called on Uber to release technical details of the accident so objective researchers can help figure out what went wrong and relay their findings to other driverless system makers and to the public."
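As a rough sanity check of the timing claims above, here is a back-of-envelope stopping-distance calculation. The speed (~17 m/s, about 38 mph) and braking deceleration (7 m/s^2) are assumptions for illustration, not figures from the post:

```python
def stopping_distance(speed_ms, reaction_s, decel_ms2):
    """Distance covered during reaction time plus braking to a full stop."""
    reaction_dist = speed_ms * reaction_s            # travel before brakes bite
    braking_dist = speed_ms ** 2 / (2 * decel_ms2)   # v^2 / (2a)
    return reaction_dist + braking_dist

SPEED = 17.0    # m/s, roughly 38 mph (assumed, not from the post)
DECEL = 7.0     # m/s^2, firm braking on dry asphalt (assumed)
REACTION = 1.0  # s, "the best humans react in just under a second"

for label, warning_s in [("LIDAR, ~4 s warning", 4.0),
                         ("dash-cam, ~1.5 s warning", 1.5)]:
    available = SPEED * warning_s                    # distance to the pedestrian
    needed = stopping_distance(SPEED, REACTION, DECEL)
    verdict = "can stop" if needed <= available else "cannot stop"
    print(f"{label}: need {needed:.0f} m, have {available:.0f} m -> {verdict}")
```

Under these assumed numbers, a ~4 second warning leaves ample room to stop, while a 1.5 second warning does not, which matches points 1 and 2 above.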
Doesn't matter (Score:5, Insightful)
Had I been driving that car, full alert, I would have killed that chick. I'd have felt bad, even knowing it was her fault. But the fact is, this dumbass walked in front of a fast moving car, at night, when she had no illumination, and the car had headlights. Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.
Re:Doesn't matter (Score:5, Insightful)
You're missing the point. The point of the 'dynamic range' bits of the summary is to say "just because the video didn't show enough light doesn't mean there actually wasn't enough light".
When driving a car with normal headlights, can you genuinely not see what's the next lane over 150 feet ahead? If not, you need new headlights. Or new eyes.
You can't in shadow, at night (Score:5, Insightful)
When driving a car with normal headlights, can you genuinely not see what's the next lane over 150 feet ahead?
Lots of cars fudge the headlights a bit to the right to keep from blinding oncoming cars. Combine that with her stepping out of the very dark shadow of a tree, plus her dark clothing (jeans and black jacket), and it's very likely that the dynamic range of the scene (bright headlights making anything the street lights hit look darker, and her standing in shadow from even the street lights) left a driver effectively unable to see the pedestrian.
She didn't even have reflectors on the wheel of the bike, much less herself - even one might have saved her.
I call bullshit (Score:3, Informative)
I've got a forward-facing car camera, and it's very clear that eyes see things far bigger and brighter at the horizon than cameras do. You would see her, even in that light.
This is why the moon seems bigger and brighter when it's near the horizon, when actually it's dimmer due to the atmosphere and the same size as normal.
*But*, more than that.
I've also seen the other video of this stretch of road filmed with a normal smartphone camera, and it's clear the camera in the car had terrible dynamic range. There is absolutel
Re: (Score:2)
Lots of cars fudge the headlights a bit to the right to keep from blinding oncoming cars.
Uber's fleet is made up of current-model Volvo XC90s. They have very well-designed projection headlamps and are bright to boot. People haven't needed to "fudge" headlights since the parabolic-dish-plus-lightbulb days, or on the cheapest and nastiest of today's cars.
You can shine light as far forward as you want without blinding other drivers with nearly every modern car. Hell my now 12 year old hatchback has projection headlamps with a wonderfully controlled beam, my dad's 16 year old hatchback had electronic h
Re: (Score:3, Informative)
Aren't bicycle reflectors a legal requirement in the US?
It's a legal (FTC) requirement that bicycles are sold with reflectors. There is no Federal obligation on the buyer to keep them. The State of Arizona or City of Tempe may have statutes, but there's no applicable US Code for riding a bicycle with or without reflectors.
Depends on local illumination (Score:2)
Re: (Score:2)
"In perfect dark condition, no new moon possibly cloud coverage no good street lamp, I can barely see 30-40 meter away object which have no reflective properties."
You'd have a horrible time doing night-time desert rockhounding with me and my buddies if your eyes are that bad. It's usually a 100 meter minimum trek to actual mountain base from our vehicles and we still illuminate the area quite fine with our headlights. Of course, we're using nice LED headlights (and one guy has a sick MK-R lightbar) instead
Re: (Score:2)
The street lights just before where the accident occurred may have blinded the camera too.
Also the bushes and small trees to the left on the median would have blocked lidar and visual L.O.S. until she entered the road *and* the car was within 50'.
The road literally widened from 2 to 4 lanes right where the accident occurred.
https://www.google.com/maps/@3... [google.com]
I don't see how an approaching car or human could have seen her before she was on the road.
I regularly (2-3 times a year) drive up on people who I simply
Re: (Score:3)
https://www.youtube.com/watch?... [youtube.com]
Take a look at 0:30 and beyond, the robocar obviously failed and the driver had lots of time to react if they were paying attention.
Re: (Score:2)
Re:Doesn't matter (Score:5, Insightful)
Experimental technology doesn't work as promised. Total shocker there.
Yes, experimental technology obviously fails every time, but that's not the point. The point is Uber is deploying fail-prone experimental technology in public and it took someone's life.
Re:Doesn't matter (Score:4, Interesting)
The point is Uber is deploying fail-prone experimental technology in public and it took someone's life.
What you just said describes a huge majority of industries. Be thankful you didn't apply that thinking to oil refining or you wouldn't have cars at all.
Re: (Score:3)
Maybe they should have called the system "autopilot" instead of "self-driving"...
Re: (Score:2, Insightful)
There's a right and a wrong way of doing that. In this case, it's absolutely ridiculous that the government is allowing any of these companies to operate AI cars without first demonstrating that they can handle this very problem. Having a car sense something in the road in the same or adjacent lanes and slow or stop is one of the easiest tasks to get right.
Obviously, none of this is truly easy, but if you can't do that, then none of the rest of the stuff makes any sense. What good is keeping a car between the lane marke
Re: (Score:3)
a car will eventually show up and you don't want to blind them.
Thanks for not joining the growing group of people determined to blind everyone.
Re:Doesn't matter (Score:5, Informative)
Forget the Lidar, or lack thereof (were they testing cameras?). Forget the dude (heh, for the first 12 hours everyone thought he was a she. That's gotta hurt).
Had I been driving that car, full alert, I would have killed that chick. I'd have felt bad, even knowing it was her fault. But the fact is, this dumbass walked in front of a fast moving car, at night, when she had no illumination, and the car had headlights. Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.
The released video is very misleading. While the deceased made a stupid decision to cross at that point, it was quite well lit.
No one with even average vision or reflexes would have hit her.
https://www.youtube.com/watch?... [youtube.com]
Re:Doesn't matter (Score:4, Informative)
That video definitely shows better lighting. For reference, the impact happens around the 0:33 mark of the above video, there's a sign on the right side of the road for reference.
But, that video is a little bit over-saturated. The light strings on the bridge look like a solid light, they aren't that bright. You can clearly make out the individual lights when you're actually there. Like usual, the reality is somewhere between these videos. I've been to the theater on that corner many times, and I remember it as being a poorly-lit street.
Re: (Score:2)
I thought the video and several accompanying photos taken nearby were looking too washed out but thought it might have been too much yellowish sodium lights.
But even if it were pitch black at the time, the Uber vehicle has a large roof-mounted LiDAR which should have spotted her hundreds of feet sooner.
Re: (Score:3)
too much yellowish sodium lights.
I think those are the only kind we have in the Phoenix area. Not the best.
LiDAR which should have spotted her hundreds of feet sooner
This is true.
Re: Doesn't matter (Score:3, Insightful)
Re: (Score:3)
> These cars need to recognize that they shouldn't hit anything in the road no matter if it is a person, pothole, or box
That's not always the best strategy. What if you're driving 60 mph and a tumbleweed blows in front of your car? And there is a fully-loaded cement truck behind you. Do you really want to slam on the brakes in that situation? There is more to safe driving than just "avoid hitting things". Sometimes complex decisions need to be made based on the consideration of multiple risk factors.
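The multi-factor judgment this comment describes can be caricatured in a few lines. This is a toy illustration of the trade-off, not any real planner's logic; the harm scores and risk numbers are invented:

```python
# Toy risk model: hard braking is only the right call when the harm of
# hitting the obstacle outweighs the risk of being rear-ended by the
# fully-loaded cement truck behind you. All numbers are invented.

OBSTACLE_HARM = {"pedestrian": 1.0, "box": 0.3, "tumbleweed": 0.0}

def should_brake_hard(obstacle: str, tailgater_close: bool) -> bool:
    harm_if_hit = OBSTACLE_HARM.get(obstacle, 0.5)  # unknown object: assume moderate
    rear_end_risk = 0.4 if tailgater_close else 0.05
    return harm_if_hit > rear_end_risk

print(should_brake_hard("pedestrian", True))   # brake for people regardless
print(should_brake_hard("tumbleweed", True))   # coast or swerve instead
```

The point of the sketch is only that "avoid hitting things" becomes a comparison of risks, not an unconditional rule, once traffic behind you enters the picture.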
Re: (Score:3)
why forget the driver? a Convicted armed robber http://www.dailymail.co.uk/new... [dailymail.co.uk]
no illumination? this is how it looks to human eye https://discourse-cdn.freetls.... [fastly.net]
Re: (Score:3, Informative)
"Vasquez has felony convictions for attempted armed robbery after plot with Blockbuster video store co-worker to seize their own shop's taking's at gunpoint"
BLOCKBUSTER. That crime must have been AGES ago.
"Vasquez was convicted under her original name Rafael but now identifies as a woman"
Well, that makes me feel less guilty about assuming it was a man driving when I saw the video. XY chromosomal pair is still present.
Re: (Score:3)
Her best hope of survival was a 100% functioning self driving car, anything less and she's dead.
That's not entirely true, but that's the whole issue. The simple fact is that this situation - clear weather, dry road, no traffic, no light, obstacle in the road - is exactly the situation where any self-driving car (it doesn't need to be level 5 or whatever) should excel. The article is exactly right - one or more systems had a catastrophic failure. The car should have come to a complete stop if necessary before the driver ever saw the woman crossing the street.
Personally, I think regulation is require
update quickly = new car each 2-4 years no regulat (Score:2)
Update quickly = a new car every 2-4 years, with no regulation saying updates must be free for at least 9-12 years.
Re: (Score:3)
Re: (Score:2)
No, having multiple competing technologies is inarguably a good thing... But the emphasis should be on minimum standards, not creating a single, standard set of software. Otherwise bad things will happen.
Ahhh. Right, buy car X, 20% less likely to run over pedestrians or kill occupants based on our patented algorithm that no one else can use. Well, you said right there "inarguably", so I guess I can't argue with that logic.
the human population potentially collapses overnight.
I'm glad we're keeping things in perspective. Surely there isn't a happy medium that humans are capable of creating, right? It's either one extreme or the other. I don't know, maybe we can take some of the lessons we've learned with all of our hardened distros and the countless failur
Re: (Score:2)
Part of the problem is that programming a car to safely drive autonomously on a limited-access freeway is ENORMOUSLY easier than programming a car to do the same thing on a "normal" road, unless the "normal" road is completely gridlocked & the car is just creeping along a few feet at a time.
High-speed (but non-limited-access) divided highways with grade crossings and pedestrians are still very much in the "experimental" zone when it comes to autonomous vehicles.
There are common situations where autonomo
Re: (Score:2)
Forget the dude (heh, the first 12 hours thought he was a she. That's gotta hurt).
Ok, after extensive consultation with my wife, and since no one else had addressed this, I feel the need to. I don't remember which article I first read, but I remember the driver being a woman, and thought that until I saw the video, when I thought to myself, "well, she's not all that attractive, but I don't want to be insensitive." Then I showed my wife the video, and she immediately pointed out that wasn't a very attractive woman. I again remained silent, so as not to sound insensitive. But in the ba
Re: (Score:2)
Had I been driving that car, full alert, I would have killed that chick.
You routinely hit stationary objects in the middle of the road? Just how poor is your eyesight?
Re: (Score:2)
I've nearly had
Re: (Score:2)
<snip list of excuses>
So slow the fuck down!
Re: (Score:2)
Re: (Score:2)
That's not really the point here (although it might be valid in other discussions).
The LIDAR/radar/whatever doesn't care much about lighting conditions, so understanding the cause of the failure is important. If the cause isn't linked to lighting, then this situation (a pedestrian crossing where they shouldn't) could trigger the same lack of reaction in broad daylight if the conditions are right. This is a serious issue.
It would be better for everyone (except maybe for short-term gains from carmakers) to crowds
Re: (Score:2)
Maybe you are just a very shitty driver.
Re:Doesn't matter (Score:5, Insightful)
The fact that she had a bicycle leads me to believe that she was not blind. And based on her walking speed, I doubt that she was being chased. However, IF it was an electric car, she might have misjudged the vehicle's distance and speed.
What I saw on the video was an inattentive "driver", looking down for a full 5 seconds just before impact, and not hitting the brakes or making any attempt to avoid the pedestrian. I suspect that the "driver" was lulled into believing that the car was better than it really was, and therefore, behaving like a passenger rather than a driver.
Very sad.
Re:Doesn't matter (Score:4, Insightful)
Exactly. The car was a prototype, which by its very nature could be expected to fail in unexpected ways. Thus, you put a human in the car as a backup whose sole job is to remain attentive to react to any sudden failure of the car's self driving capabilities. Clearly, the emergency backup driver was treating the car as a complete functional autonomous vehicle rather than a prototype, so this is kind of on her.
Re:Doesn't matter (Score:5, Insightful)
Exactly. The car was a prototype, which by its very nature could be expected to fail in unexpected ways. Thus, you put a human in the car as a backup whose sole job is to remain attentive to react to any sudden failure of the car's self driving capabilities. Clearly, the emergency backup driver was treating the car as a complete functional autonomous vehicle rather than a prototype, so this is kind of on her.
No, no... this has been said so many times, that when people keep ignoring it, it's no longer funny.
Not even if you paid people a huge salary to remain fully attentive would you get an attentive driver. The brain is remarkably adept at learning not to care about the car when it drives itself. One learns that "something else will do this, I don't need to" most of the time, and that robs you of countless *seconds* of reaction time in an event like this. By the time you realize you really do need to act, the accident has already happened.
This is what everybody has been saying all along. No, human backups are unreliable and unworkable. You have to perfect the technology first in controlled conditions, because in the field the risk *is* high. This is NOT a task for agile methodologies.
Re: Doesn't matter (Score:3)
You have to perfect the technology first in controlled conditions
Right, because that's totally a doable thing. Sorry, guys, we're not going to the moon. Gotta get these rocket thingies 100% safe first in controlled conditions.
Re: (Score:2)
Re: (Score:3)
Right, because that's totally a doable thing.
It totally is. Others in the industry (e.g. Google/Waymo) have done it. They use a combination of starting first in safer conditions (closed environments; safer, slower roads; multiple, attentive safety drivers on short shifts) and massive use of simulation. The thing about self-driving systems is that you can take a real data feed and replay it as many times as you want, and you can also alter it or generate fake data to provide a realistic situation that is more challenging.
There's a reason that Google
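The replay idea the parent describes (re-running a recorded sensor feed, optionally altered, through the perception stack) can be sketched minimally. All names here are hypothetical, and a "frame" is reduced to a single pedestrian distance purely for illustration:

```python
import random

def replay(log, perception, perturb=None, seed=0):
    """Feed logged frames through a perception function, optionally altered.

    The same log can be replayed any number of times, with different
    perturbations, to synthesize harder variants of the same real scene.
    """
    rng = random.Random(seed)  # seeded so perturbed runs are reproducible
    detections = []
    for frame in log:
        if perturb:
            frame = perturb(frame, rng)
        detections.append(perception(frame))
    return detections

# Toy stand-ins: a "frame" is just a pedestrian distance in metres.
log = [60.0, 45.0, 30.0, 15.0]
perception = lambda d: d < 50.0                 # "obstacle within range?"
noisy = lambda d, rng: d + rng.uniform(0, 5)    # simulated sensor noise

print(replay(log, perception))          # baseline run over real data
print(replay(log, perception, noisy))   # perturbed re-run of the same log
```

The design point is that real data is recorded once but exercised many times, which is how simulation multiplies the value of every mile actually driven.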
Re: (Score:2)
Well you've been in a dark corner of the internet, haven't you?
Re: (Score:3)
For passenger transport the solution of automatic trans
Re: (Score:2)
Not even if you paid people a huge salary to remain fully attentive, would you get an attentive driver.
It can be done. This situation is hardly unique, and there are lots of tried and tested solutions.
GM has gaze tracking technology for their hands-off Supercruise system, and if it notices you are not looking out of the window at the road it starts demanding your attention and eventually shuts down. Uber could have used that tech and fired anyone who didn't have a 99% attentive rate.
In Japan they use a system where the operator has to point to things they are checking and sometimes say the name of that thing
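A driver-attention watchdog of the kind gaze tracking implies can be sketched as a simple state machine. This is a toy model with invented thresholds, not GM's actual Supercruise logic:

```python
def monitor(gaze_samples, warn_after=20, disengage_after=50):
    """Toy attention watchdog over gaze samples taken at 10 Hz.

    gaze_samples: booleans, True = eyes on road.
    Returns 'ok', 'warning', or 'disengaged'. Thresholds are invented:
    warn after ~2 s continuously off-road, disengage after ~5 s.
    """
    off = 0
    state = "ok"
    for on_road in gaze_samples:
        off = 0 if on_road else off + 1   # looking back at the road resets the count
        if off >= disengage_after:
            return "disengaged"
        state = "warning" if off >= warn_after else "ok"
    return state

print(monitor([True] * 30))     # attentive driver
print(monitor([False] * 25))    # warning: ~2.5 s off the road
print(monitor([False] * 60))    # disengaged: sustained inattention
```

Even a crude monitor like this would have flagged a driver who spent most of the pre-impact seconds looking down.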
Re: (Score:2)
Just play a fucking annoying sound every time the driver doesn't watch the road and doesn't have his foot on the brake pedal.
I don't care if they're attentive, but they'd better be watching the road.
Re: (Score:3)
I have mixed feelings on your posting. On the one hand, our aviation experience tells us that humans do not perform well monitoring automation (we get lulled into complacency) and it takes much longer for us to come back up to speed and deal with an issue when we were just monitoring, versus when we were doing the task ourselves.
That said, the first thing I thought when I saw the video was that the person was there to ensure the vehicle didn't get into an accident, and yet the person wasn't paying any atten
Re: (Score:3)
Nobody can keep their attention focused for hours at a time when there is nothing to do. This is the fundamental problem with expecting a human to remain alert and focused on being ready to intervene in driving. When yesterday was eight hours of nothing to do, and the day before that, and the day before that, people are going to let their guard down. Either the human needs to remain involved in driving the vehicle directly, or they are just a meat ornament because we humans just aren't good at that sort of
Re: Doesn't matter (Score:2)
But I can't imagine there's any human who could keep perfect attention and ready to take over while being driven by an automatic car. A driver keeps attention on the road by having a continuous feedback of input and output. What they're asking their drivers to do is pay perfect attention without any feedback for I presume hours on end. Human drivers fall asleep if the road has too few turns... this is far worse. I doubt you could get the safety driver to focus for more than 30-60 minutes. It's simply too bo
Re:Doesn't matter (Score:5, Insightful)
However, IF it was an electric car, she might have misjudged the vehicle's distance and speed.
That doesn't matter. She was crossing a road with 4 lanes coming at her, and when she stepped off the curb she should have seen the car on the Mill bridge. That bridge is covered in lights, too; you can see the bridge lights on the internal camera behind her. There's no reason the woman crossing the completely dark section of road wouldn't have been able to look towards Tempe and see the car coming. And who crosses 4 lanes, in the dark, with a car approaching, without looking at the car to see if they need to pick up the pace? She never looked at the car. The Uber car obviously experienced some sort of catastrophic failure, but in the same way that this is a case study for computer science or engineering students, it's also a case study on how not to cross a street. The woman easily could have avoided being hit. I expect her to avoid getting hit the same way I expect the autonomous car to not hit her.
Re: (Score:3)
Yes and no. You're seeing a recording that almost certainly has a lot less dynamic range than the raw pixel data from the sensor. I would assume that the self-driving tech uses raw pixel data, not a JPEG/MPEG-compressed approximation thereof.
If you take a photo in RAW mode on a DSLR, you can crank the gain up by two or three stops and see all sorts of stuff in the shadows that would otherwise not be visible within the color gamut of your monitor or a JPEG rendering. And even with the smaller cameras that
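"Cranking the gain up by two or three stops" is just multiplication in linear light: each stop doubles the recorded values. A minimal sketch in plain Python, with sensor values normalized to the 0.0 to 1.0 range:

```python
def push_stops(raw_linear, stops):
    """Brighten linear sensor values (0.0 = black, 1.0 = full scale) by
    N photographic stops, clipping at full scale. Each stop doubles exposure."""
    gain = 2.0 ** stops
    return [min(1.0, v * gain) for v in raw_linear]

# A pedestrian in shadow at ~1-5% of full scale: an 8-bit video encode maps
# 1% to roughly level 2 of 255 (effectively black), while 12- or 14-bit RAW
# retains enough precision that a +3 stop push leaves real detail to see.
shadow = [0.01, 0.02, 0.05]
print(push_stops(shadow, 3))   # 8x gain on the shadow region
```

This is why a pushed RAW frame can reveal a figure that the compressed dash-cam video renders as pure black.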
May have failed? (Score:2)
Just my 2 cents
Re: (Score:2)
The point of the safety driver is so the estate has someone to sue that isn't Uber.
An unfortunate coincidence of failures (Score:5, Interesting)
There were multiple failures all around which caused this death. If any one of those failures had not happened then the pedestrian would likely still be alive today.
I've summed it up here [aardvark.co.nz] in a column which was written almost 24 hours ago so it's nice to see that others have come to similar conclusions.
Re: (Score:2)
This is a Shakespearean tragedy, no character does anything correctly.
1. A woman, homeless, crossing the street, with a bicycle. In her trek across 4 lanes worth of street, she appears to never even glance towards the one thing that could ruin her day.
2. A "safety driver", who is neither of those things.
3. A vehicle, controlled by Uber, which completely shits the bed.
Re: An unfortunate coincidence of failures (Score:4, Funny)
To prove that we're not heartless we have posthumously given her a Darwin Award. That ought to set your mind at ease.
volvo not so good. (Score:2)
Hopefully not based on Google self-driving tech... (Score:2)
...or due to changes made in wake of the Google-Uber lawsuit settlement. But the questions probably need to be asked, in light of statements made by both companies: 1. A note on our lawsuit against Otto and Uber [medium.com]: "Recently, we uncovered evidence that Otto and Uber have taken and are using key parts of Waymo's self-driving technology." 2. Uber and Waymo Reach Settlement [uber.com]: "We are taking steps with Waymo to ensure our Lidar and software represents just our good work."
What you don't see - when did movement start (Score:5, Interesting)
I agree the LIDAR should have been able to see her before she entered the light.
However, what we still do not know is - when did she start moving?
If she was just standing in the left lane waiting to cross, the LIDAR may have seen her and just thought "well that lane is blocked, stick to this one". With the bike she might have looked like a barricade of some kind.
It could still be she started moving around the time we see her in the video, which means she essentially jumped in front of the car...
There could be a reason for her to do that - what if the car saw her, and slightly slowed out of caution? We know the car was going well under the limit when it hit, that could be a sign the car slowed down a bit prior.
The human, seeing the car slow slightly, might have assumed it saw her and was going to stop to let her cross. So it could easily be a case of mixed signals, with the car's cautious actions in the end being a bad thing; when driving, an over-abundance of caution can often have bad consequences.
I'm still not sure the human safety driver would have seen her, though. Even though humans do have better dynamic range than cameras, there still are times when you really can't see outside the headlights, and the woman crossing was wearing all dark clothing.
Re: (Score:2)
That confirms my point (Score:2)
That video is somewhat brighter than what a real driver would see.
But even so, look at your video at 33 seconds - that's the point where the much wider camera in the Uber catches sight of her shoes in its lane (at about 7 seconds in this video [theguardian.com]). The pedestrian is in shadow, and if she wasn't moving before, it would be very easy not to see her until she started moving. If you go back a bit further in the video she would be standing right between some bright background lights, making it even more difficult to
Re: (Score:2)
Not that time, the time before that (Score:2)
Look at the video again. As soon as the driver looked up, she saw the pedestrian.
When she looked up and saw the pedestrian, she was well past the bridge, and it was way too late to do anything. She basically looked up as the car hit the person.
I was talking about the time she looked up BEFORE that time (in the driver video she looked up from her phone several times), which you can clearly tell by the poles passing by outside her window is when she is passing under the bridge, and match that with the other b
Re: (Score:3)
The timeline is thus: She looks up as she is going under the bridge, looks down again, then a few seconds later she looks up basically as the car is hitting the woman.
Yeah, the driver was totally negligent in any case, looking at her phone more than at the street. It's true a human can't pay perfect attention all the time, but any human can put their phone down.
Re: (Score:2)
Re: (Score:2)
There's way too many factors for me to judge. I'll wait to see what the Uber tech's say.
As you mentioned, the Uber might have seen her on LIDAR, and hey... she had a bike -- maybe the Uber thought she was riding the bike in her lane and never predicted she might move in front of the Uber. Or maybe the Uber had a glitch (it happens... that's what the driver is there for).
The driver certainly wasn't paying as much attention to the road as he/she should have. Some states have laws against distracted dri
Re: (Score:2)
However, what we still do not know is - when did she start moving?
If you look at the video closely, you can see the pedestrian's feet, allowing you to have better insight onto that question.
Re: (Score:3)
If she was just standing in the left lane waiting to cross, the LIDAR may have seen her and just thought "well that lane is blocked, stick to this one". With the bike she might have looked like a barricade of some kind.
In which case it should have slowed down. A human driver knows that lanes are not randomly barricaded with no prior warning like signs and cones. It should be a huge red flag, a clear indication that something unusual and potentially dangerous is happening.
Re: (Score:3)
Road crews are often made up of a large assortment of complete fucking idiots.
You should see programming teams.
Re: (Score:2)
Re: (Score:3)
I agree the LIDAR should have been able to see her before she entered the light.
However, what we still do not know is - when did she start moving?
LIDAR definitely would have seen her. It would be nearly impossible for it not to. LIDAR isn't the problem; the problem is what the software did with the information provided by LIDAR. The software either wrote the pedestrian off as a false positive or failed to take into account her path and how it intersected with the vehicle's path.
However, what we still do not know is - when did she start moving?
We can extrapolate a likely answer from her movement speed before she was hit.
Better tech (Score:3)
Total failure (Score:2)
The most incriminating aspect of this video is... (Score:3)
The car had a single occupant, who was not focused on the road. In a test vehicle, regardless of how autonomous the vehicle is, if one person is required to monitor computer systems whilst the car is travelling, they need a second occupant to monitor the road.
The Lidar system should work with no light at all; it's an infrared laser system, it emits its own light! In fact, if anything, it would probably work better in complete darkness!
This was obviously a major technology failure, but it needn't have resulted in someone's death.
The greatest failure here is not in the technology but in Uber's testing procedures.
Re: (Score:2)
Either the driver actively drives the car, or the car drives itself... there's no in between. We're humans, not robots...
This person was looking at their Facebook the whole time; the next one will nod off, etc.
Re: (Score:3)
When the car is deemed ready and put into production and so on, yes, absolutely.
But this was a *test driving* situation, during which unexpected and possibly dangerous situations may occur. It is irresponsible of Uber to not have two people in the car at all times, one to monitor the road and take evasive action if necessary, and one to monitor the systems. One person cannot do both at the same time.
Re: (Score:2)
This is not only a failure in Uber's testing procedure. It is a failure of the whole concept of autonomous "driver assist only" systems. There are a lot of people out there who would do the exact same thing in, e.g., their Tesla: let their attention slip for a while and let the system do its job, because humans are lazy, and we are bad at drawing conclusions - nothing bad happened the first two weeks we drove the car, so nothing bad is going to happen after that either, right?
This is also th
Safety Driver - Flawed Idea? (Score:2)
Re: (Score:2)
Perhaps the car could generate simulated obstacles at random intervals for the driver to react to.
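That would amount to a randomized attentiveness check: at unpredictable intervals, present a phantom obstacle the safety driver must acknowledge within a deadline. A hypothetical scheduler sketch (the function, the mean gap, and the shift length are all invented for illustration):

```python
import random

def schedule_phantom_checks(shift_seconds, mean_gap=300, seed=None):
    """Return randomized times (s) at which to show a simulated obstacle
    the safety driver must acknowledge.  mean_gap is the average spacing;
    exponential gaps make the next check unpredictable."""
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += rng.expovariate(1.0 / mean_gap)  # Poisson-like random gaps
        if t >= shift_seconds:
            return times
        times.append(t)

checks = schedule_phantom_checks(4 * 3600, mean_gap=300, seed=42)
print(len(checks))  # roughly 4*3600/300 ~= 48 checks in a 4-hour shift
```

The unpredictability is the point: a driver who knows a check may come at any moment has an incentive to keep their eyes up, in a way that passive monitoring never enforces.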
But the most important thing that failed...... (Score:2)
The overall consensus among experts is that one or several pieces of the driverless system may have failed, from the LIDAR system to the logic system that's supposed to identify road objects, to the communications channels that are supposed to apply the brakes, or the car's automatic braking system itself.
But the most important thing that failed was the human driver, who was supposed to take over control in an emergency when all systems fail to work properly. That destroys a long-held "get out of jail free" card that creators of self-driving cars use whenever anyone challenges them on the tech.
Are self-driving newbies skimping on testing ? (Score:2)
I'm no expert, but I've seen a lot of news about self-driving cars over the last ten years or so.
At first it was DARPA, then Google, and they seemed to be testing for years.
Then Tesla, and now Uber?
I guess Uber suddenly realized that self-driving cars might be a threat to their business in the long term.
So they have notoriously been caught with stolen code and poaching engineers.
Seems like they are playing catch-up.
That usually means skipping testing.
Maybe the states should require that self-driving
Trees and Bushes on the median (Score:2)
Pull up google maps.
https://www.google.com/maps/@3... [google.com]
Observe the sign that says not to cross there (one on the other side of the median too btw).
Observe the road widens from 2 to 4 lanes right before where the accident occurred.
Observe that trees and bushes on the edge of the median to the left would have blocked Lidar until the car was within 50' of where the accident occurred.
Confirm the scene against the video with the sign that reads "Begin right turn lane - Yield to Bikes."
Understand that it refers to bik
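If the trees and bushes really did occlude the sensors until ~50 feet out, the arithmetic is stark. A quick check, assuming the widely reported ~38 mph vehicle speed (the 50' figure is from the parent post, not a measurement):

```python
occlusion_dist_ft = 50              # claimed distance at which LIDAR clears the bushes
speed_mph = 38                      # reported speed of the Uber vehicle

dist_m = occlusion_dist_ft * 0.3048
speed_mps = speed_mph * 0.44704
time_available = dist_m / speed_mps
print(round(time_available, 2))     # seconds from clearing the bushes to impact
```

That is well under a second, less than even the best human reaction time cited in the summary, which is why the occlusion question matters so much to assigning blame.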
Auto-pilot will not be good enough for a long time (Score:2)
IMNSHO, "autonomous driving" is a huge fad - the idea is good, but it is _way_ ahead of its time.
The problem is all those situations in traffic that require understanding a situation, processing what is going on, predicting potential risks, and responding appropriately. One example - you are driving and up ahead on the sidewalk there are two children. Do you slow down? What are the children doing? Do they seem to be behaving erratically, or are they engaged in playful behaviour like pushing each other around a
Re: (Score:2)
IMELPHO*, AI already has reached and even exceeded human levels in some areas. It remains laughably far away in others. But I'll echo others' observation that it does not have to be perfect in order to be a safer alternative to human drivers. It just has to be better than they are. And that's a very low bar. I'm not sure we're there yet, but I think we're close, closer than I'd have thought possible 10 years ago, and I think we'll get there. Sadly, but just as in other human endeavors, there will be m
Re: (Score:2)
The lesson for the dead person would have been to look both ways. The lesson for the rest of us is that Uber's self driving technology is not ready for prime time, for whatever reason(s).
Re: Yeah, no (Score:4, Insightful)
The lesson for the rest of us is that Uber's self driving technology is not ready for prime time, for whatever reason(s).
I dunno ... I mean even if we assume that a human driver could have avoided this particular accident, that doesn't mean the technology isn't still an improvement over human drivers in other cases. You'd need a lot more data to reach that conclusion. It could very well be that lives saved in other, more common types of preventable accidents massively outweigh the lives lost in these types of abnormal occurrences.
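Comparing the technologies means comparing rates, not single incidents. A back-of-envelope sketch; the human figure is a rough publicly cited US ballpark, and the autonomous mileage is purely a placeholder, not Uber's actual number:

```python
human_fatals_per_100m_miles = 1.2   # rough US ballpark fatality rate
av_miles = 3_000_000                # placeholder test-fleet mileage
av_fatalities = 1

av_rate = av_fatalities / av_miles * 100_000_000
print(round(av_rate, 1))   # fatalities per 100M miles at that mileage
print(round(av_rate / human_fatals_per_100m_miles, 1))  # ratio vs humans
```

At small mileages a single fatality dominates the rate and the confidence interval is enormous, which is exactly why "you'd need a lot more data" is the right conclusion.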
Re: (Score:3)
So if it was daylight, where the SNR of the LIDAR is worse with ambient light hitting the sensor, and the car hit her, where would you place the blame? Because that is exactly what would have happened: the LIDAR system would still have missed her. The system failed, period, full stop. It is as bad as when Intel released the processor that did not always add right. I know the tech world is spending billions to make this work, and it seems to be every techie's wet dream for it to work, but it does not work yet. And depl
Re: Right of way?? (Score:2)
Back in the day when I learned to drive the rule was that pedestrians ALWAYS have the right of way -- even if they're crossing in the middle of the road.
Yeah, lots of people "teach" that, but it's pure insanity. By that reasoning if some maniac jumps out onto a major highway right in front of your car while you're doing 100 km/h, it's your fault. If any country actually has such laws it's a place run by lunatics.
Re: Right of way?? (Score:2)
Not insanity by any means. It doesn't mean that the driver is always at fault -- that would be insane -- but rather that since the driver is in control of a vehicle that will easily kill a pedestrian, the driver is expected to yield by default.
No. Right of way means exactly what it says; you have the right to take a given way. In your scenario the pedestrian has the right to walk out onto a major highway. If they have that right, then anyone who hits them is automatically at fault.
I actually think the law stated above is more problematic as it seems to allow a driver to willingly kill a pedestrian with the excuse that they were crossing outside of the crosswalk.
Again, no. There are other laws to deal with that; specifically murder laws. The fact that someone is breaking the law does not automatically give you the right to commit murder.
The original meaning of "right of way" applied to passage through owned property. Eg.
Re: (Score:2)
Obligatory funny video to illustrate the point: https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
28-793. Crossing at other than crosswalk
A. A pedestrian crossing a roadway at any point other than within a marked crosswalk or within an unmarked crosswalk at an intersection shall yield the right-of-way to all vehicles on the roadway.
Re: (Score:2)
Probably looking at system information for the car itself.
Re: (Score:2)
If you're blamed for killing a pedestrian, you may have your ass handed to you.
Re: (Score:2)
Why wouldn't the systems have stereo vision? That was one of the earliest successes of machine vision in the 1970s. Relatively easy to do, once you match up the features on both views.
I have never understood the need for Lidar.
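Stereo depth is indeed simple once correspondence is solved: depth = focal length × baseline / disparity. A toy sketch (the camera parameters here are invented, just to show the geometry):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a matched feature pair: z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("feature at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# 700 px focal length, 0.5 m camera baseline, 10 px disparity:
print(stereo_depth(700, 0.5, 10))  # 35.0 metres
```

The hard part, and the argument for LIDAR, is the "once you match up the features" step: at night, on a low-texture road surface, there may be nothing to match, whereas LIDAR supplies its own illumination and measures range directly.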