Waymo CEO Expresses Confidence Its Cars Wouldn't Have Killed Elaine Herzberg (washingtonpost.com) 141

theodp writes: Nearly a week after an autonomous Uber SUV claimed the first life in testing of self-driving vehicles, The Washington Post reports that Waymo CEO John Krafcik says he is confident its cars would have performed differently under the circumstances (Warning: source may be paywalled; alternative source), since they are intensively programmed to avoid such calamities. "I can say with some confidence that in situations like that one with pedestrians -- in this case a pedestrian with a bicycle -- we have a lot of confidence that our technology would be robust and would be able to handle situations like that," Krafcik said Saturday when asked if a Waymo car would have reacted differently than the self-driving Uber.

In explaining its since-settled lawsuit against Uber last year, Google charged that Uber was "using key parts of Waymo's self-driving technology," and added it was "seeking an injunction to stop the misappropriation of our designs." In announcing the settlement of the lawsuit last month, Uber CEO Dara Khosrowshahi noted, "we are taking steps with Waymo to ensure our LIDAR and software represents just our good work." A Google spokesperson added, "We have reached an agreement with Uber that we believe will protect Waymo's intellectual property now and into the future. We are committed to working with Uber to make sure that each company develops its own technology. This includes an agreement to ensure that any Waymo confidential information is not being incorporated in Uber Advanced Technologies Group hardware and software." All of which might prompt some to ask: was Elaine Herzberg collateral damage in Google and Uber's IP war?
"I want to be really respectful of Elaine [Herzberg], the woman who lost her life and her family," Krafcik continued. "I also want to recognize the fact that there are many different investigations going on now regarding what happened in Tempe on Sunday." His assessment, he said, was "based on our knowledge of what we've seen so far with the accident and our own knowledge of the robustness that we've designed into our systems."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Seems legit (Score:3, Funny)

    by PeterGM ( 5304449 ) on Sunday March 25, 2018 @10:38AM (#56322993)
    I'm sure this statement is made in absence of any bias or potential for personal gain.
  • Soo.. (Score:2, Insightful)

    by Anonymous Coward

    Waymo killed Elaine by forcing Uber to take out the parts that worked..

    • Re: (Score:2, Insightful)

      by inking ( 2869053 )
      Waymo didn’t force anyone to operate the vehicle on a public road. This one is on Uber. Couldn’t have happened to a better company. They have been avoiding safety regulations for years and now it got someone killed.
    • Re: (Score:2, Interesting)

      by Gavagai80 ( 1275204 )

      Intellectual property kills. Drug patents kill millions every year. Self driving car patents will kill people too. The only solution would be legislatively-enforced openness/collaboration, done in a careful way that ensures there's enough profit incentive left for developing the tech.

  • What amazes me (Score:5, Insightful)

    by rsilvergun ( 571051 ) on Sunday March 25, 2018 @10:54AM (#56323065)
    is how good the damage control from Uber was. They got videos out fast with pitch-black cameras that made it look like she came out of nowhere. Several days later, videos popped up from locals showing the stretch of road was actually well lit. Even now I'm having a tough time finding those videos. There are stories now saying Uber's cars are behind Waymo's, but I'm only just now seeing stories that say Uber should have avoided the crash. The first several /. posts about this story were riddled with comments from folks saying the crash was unavoidable and the pedestrian was completely at fault.

    I think some of this is the media at large siding with corporations to our detriment. The big outlets (CNN, Fox, MSNBC) have long since stopped covering the story on their front-page websites, even as a single link. There's a little bit of left-wing press, but I heard about those videos showing how well lit the road was from a post on Ars Technica that was in my feed.

    Based on this I'm guessing that most people who don't read /. are going to end up assuming this was just an unavoidable accident caused by a crazy old homeless woman (a fact that was emphasized in many stories I read). I can't help but think we're being manipulated to think these cars are safer than they really are.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      This is less about Uber's damage control and more about how moronic and whipped the US public is. It doesn't matter how dark the road was; even with no lights at all, the car should have been able to see her using those other sensors. These AI cars are supposed to be better at driving than we are because they can monitor what's going on 360 degrees around the car and aren't subject to the same light limitations that we are.

      I do wonder a bit if the dashboard cam wasn't deliberately set for day and left

      • Re:What amazes me (Score:5, Insightful)

        by fluffernutter ( 1411889 ) on Sunday March 25, 2018 @11:09AM (#56323141)
        That's what I don't get. It would be so simple to certify for these kinds of things before allowing these cars on public roads. Everyone acts like it would be such an impediment. I have to demonstrate I can see without glasses for my license; otherwise I have to drive with glasses. Why does a self-driving car not have to make a basic demonstration of visual/sensor skills before being put on the road? Since self-driving largely relies on the superiority of its sensors for its safety, there should be much higher expectations for sensor testing than anything a human could actually pass.
        • Why does a self-driving car not have to make a basic demonstration of visual/sensor skills before being put on the road?

          What test do you propose? I'm willing to bet they will all pass with flying colours. That's the problem with standard testing. A friend of mine is in a similar but human scenario. He is blind as a bat but not restricted to driving with glasses. Why? He remembered what the bottom few lines of the eye chart said and just recited them.

          Just look at how good we are at emission controls.

        • by Kjella ( 173770 )

          That's what I don't get. It would be so simple to certify for these kinds of things before allowing on public roads. Everyone acts like it would be such an impediment.

          I think you'd just be wasting a ton of resources testing a non-problem. The sensors are designed to be capable enough and throw errors if they malfunction as they either go dark or produce gibberish. It seems extremely unlikely that you'd have a sensor glitch hide a person crossing the road while everything else looks normal. It's the logic layer that's difficult and formally retesting that for every tweak would be an impediment, it'd be like retesting a human driver every time he learns something new. At b

          • The sensors are designed to be capable enough and throw errors if they malfunction

            You're still using that line even after sensors have clearly failed and the driver was none the wiser.

    • Re:What amazes me (Score:5, Insightful)

      by quantaman ( 517394 ) on Sunday March 25, 2018 @12:01PM (#56323381)

      is how good the damage control from Uber was. They got videos out fast with pitch-black cameras that made it look like she came out of nowhere. Several days later, videos popped up from locals showing the stretch of road was actually well lit. Even now I'm having a tough time finding those videos. There are stories now saying Uber's cars are behind Waymo's, but I'm only just now seeing stories that say Uber should have avoided the crash. The first several /. posts about this story were riddled with comments from folks saying the crash was unavoidable and the pedestrian was completely at fault.

      I'm not sure it was that good. The point of damage control is to find the least damning narrative and make that the one that sticks. Here Uber's first narrative was:

      "Not our fault! The pedestrian came from the next lane and appeared out of nowhere! Totally unavoidable!!"

      The moment they put out the crappy video a bunch of us could tell Uber was lying [slashdot.org]. Now their corporate credibility has taken another hit and they don't control the narrative anymore.

      Now imagine Uber's narrative was more like:

      "While the pedestrian was legally at fault our vehicle should have avoided the accident, and barring that, the safety driver should have been more attentive and avoided the situation. We are suspending all tests until we have determined the nature of the failure and taken steps to make sure it won't be repeated."

      The beauty of that narrative is it's consistent with an accident that's almost impossible to avoid, it just sounds like Uber is being really accepting of blame. And then when it comes out they really should have avoided the accident... well the statement is still true, so it doesn't really trigger another news cycle or destroy their credibility.

      I think some of this is the media at large siding with corporations to our detriment. The big outlets (CNN, Fox, MSNBC) have long since stopped covering the story on their front-page websites, even as a single link. There's a little bit of left-wing press, but I heard about those videos showing how well lit the road was from a post on Ars Technica that was in my feed.

      It's more to do with the news cycle: news outlets only do investigative reporting when their viewers really care; otherwise they just cover events. A self-driving Uber killing a pedestrian is an event, and they covered it with the sources that were available, Uber and the PD, and both backed Uber's narrative. Now, for the investigative portion, only the technical press really cares (Ars Technica, Slashdot). But if that investigation turns into another event, i.e. the police making another announcement or a lawsuit on behalf of the victim, well, that's an event again and Uber's BS gets called out by the mainstream headlines.

      • by MrL0G1C ( 867445 )

        The victim's stepdaughter is 'retaining' lawyers regarding the death.

      • That sort of excuse making has become common (maybe because apologizing doesn't work anymore). Tell three lies and two excuses and hope 12 percent of the population will believe each one, then you're still ok. I would say it's ridiculous but it's remarkably effective.
      • Now imagine Uber's narrative was more like:

        "While the pedestrian was legally at fault our vehicle should have avoided the accident, and barring that, the safety driver should have been more attentive and avoided the situation. We are suspending all tests until we have determined the nature of the failure and taken steps to make sure it won't be repeated."

        But that was their narrative, almost word for word. They immediately suspended all tests and put out a press release saying the above. Uber is a horrible c

        • Now imagine Uber's narrative was more like:

          "While the pedestrian was legally at fault our vehicle should have avoided the accident, and barring that, the safety driver should have been more attentive and avoided the situation. We are suspending all tests until we have determined the nature of the failure and taken steps to make sure it won't be repeated."

          But that was their narrative, almost word for word. They immediately suspended all tests and put out a press release saying the above. Uber is a horrible company, but in this case they've done exactly what you suggested. The problem is that they could've easily foreseen this accident if they hadn't been cutting corners and trying to pretend their tech was better than it was for the sake of the next round of funding.

          I think it's a bit of both. Uber was mostly silent while the police put out a very Uber-friendly (or homeless-pedestrian-hostile) statement about the crash [theverge.com] which became the only narrative.

          I can accept Uber got a bit unlucky with how the PR played out. You don't want to say a lot during the investigation, and if the cops are being nice it doesn't really occur to you that you need to speak up to lower expectations.

          Plus, for whatever reason the video is so crappy, it makes it look like Uber is trying to pull a fast

    • by g01d4 ( 888748 )

      that most people who don't read /. are going to end up assuming this was just an unavoidable accident caused by a crazy old homeless woman

      I'd think most people would assume that a human driver would have a better view. That the accident was unavoidable would be due to Uber relying exclusively on such poor video (if that indeed was what they did). Most people would also put some blame on the "crazy [or inebriated] old homeless woman" who couldn't be bothered to cross in a legal fashion and wouldn't yield to

    • by Anonymous Coward

      Video links and a technical discussion here Engineering Tips - Engineering Failures & Disasters - Self Driving Uber Fatality [eng-tips.com]

    • by imidan ( 559239 )

      most people who don't read /. are going to end up assuming this was just an unavoidable accident caused by a crazy old homeless woman

      I haven't read the news on this very closely, so I have no idea whether the woman was crazy, old, homeless, high, or whatever. When I watched the video, I did assume that she had some level of cognitive impairment for one reason or another because otherwise she probably would have checked for traffic before walking in front of it.

      Doesn't mean she *deserves* death, but regardle

      • by MrL0G1C ( 867445 )

        The fact that she was attempting to cross a road isn't in dispute; whether or not the autonomous vehicle could have stopped is in dispute. That said, pretty much every expert has said the car should have stopped, and I very much agree: the road was well lit, and the woman had crossed several lanes and was nearly finished crossing the road when the vehicle hit her without braking. And the car also had radar and lidar.

    • I've not been paying attention, but what I've gathered is it was way dark and she "just appeared out of nowhere." I've had similar surprises while driving (haven't killed anyone). Car computers may have faster reaction times but still can't bypass inertia.

      Waymo CEO John Krafcik says he is confident its cars would have performed differently

      Really? I'll believe him when I see him in this. [youtu.be] Strongly suggest you waste 30s viewing it -- I wish a lot MORE people would stand behind (or in front of) their work.
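      The inertia point above is easy to make concrete with a back-of-the-envelope sketch (all numbers here are illustrative assumptions, not figures from the Tempe investigation): even with near-instant reactions, a car still needs roughly v²/(2µg) of braking distance once the brakes bite.

```python
# Hypothetical stopping-distance comparison: human vs. computer reaction
# time. Speed, friction coefficient, and reaction times are assumed for
# illustration only; none of these are Uber/Waymo data.

G = 9.81   # gravity, m/s^2
MU = 0.7   # assumed tyre-road friction on dry asphalt

def stopping_distance(speed_kmh, reaction_s):
    """Reaction distance plus braking distance, in metres."""
    v = speed_kmh / 3.6                  # convert km/h to m/s
    reaction = v * reaction_s            # distance covered before braking starts
    braking = v ** 2 / (2 * MU * G)      # kinematics: v^2 = 2 * a * d
    return reaction + braking

# ~61 km/h (38 mph) at an assumed 1.5 s human vs. 0.1 s computer reaction
human = stopping_distance(61, 1.5)
computer = stopping_distance(61, 0.1)
print(f"human: {human:.1f} m, computer: {computer:.1f} m")
```

      The braking term (~21 m at 61 km/h under these assumptions) is identical for both; faster reactions only shrink the reaction term, which is the "can't bypass inertia" point.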

    • Except the video you cite was released by the police, not by Uber. Maybe Uber requested that it be released, but it seemed like the police were just trying to fill the information vacuum with what little information they had. Since the video is half-damning (the safety driver appeared to be distracted) and half-vindicating (the pedestrian was not easily visible), it's not really a great thing for Uber. The better PR came from initial reports, like that she "stepped out from behind a bush in the median" or s

    • Part of the problem is that by the time the information comes out it's not newsworthy anymore. It's a traffic accident with an experiment that was supposed to fail and rely on a safety driver. That's worth a news article on the day it happens, but it's not worth mainstream news updating with details as they develop. Makes it easy for Uber's spin to be the only thing most people see.

  • So the guy is supposed to say he has cars on the road that would murder someone?
  • Will suicide-by-self-driving-car deaths exceed suicide-by-cop deaths?

  • Just step in front of one of your cars.
    • This is exactly what I was thinking. So take the very top-ranking exec at a self-driving car company and have them walk in front of their car. Put your life where your mouth is!

      • So take the very top-ranking exec at a self-driving car company and have them walk in front of their car.

        I wouldn't be surprised if they actually did this as part of an advertising gag:

        "Our CEO is so confident of our car's safety that he will walk out in front of a speeding car!"

        Of course, the car will have been customized, with special "high level exec" detection sensors . . .

      • by mrvan ( 973822 )

        This is actually exactly what Elisha Otis, founder of the Otis elevator company, did: he demonstrated his safety elevator by standing on a raised platform and having the rope severed, showing that the safety system would stop the fall safely [https://www.britannica.com/technology/elevator-vertical-transport#ref90006]

  • ... is to understand why, exactly, the car's sensors did not respond to the pedestrian as they should have.

    Figure that out, program that into the car's repertoire of situations to handle appropriately, and at the very least you'll have made future cars that much safer.

    If software was supposed to do X, and didn't do X, then the designers need to find out what is wrong with their assumptions about what the software is doing, and come to a resolution, so that software can behave as intended by it

    • Since the data will be in Uber's hands before it will be in the hands of the authorities, I'm not at all confident that the real story will ever come out. They will fabricate it the way that gets them off the easiest.
    • by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Sunday March 25, 2018 @02:43PM (#56324203)

      Uber cars can't manage 13 miles between interventions. Whoever is doing the debugging must be buried under tens of thousands of reports.

      THAT is why they are to blame. An automatic car can mess up. It will happen once in a blue moon, and someone will die. Too bad, but that is the price of progress and you cannot really blame anyone, you can just compensate the family.

      Sending cars out on the road that demonstrably cannot function is different. That is reckless manslaughter. There is no way that Dara Khosrowshahi was unaware of the (lack of) performance of the Uber cars. He needs to be prosecuted.
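      The "buried under tens of thousands of reports" point follows from simple arithmetic (the 13-miles-per-intervention figure is the one cited in the thread; fleet size and daily mileage below are assumed purely for illustration):

```python
# Hypothetical intervention-volume arithmetic. The miles-per-intervention
# rate is the figure quoted in this thread; the fleet size and per-car
# mileage are made-up examples, not Uber's actual numbers.

MILES_PER_INTERVENTION = 13

def interventions_per_day(fleet_size, miles_per_car_per_day):
    """Expected safety-driver interventions generated by the fleet daily."""
    daily_miles = fleet_size * miles_per_car_per_day
    return daily_miles / MILES_PER_INTERVENTION

# e.g. an assumed 100-car fleet each driving 100 miles a day
print(round(interventions_per_day(100, 100)))
```

      Under those assumptions that's on the order of 770 intervention reports a day, every day, which is why a 13-mile disengagement rate is hard to square with a credible debugging process.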

  • "This ship is unsinkable!"

    History says NO ONE should make statements like this.

    • by amorsen ( 7485 )

      There is a difference between saying "This ship is unsinkable" and "This ship would not have been sunk by that particular iceberg."

  • I kind of thought this would be the end result of only one company seriously doing any work on self-driving cars. What we have here is a typical capitalist worst case scenario: Every company doing their own thing and starting from square one because each is unwilling to license IP from the other, and each is unwilling to let the fruits of their labour be used for the general good.

    Waymo could have avoided the accident? As far as I'm concerned they are culpable.

    • If you can't make a self driving car that drives safely, the answer is not to make a self driving car. "We couldn't make safe cars on our own, so we just put dangerous ones on the road" is not an excuse. This is totally Uber's fault. Waymo started working on this long before Uber did, and they've been a lot more cautious about deploying it because they don't want to kill anyone.

      • If you can't make a self driving car that drives safely, the answer is not to make a self driving car.

        You just completely missed the point in a most spectacular fashion.

        • by b0bby ( 201198 )

          I think the point is that it's entirely possible for anyone to come out with open source self driving car software. No one has. Google has dumped billions of dollars into testing and developing these things. If they didn't have the assurance that they could use the stuff they developed as they wished, they wouldn't have done it in the first place.

          Will there be some sort of framework developed over time where edge cases can be shared between platforms? Probably. But we are a long way off from that.

          • And that was my point. Yeah, Google is only doing what's best for its shareholders, but that doesn't mean this problem couldn't potentially have been avoided if there wasn't an IP spat between two money-hungry corporations.

            The industry can't continue this way. If it wasn't for the gifting of IP to save lives, we wouldn't have seat-belts, famously developed by Volvo and then purposely patented for the sole purpose of publicly opening the patent, ensuring no single company can lock down innovation in s

            • Google doesn't have ethics anymore so they are just acting as expected. I'm not sure what the solution is though. Nationalizing IP through executive orders would have a chilling effect on innovation.
          • On the other hand, research can be really expensive, as you've pointed out. To make this worthwhile, they have to protect their ideas and implementations as best they can. The alternative is large government grants to develop self-driving cars, and that's less efficient than the private sector. Government research grants are a good idea if the private sector isn't going to do the research, but not otherwise.

        • Then perhaps you'd care to clarify. What is your point? That Waymo is evil because they're investing billions of dollars into developing technology and not giving it away for free? That capitalism is evil because it leads to companies investing in technology and not giving it away for free? Clearly not that Uber is evil for choosing to develop their own technology rather than licensing another company's superior technology, though that seems to me a more obvious conclusion. Of course, an even more obvi

  • Well, that's an easy enough challenge to reproduce, so here ya go, put up or shut up: set up a dummy on the very same road at the very same time and see what happens.
  • We can now do something to prevent it from happening again... with human drivers that's not really possible.
  • by Anonymous Coward

    All he has to do is step out in front of traffic consisting of his own cars in the same conditions. I think the typical statistical standard is about 30 samples, right?

    • by novakyu ( 636495 )

      If I were in his position, and if somebody offered me big enough of an incentive (perhaps a massive fine on Uber), I would. Seriously, the conditions were such that any driver who wasn't DUI'ing wouldn't have hit the pedestrian, much less the "superior" autonomous car that can see in wavelengths we can't see.

      Uber really messed up here, and Waymo is doing a good (corporate) job kicking Uber while it's down.

  • by Ed Tice ( 3732157 ) on Monday March 26, 2018 @05:34AM (#56326877)
    The Uber cars have a failure every 13 miles. Normally when we talk about self-driving cars on /., we point out that a safety driver isn't very useful because they just can't avoid the boredom. If you look at pools, most lifeguards only work 30-45 minutes without a break for just this reason. The Waymo drivers can probably barely stay awake. But the Uber cars are rolling sarcophagi with a failure every 13 miles. If I were the "safety" driver on one of those death traps, I'd be white-knuckling the steering wheel with my eyes glued to the road.
  • by foxalopex ( 522681 ) on Monday March 26, 2018 @10:53AM (#56328199)

    Google's self-driving car technology has been around longer and has probably done far more miles than Uber's tech ever has, and we have yet to hear of it running over a pedestrian. Even Tesla's Autopilot technology, despite having missed trucks and killed a driver, hasn't run over pedestrians yet. Plus, the statistics that the makers are required to provide show that the Uber self-driving tech has an alarmingly large number of required driver interventions. Heck, it was speeding to begin with! Plus we know Uber's in a rush to get this tech to work because they hope to IPO in about a year, so at the end of the day I can't say I'm surprised they'd be the first to kill a pedestrian.

    • People don't generally activate Tesla's automation on city streets with traffic control. Street driving is much trickier than highway for automated vehicles to handle.
