Tesla Model S In Fatal Autopilot Crash Was Going 74 MPH In a 65 Zone, NTSB Says (latimes.com) 623

An anonymous reader quotes a report from the Los Angeles Times: The Tesla car involved in a fatal crash in Florida this spring was in Autopilot mode and going about 10 mph faster than the speed limit, according to safety regulators, who also released a picture of the mangled vehicle. Earlier reports had stated the Tesla Model S struck a big rig while traveling on a divided highway in central Florida, and speculated that the Tesla Autopilot system had failed to intervene in time to prevent the collision. The National Transportation Safety Board released a preliminary report Tuesday that confirms some details of the May 7 collision, along with a photo that shows the car with its windshield flattened and most of its roof sheared off. The federal agency also included a photo of the big rig, circling an area on the right side of the tractor-trailer that showed the light damage the truck received from the collision. The 2015 Model S was moving at 74 mph, above the posted 65 mph speed limit, when it struck a 53-foot trailer being pulled by a Freightliner Cascadia truck. Tesla's semi-autonomous Autopilot driving feature was engaged, the report says.
  • 74 at time of crash (Score:3, Interesting)

    by Chmarr ( 18662 ) on Tuesday July 26, 2016 @09:16PM (#52586865)

    So... it was going 74 mph at the time of the crash... was this after any kind of braking? What was the speed before any braking was applied?

    (I'm going to take a guess it was a LOT over 74mph)

    • Rumor down here is that no attempt was made to decelerate the Tesla.

      • by laird ( 2705 )

        They released a statement right after the accident saying that there was no indication that either AutoPilot or the driver tried to decelerate. That's consistent with an under-run accident where they didn't see the white trailer against a bright sky. Sadly, trucks are harder to see than you'd expect, so these accidents aren't rare.

    • It makes more sense if they are referring to the cruising speed before any braking was applied.
    • by Dan East ( 318230 ) on Tuesday July 26, 2016 @09:25PM (#52586913) Journal

      Apparently the brakes were not applied. They believe it was a combination of the trailer being a solid light gray color that tended to blend in visually with the sky, coupled with the radar being designed to ignore large flat signs that cross above the road. So the trailer was never flagged as a hazard and was ignored by the software.
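
      A minimal sketch of how such a filter could fail, in Python; the names and the clearance threshold here are invented for illustration, not anything from Tesla's actual code:

        OVERHEAD_CLEARANCE_M = 4.0  # assumed: anything starting this high is a sign/overpass

        def is_overhead_structure(bottom_edge_m, crosses_lane):
            """Hypothetical filter: a flat object spanning the lane whose bottom
            edge sits above the clearance threshold is ignored, like a sign."""
            return crosses_lane and bottom_edge_m >= OVERHEAD_CLEARANCE_M

        # With poor vertical resolution the bottom edge is only an estimate; if
        # the flat side of a crossing trailer is estimated too high, it drops
        # out of the hazard list exactly the way an overpass would.
        detections = [
            {"name": "overpass", "bottom_edge_m": 5.2, "crosses_lane": True},
            {"name": "trailer", "bottom_edge_m": 4.5, "crosses_lane": True},  # bad estimate
        ]
        hazards = [d for d in detections
                   if not is_overhead_structure(d["bottom_edge_m"], d["crosses_lane"])]
        print([d["name"] for d in hazards])  # -> []: the trailer never becomes a hazard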

      • by OFnow ( 1098151 )
        In addition, the trailer was said to have a completely empty underside: none of the solid side skirts installed on some trucks to improve fuel efficiency. So the road under the truck looked clear (sort of).
        • by cheater512 ( 783349 ) <nick@nickstallman.net> on Tuesday July 26, 2016 @10:19PM (#52587197) Homepage

          More than 'sort of'. Look at the photo of the car after the crash.
          If you ignore the roof, it's almost entirely intact, with only minor damage!

          Very easy to see how the car thought the path was clear - it technically was, up to about a meter (roughly three feet) above the road.

          • by gweihir ( 88907 )

            Indeed. The whole thing was a freak accident that will not repeat once the software has been adjusted. It took an incompetent driver (incompetent in two regards) and very special circumstances for it to happen in the first place. Criticizing Tesla for this is stupid, and I suspect its competitors are very much behind this campaign.

            • by Rei ( 128717 ) on Wednesday July 27, 2016 @07:12AM (#52588567) Homepage

              I'm not so sure a simple software change can fix it. Some key notes from the report:

              About 4:40 p.m. eastern daylight time on Saturday, May 7, 2016, a 2015 Tesla Model S, traveling eastbound on US Highway 27A (US-27A), west of Williston, Florida, struck and passed beneath a 2014 Freightliner Cascadia truck-tractor in combination with a 53-foot semitrailer. At the time of the collision, the combination vehicle was making a left turn from westbound US-27A across the two eastbound travel lanes onto NE 140th Court, a local paved road. As a result of the initial impact, the battery disengaged from the electric motors powering the car. After exiting from underneath the semitrailer, the car coasted at a shallow angle off the right side of the roadway, traveled approximately 297 feet, and then collided with a utility pole. The car broke the pole and traveled an additional 50 feet, during which it rotated counterclockwise and came to rest perpendicular to the highway in the front yard of a private residence. The 40-year-old male driver and sole occupant of the Tesla died as a result of the crash.

              US-27A is a four-lane highway with a posted speed limit of 65 mph. A 75-foot-wide median separates the two eastbound lanes from the two westbound lanes. Additionally, at the uncontrolled intersection with NE 140th Court, both eastbound and westbound lanes incorporate left turn lanes, allowing for a median opening of about 132 feet. At the time of the crash, it was daylight with clear and dry weather conditions.

              Eastbound. Afternoon. May. In other words, the sun was right behind him. Clear and bright outside. This is a perfect recipe for light-colored objects ahead to be overexposed, against other overexposed objects, potentially including the road and the sky. If you have a big block of RGB(255,255,255), how do you determine the boundaries? The best you can do is recognize the situation as a threat, warn the driver, and disable Autopilot.

              A more appropriate solution, if this was indeed the case, would be a hardware fix: read the *raw* data from the camera. A potential alternative, if the frame exposure time can be adjusted, would be to read out alternating short and long exposure frames and combine them.
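
              A minimal sketch of that alternating-exposure idea; the exposure ratio and all the names here are assumptions for illustration, not anything from Tesla:

                import numpy as np

                EXPOSURE_RATIO = 4.0  # assumed long/short exposure-time ratio

                def fuse_exposures(short_frame, long_frame, saturation=250):
                    """Keep the long exposure for detail, but fall back to the
                    rescaled short exposure wherever the long frame is blown out."""
                    short = short_frame.astype(np.float32) * EXPOSURE_RATIO
                    long_ = long_frame.astype(np.float32)
                    return np.where(long_ >= saturation, short, long_)

                # In the blown-out region the long frame is a featureless block of
                # 255s, but trailer and sky saturate at different rates in the short
                # frame, so the fused image keeps an edge the vision stack can find.
                long_exp = np.array([[255, 255], [255, 255]], np.uint8)  # trailer|sky: identical
                short_exp = np.array([[40, 200], [40, 200]], np.uint8)   # trailer|sky: distinct
                print(fuse_exposures(short_exp, long_exp))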

              • by gweihir ( 88907 )

                The system did recognize the side of the trailer; it just thought it was a traffic sign higher up. Hence there is data to work with.

        • by Anonymous Coward on Tuesday July 26, 2016 @10:22PM (#52587211)

          I think obstacles that are empty below 3 ft confuse the car.

          http://bgr.com/2016/05/11/tesla-model-s-summon-crash/

          I have also heard of the car running into half-open garage doors.

          • by Rei ( 128717 )

            I've "heard of" the car doing a lot of things, the majority of which were shown not to have been accurate. No greater excuse has ever been made in the automotive world for wrecking your car than "the car went and wrecked itself!".

        • by Jeremi ( 14640 ) on Wednesday July 27, 2016 @12:12AM (#52587589) Homepage

          So the road under the truck looked clear (sort of)

          If only the autopilot system had been calibrated to take into account the exact height of the Tesla's roof. If that had been done, then there would have been no accident in this case (the Tesla would have stopped until the truck was out of the way), but when encountering a somewhat higher truck, the Tesla would pass cleanly underneath it, with the driver probably never even noticing what had happened. And that would have been rather awesome.
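
          The check itself is trivial; here is a sketch, assuming (contrary to the reply below) a sensor that could actually measure the gap. The heights are rough guesses for illustration, not Tesla specs:

            ROOF_HEIGHT_M = 1.45    # roughly a Model S roof height (approximate)
            SAFETY_MARGIN_M = 0.50  # arbitrary buffer for load, suspension, sensor error

            def can_pass_under(gap_height_m):
                """Pass under only if the measured opening clears roof plus margin."""
                return gap_height_m >= ROOF_HEIGHT_M + SAFETY_MARGIN_M

            print(can_pass_under(1.2))  # False: brake for a typical trailer underside
            print(can_pass_under(4.8))  # True: carry on under a highway overpass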

          • by SmilingBoy ( 686281 ) on Wednesday July 27, 2016 @04:25AM (#52588097)
            How would the car have been able to do this? The radar used does not have any vertical resolution; you only get a certain proportion of the radar signal returned, similar to what you would get from an overhead sign. The camera would have been able to see the size of the gap, but it did not detect the truck either, as it was the same colour as the sky.
            • by Jeremi ( 14640 )

              How would the car have been able to do this?

              I'm sure that's the question that Tesla's engineers are asking each other right now. Since I'm not a Tesla engineer, I don't have the answer -- maybe it will require more hardware. I do know that a product that doesn't reliably take the location of its owner's head into account when doing its collision-prediction calculations is a product that is going to have trouble in the marketplace.

      • by OFnow ( 1098151 ) on Tuesday July 26, 2016 @09:31PM (#52586945)
        Forgot to mention: the car's instructions say AutoPilot is not to be used where there are crossroads. On the Florida road in question there were crossroads.
        • by vux984 ( 928602 )

          No wonder autopilot is safer than human drivers per mile driven.

          Humans drive everywhere in all weather in all circumstances... autopilot only drives on uninterrupted stretches of highway, in clear weather... and it still demands the human sit there with his hands on the wheel as a backup.

        • by gweihir ( 88907 )

          And so we have a driver who violated three (!) safety measures. It is no surprise he got killed. Technology can only do so much to compensate for stupid.

        • by Megane ( 129182 ) on Wednesday July 27, 2016 @07:26AM (#52588639)

          ...yes, because of the possibility that someone may cross in front of you when they don't have right of way, like this truck. It was a typical rural US highway at-grade intersection, and the truck was turning left onto a side road. The road was long and straight, and as I remember from looking at it on Street View it wasn't hilly either, and it was daylight, so the truck driver should have had a good view of the oncoming car.

          I don't know why the truck driver's part is ignored so much. Well, I know, really, it's because that's so boring that it's not news. If the oncoming car hadn't been a Tesla, but had instead been an ordinary tired driver, none of this would have gone beyond local news.

      • by Anonymous Coward on Wednesday July 27, 2016 @12:07AM (#52587569)

        Apparently the brakes were not applied. They believe it was a combination of the trailer being a solid light gray color that tended to blend in visually with the sky, coupled with the radar being designed to ignore large flat signs that cross above the road. So the trailer was never flagged as a hazard and was ignored by the software.

        If the trailer had adhered to European safety regulations, it would at least have had side rails underneath to prevent cars from getting stuck under it. Not only would that have saved the car's driver, it would also have prevented the trailer from being detected as a sign.

      • > designed to ignore large flat signs that cross above the road.

        Yeah, maybe ones the car can fit under. But there are no signs 'above' the road that are only four feet above it; that's an object you need to avoid.

  • or is it "nine you're mine"?
  • I call BFD here (Score:5, Insightful)

    by Snotnose ( 212196 ) on Tuesday July 26, 2016 @09:18PM (#52586879)
    I typically drive 10 mph over the posted speed limit, both on freeways and on roads. IMHO, the posted speed limit is for either A) the driver with dementia who shouldn't be driving anyway, or B) some government that needs the speeding fines to balance their budget.

    Go to Los Angeles and there are some freeway offramps marked 25 MPH and, goddammit, they farking mean it, oh holy shit will I make it. But as time goes on those ramps get rebuilt into better intersections, and the once-honest speed limit stays the same.

    Freeway speed limits should be 80. Non-freeway limits should be a good 10 MPH over what they are already.

    / my comment doesn't count for the road in front of my house
    // please don't run over my cat
    • 10 over has pretty much become the accepted tolerance limit for most police and highway patrol. It's rare to get a ticket for single digits over.
      • Yeah but that doesn't mean you aren't speeding. That means that the police lack the technology to really get you within that range. Now Tesla admits it in logs and it is probably to the thousands of a mile per hour. It is still illegal even if it is 'normal'.
        • thousandths.
        • Yeah but that doesn't mean you aren't speeding. That means that the police lack the technology to really get you within that range. Now Tesla admits it in logs and it is probably to the thousands of a mile per hour. It is still illegal even if it is 'normal'.

          Are you kidding? Cops have had technology to accurately measure your speed to tenths of a mile per hour for quite some time. As for what constitutes speeding, we all know what is technically legal. What speed is accepted in general by drivers and law enforcement was the point.

    • Re:I call BFD here (Score:5, Insightful)

      by Solandri ( 704621 ) on Tuesday July 26, 2016 @11:36PM (#52587453)
      Normally I'd agree. But 74 mph is too fast for any road which allows cross-traffic (truck was on opposite side and made a left turn through the Tesla's path). I'm actually surprised it was even marked as a 65 mph zone when it has an uncontrolled intersection (not even a stop sign in the left turn lane).

      From an aircraft investigation standpoint (every accident has multiple contributing causes), I'd actually put most of the blame on the truck driver. If you look at the pic of the intersection, there is absolutely no way he didn't see the Tesla coming. He simply got impatient and made the turn, gambling that he could force the Tesla driver to slow down to avoid him (which didn't happen because the driver was inattentive with Autopilot on).

      That's not to excuse the Tesla driver. A big part of road safety is that both drivers are trying to avoid an accident. When one driver abandons that philosophy, the chances of an accident instantly double. When both drivers abandon that philosophy, you pretty much guarantee there will be an accident. While the truck driver made a one-time mistake, a Tesla driver who relies too much on Autopilot is making a continuous mistake. There will be a high chance of an accident any time he (or rather the car) drives past another inattentive or reckless driver.
    • by psy ( 88244 ) on Wednesday July 27, 2016 @12:58AM (#52587697)

      I typically drive 10 mph over the posted speed limit, both on freeways and on roads. IMHO, the posted speed limit is for either A) the driver with dementia who shouldn't be driving anyway, or B) some government that needs the speeding fines to balance their budget.

      Go to Los Angeles and there are some freeway offramps marked 25 MPH and, goddammit, they farking mean it, oh holy shit will I make it. But as time goes on those ramps get rebuilt into better intersections, and the once-honest speed limit stays the same.

      Freeway speed limits should be 80. Non-freeway limits should be a good 10 MPH over what they are already.

      Lucky you're not in Australia. I have been booked (via hidden camera) for doing 64 km/h in a 60 km/h zone (39.8 mph in a 37.3 mph zone).

      Police generally will pull you over if you're doing 10km/h over the limit (6.2 mph) as the fine doubles at that point.

      15km/h over (9.3mph) triples the fine.

      And I'm not just talking about police on traffic duty - any police car will pull you over if you're speeding.

      If you get caught doing 25km/h over (15.5mph) that's an immediate loss of license.

      Our highway / freeway limits (apart from some isolated stretches on interstate highways) are all 100km/h (62mph).
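
      A quick sanity check of those km/h-to-mph conversions, in Python:

        KMH_TO_MPH = 0.621371

        for kmh in (10, 15, 25, 60, 64, 100):
            print(f"{kmh} km/h = {kmh * KMH_TO_MPH:.1f} mph")
        # 10 -> 6.2, 15 -> 9.3, 25 -> 15.5, 60 -> 37.3, 64 -> 39.8, 100 -> 62.1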

      • To be fair, Australian drivers are horrible, and I say this as an Australian who learnt to drive in Australia. Not only is road behaviour bad, but the stupid "Every K Over is a Killer" marketing campaign has trained an entire country that it is more important to look at your dashboard than at the road in front of you.

        Cracking down on speeding is a good thing. Doing it Australian style is definitely not.

  • The estate of the driver, or Elon Musk

  • >"The Tesla car involved in a fatal crash in Florida this spring was in Autopilot mode and going about 10 miles faster than the speed limit," "Was Going 74 MPH In a 65 Zone,"

    Um, so what? That is about normal. Is this supposed to be shocking or something?

  • I remember reading something from Tesla saying they found autopilot was not on, and had it been it would have stopped the car.

    • by Areyoukiddingme ( 1289470 ) on Tuesday July 26, 2016 @09:34PM (#52586967)

      I remember reading something from Tesla saying they found autopilot was not on, and had it been it would have stopped the car.

      Different incident. There have been three in recent weeks. This is the fatality, where autopilot was on, didn't detect the truck, and the jackass was watching a Harry Potter DVD in the driver's seat. Hopefully he didn't have children, for their sake and for ours, so no one has lost their father and we get a Darwin Award nominee.

    • by starless ( 60879 )

      I remember reading something from Tesla saying they found autopilot was not on, and had it been it would have stopped the car.

      That was the incident in Pennsylvania, not Florida.
      https://www.engadget.com/2016/... [engadget.com]

  • by frank249 ( 100528 ) on Tuesday July 26, 2016 @09:33PM (#52586957)

    Why does this one death cause everyone to panic?

    • Because people around here tend to believe in personal accountability. If the car is driving, do deaths become no one's fault?
    • by Anonymous Coward

      Because it was the first in an "autonomous" car and it was in a situation that was trivial for a human to avoid.

    • Because soon there will be tens of millions of these cars on the road.

      If there is a flaw in the system or the software, you now have tens of millions of malfunctioning missiles on the road.
    • by Jeremi ( 14640 ) on Wednesday July 27, 2016 @12:06AM (#52587567) Homepage

      Why does this one death cause everyone to panic?

      Who has panicked? Unless by "panic" you meant "engage in intense debate about the potential risks and rewards of a new and relatively unproven technology", but that's not a very common definition of that word.

  • by Stu Fuller ( 20764 ) on Tuesday July 26, 2016 @09:39PM (#52586989)

    So, did the truck turn in front of oncoming traffic? If so, why is this the Tesla's fault?

    • It's no one's fault, yet. There are no conclusions in the preliminary report.

    • by beanpoppa ( 1305757 ) on Tuesday July 26, 2016 @09:49PM (#52587049)
      Autopilot could see through the undercarriage of the semi. It's programmed to ignore obstacles it considers to be above the top of the car so that it doesn't stop at every overpass and road sign. Also, the trailer was gray, which matched the color of the sky at the time.
      • by Jeremi ( 14640 )

        It's programmed to ignore obstacles it considers over the top of car so that it doesn't stop at every overpass and road sign.

        That's good, but this case demonstrates how important it is that it makes that determination correctly.

  • by Smiddi ( 1241326 ) on Tuesday July 26, 2016 @09:48PM (#52587041)
    The machines are already trying to kill us all - *runs away screaming*
  • by Okian Warrior ( 537106 ) on Tuesday July 26, 2016 @10:04PM (#52587135) Homepage Journal

    I think people are missing a rather big point here.

    The NTSB is investigating the accident, and will post a reasonably fair and accurate assessment of what happened.

    Tesla will make some changes to ensure that this type of accident is avoided in the future, and push them out in the next update.

    All Teslas will become safer because of the analysis. In effect, the collective software will have "learned" from a mistake and corrected it. This is not something that the driver in a fatal accident can do, nor can other, non-involved drivers.

    With enough data, enough mistakes and near-mistakes corrected, the software will quickly evolve to be safer than any human driver.

    From a machine-learning perspective, this has enormous benefits.

    • Re: (Score:3, Insightful)

      by ledow ( 319597 )

      You talk as if AI that "learns" is present here.

      There are no really good instances of that in the real world, certainly nothing predictable or verified to act in accordance with instructions.

      This is much more akin to a software bug fix. Someone will tweak a parameter, patch a flaw, add a condition, but it's still inherently the same software underneath it all. Software that is trying to look down a webcam and interpret the data as a 3D model, which it uses to try to drive.

      AI is NOWHERE NEAR this kind of capability.

      • by Overzeetop ( 214511 ) on Wednesday July 27, 2016 @06:47AM (#52588471) Journal

        You're bitching about semantics. Machine learning, AI, programming: no, this isn't some autonomous correction to the system; it isn't going to "learn" from this in the human sense. But the system (programmers, sensors, and control functions) will be improved to deal with this type of situation. There is no AI in the car; it's just programmed reactions. But in your zeal to blather on about what AI is and isn't, you're missing the point that the *system* will become more capable of handling out-of-normal and unanticipated conditions. In humans we call that intelligence.

  • by theshowmecanuck ( 703852 ) on Tuesday July 26, 2016 @11:23PM (#52587417) Journal
    Seriously, who cares if it was going over by 9 mph? How does that significantly impact anything (other than the car and the trailer)? This is a red herring being chummed right now. It is not a significant data point, or shouldn't be. They should just shut the fuck up until the report is complete.
  • by nicolaiplum ( 169077 ) on Wednesday July 27, 2016 @05:01AM (#52588167)

    Look at the way the trailer took the top of the car off while barely slowing it down. This shows how trailer under-run bars would have prevented this death. In Europe they are required, and we basically don't have this sort of side-collision decapitation horror.
