
 




Sorry Elon Musk, There's No Clear Evidence Autopilot Saves Lives (arstechnica.com)

Timothy B. Lee writes for Ars Technica: A few days after the Mountain View crash, Tesla published a blog post acknowledging that Autopilot was active at the time of the crash. But the company argued that the technology improved safety overall, pointing to a 2017 report by the National Highway Traffic Safety Administration (NHTSA). "Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40 percent," the company wrote. It was the second time Tesla had cited that study in the context of the Mountain View crash -- another blog post three days earlier had made the same point. Unfortunately, there are some big problems with that finding. Indeed, the flaws are so significant that NHTSA put out a remarkable statement this week distancing itself from its own finding.

"NHTSA's safety defect investigation of MY2014-2016 Tesla Model S and Model X did not assess the effectiveness of this technology," the agency said in an email to Ars on Wednesday afternoon. "NHTSA performed this cursory comparison of the rates before and after installation of the feature to determine whether models equipped with Autosteer were associated with higher crash rates, which could have indicated that further investigation was necessary." Tesla has also claimed that its cars have a crash rate 3.7 times lower than average, but as we'll see there's little reason to think that has anything to do with Autopilot. This week, we've talked to several automotive safety experts, and none has been able to point us to clear evidence that Autopilot's semi-autonomous features improve safety. And that's why news sites like ours haven't written stories "about how autonomous cars are really safe." Maybe that will prove true in the future, but right now the data just isn't there. Musk has promised to publish regular safety reports in the future -- perhaps those will give us the data needed to establish whether Autopilot actually improves safety.
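The statistical trap here is worth making concrete: a before/after crash-rate comparison is only meaningful if the exposure (miles driven) in each period is known, and the later analysis of the NHTSA data found much of the pre-Autosteer mileage was missing. A sketch with invented numbers (none of these figures come from the report):

```python
# Why a before/after crash-rate comparison needs exposure (mileage) data.
# All numbers below are invented for illustration, not taken from NHTSA.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Crash rate normalized by vehicle miles traveled."""
    return crashes / (miles / 1_000_000)

# With complete exposure data, the comparison is meaningful:
before = crashes_per_million_miles(crashes=130, miles=100_000_000)  # 1.3
after = crashes_per_million_miles(crashes=80, miles=100_000_000)    # 0.8
print(f"apparent reduction: {1 - after / before:.0%}")  # 38%

# But if much of the pre-Autosteer mileage is simply absent from the
# data set, the "before" rate is computed over too few miles, which
# inflates it and exaggerates the apparent benefit:
before_biased = crashes_per_million_miles(crashes=130, miles=60_000_000)
print(f"biased 'reduction': {1 - after / before_biased:.0%}")  # 63%
```

The direction of the bias depends on which miles are missing; the point is only that the headline percentage is meaningless without the denominator.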

UPDATE (2/16/19): The study's underlying data reveals serious flaws in the methodology that undermine its credibility, according to new analysis from a research and consulting firm.
  • Errors (Score:5, Informative)

    by LetterRip ( 30937 ) on Friday May 04, 2018 @08:56PM (#56557052)

    While Timothy normally writes excellent articles, his reasoning and logic were severely flawed this time.

    First off, the NHTSA report focused on Autosteer, not autobraking. Hence his attributing the reduction in accidents to autobraking is bizarre. What NHTSA was clarifying is that it had not examined the entirety of Autopilot (which includes Autosteer, autobraking, lane keeping, etc.). Timothy mistakenly thinks the agency was stating it hadn't verified the effectiveness of Autosteer installation in accident reduction (it didn't verify actual usage, but drivers with Autosteer installed use it for about 50% of their driving time).

    Secondly, Teslas prior to the Autosteer update already had autobraking, so the 40% reduction in accidents after enabling Autosteer can't be attributed to autobraking.

    • Re:Errors (Score:5, Informative)

      by Mr D from 63 ( 3395377 ) on Friday May 04, 2018 @09:12PM (#56557106)
      Anybody who read the NHTSA report should clearly understand that the Autopilot safety data comparison was not done to demonstrate the safety of Autopilot, but rather to decide if there was indication that AP caused an increase. Also, 2/3 of the cars in the study didn't have any pre-AP data at all. It was entirely useless for the purpose of making any kind of safety claim. The NHTSA should not have had to clarify, but too many idiots made stupid claims based on that information. The media in general can be really stupid with statistics.
      • by Eloking ( 877834 )

        Anybody who read the NHTSA report should clearly understand that the Autopilot safety data comparison was not done to demonstrate the safety of Autopilot, but rather to decide if there was indication that AP caused an increase. Also, 2/3 of the cars in the study didn't have any pre-AP data at all. It was entirely useless for the purpose of making any kind of safety claim. The NHTSA should not have had to clarify, but too many idiots made stupid claims based on that information. The media in general can be really stupid with statistics.

        One thing though,

        I thought those numbers were about Tesla Autopilot in *almost ideal conditions* vs. people in *all conditions*, no?

        • Anybody who read the NHTSA report should clearly understand that the Autopilot safety data comparison was not done to demonstrate the safety of Autopilot, but rather to decide if there was indication that AP caused an increase. Also, 2/3 of the cars in the study didn't have any pre-AP data at all. It was entirely useless for the purpose of making any kind of safety claim. The NHTSA should not have had to clarify, but too many idiots made stupid claims based on that information. The media in general can be really stupid with statistics.

          One thing though,

          I thought those numbers were about Tesla Autopilot in *almost ideal conditions* vs. people in *all conditions*, no?

          https://static.nhtsa.gov/odi/i... [nhtsa.gov]

    • "While Timothy normally does excellent articles, his reasoning and logic were severely flawed this time. First off the NHTSA report focused on autosteer, not autobraking. Hence his attributing the reduction in accidents to autobraking is bizarre."

      No. Not sure what you find bizarre. He simply mentioned that a large part of the reduction could be due to autobraking:

      Tesla shipped with Autopilot hardware starting in October 2014. Tesla activated automatic emergency braking and front collision in March 2015. Te

  • by Xenx ( 2211586 ) on Friday May 04, 2018 @08:58PM (#56557062)
    Unless the autopilot feature is actively instigating accidents, it's impossible for it not to be safer. Anything above and beyond relying solely on driver's response is an improvement, even if only minimally.
    • by Balial ( 39889 ) on Friday May 04, 2018 @09:04PM (#56557078) Homepage

      A Tesla will happily drive you into a stopped fire truck, or a turning semi trailer, or a freeway divider while you're driving at full speed. So yes, it's actively instigating accidents that humans are pretty good at avoiding. See google.

      I'm a huge fan of Tesla, but their autopilot scheme is a farce.

      • Re: (Score:3, Funny)

        by olsmeister ( 1488789 )

        happily drive

        What kind of messed up AI have they developed over there?

      • Re: (Score:3, Insightful)

        by Xenx ( 2211586 )
        Autopilot requires the driver to be attentive. The driver should be stopping the vehicle in those cases. The car didn't actively seek out hitting the fire truck, it just didn't detect it and stop. Absolute worst case, autopilot never detects anything ever and we're no worse off than just having a driver behind the wheel. As soon as the autopilot detects and prevents any potential accident, we're now safer with than without.
          • So Tesla advertises and advertises so people will trust the vehicle. And then they buy it and use it. Then the point is NOT to trust it or you may stop too late to avoid an accident. How Tesla pulled that off with no legal liability in the matter totally blows my mind.
          • by Xenx ( 2211586 )
            It's designed to operate in conjunction with the driver, at this point in time. The old adage, the whole greater than the sum of its parts, applies. Between you and autopilot, fewer mistakes are bound to happen.
            • But why would the driver stop if he/she bought the car so that the car does it for them?
              • by Xenx ( 2211586 )
                Well, if they bought the car they know they should... It's up to them if they want to cause an accident or not. You can't stop people from choosing to be dumb.
              • But why would the driver stop if he/she bought the car so that the car does it for them?

                Why wouldn't you eat your own feces?

                Answer to both: Because that would be stupid.

        • by cascadingstylesheet ( 140919 ) on Friday May 04, 2018 @09:38PM (#56557198) Journal

          and we're no worse off than just having a driver behind the wheel.

          That's demonstrably untrue. Humans are terrible at partial attention. Partial automation (like Tesla's misnamed Autopilot) lulls humans into a state of inattentiveness.
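          This objection can be put in rough numbers. A toy model with invented probabilities (nothing below is measured Tesla or NHTSA data): treat a crash as requiring both the automation and the human to miss a hazard.

```python
# Toy model of supervised automation. All probabilities are invented
# for illustration; nothing here is measured data.

def crash_prob(p_system_miss: float, p_human_miss: float) -> float:
    """Assume a crash happens only if BOTH the system and the human
    fail to react, and that the two failures are independent."""
    return p_system_miss * p_human_miss

# No automation: the human alone must catch every hazard.
baseline = crash_prob(p_system_miss=1.0, p_human_miss=0.01)

# Automation catches 90% of hazards, but the lulled human now misses
# 20x as often. The net effect comes out WORSE in this scenario:
with_autopilot = crash_prob(p_system_miss=0.10, p_human_miss=0.20)

print(f"{baseline:.3f} vs {with_autopilot:.3f}")  # 0.010 vs 0.020
```

          Whether the automation helps hinges entirely on how far it degrades human vigilance, which is exactly the empirical question the article says is unanswered.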

          • by Xenx ( 2211586 )
            It's not misnamed, regardless of what you want to think. It's very aptly named. As in, it more or less does what the feature it's named after does for airplanes. Further, the feature is explained to people when they buy the cars. So, they know (in as much as anyone knows something they're told) the limitations of the feature.
            • more or less does what the feature it's named after does for airplanes

              But without the extensive training given to aircrew who operate autopilots. In most cases the autopilot is operated by a person whose job it is to fly the aircraft properly.

              • by Xenx ( 2211586 )
                It's the driver's job to drive their car properly as well. It also requires them to be trained and licensed, though the hurdle isn't nearly as tall.
                • Hardly, do Tesla require any training at all to use their autopilot?

                  • by Xenx ( 2211586 )
                    .... to be able to drive a car.
                    • by Anonymous Coward

                      Just stop. You're well beyond idiot fan boy now.

                      If drivers have trouble staying alive because auto pilot does weird and unexpected shit it is Tesla's fault, not the drivers.

                      The responsibility falls on the seller and creator of this cutesy and horribly misnamed broken technology.

                • In some ways, driving with Autopilot (and it will get worse as these systems become more advanced) is like supervising a learner driver. Most of the time (in the limited scenarios it can be fully engaged) it will act like a normal fully-qualified driver but every so often it will fail to react properly or just do something really erratic. If you're the human in charge of the vehicle in those situations, then the role you are fulfilling is more akin to that of a driving instructor than to that of a regular d

            • It's not misnamed, regardless of what you want to think. It's very aptly named. As in, it more or less does what the feature it's named after does for airplanes.

              You think airplane autopilots need the pilot to take control in a matter of seconds to avoid a crash?

            • by MrL0G1C ( 867445 )

              Do Tesla's vehicles fly? No? Then it's misnamed.

              • by Xenx ( 2211586 )
                So, your point is that you're a pedant?
                • by MrL0G1C ( 867445 )

                  It doesn't matter what auto-pilot means by some exact definition. What really matters is what people perceive it to mean and auto-pilot does not equal lane assist.

                  Auto to many people is short for automatic, and when they hear 'auto-pilot' they think 'automatic driving'. It doesn't matter whether your definition is the more correct one, what matters is that people think the car can drive itself when it clearly can't. Many people will assume that Tesla want them to be alert at all times just because they are

                  • by Xenx ( 2211586 )
                    It is automatic, within the capabilities of the system. So, still a valid name. Further, the name itself does not matter at all. What matters is that the owners are told what the system is capable of when buying, so it could be called anything at all and they should be aware of the limitations.
                    • by MrL0G1C ( 867445 )

                      Can it fly?

                    • by Xenx ( 2211586 )
                      So, we're back to you being a pedant asshole... ok...
                    • by MrL0G1C ( 867445 )

                      No, it's not a plane, it can't fly so auto-pilot is not the right name for it and I also stated why it is such a bad name. Nothing pedant asshole about that.

                    • by Xenx ( 2211586 )
                      Autopilot systems aren't exclusive to aircraft... so yeah.. you're being fucking pedantic.
                    • by MrL0G1C ( 867445 )

                      I really don't care, I fully explained why calling it autopilot is a bad idea, I haven't heard any argument debating the points I made.

                    • by Xenx ( 2211586 )
                      Except I did, and you never bothered to argue against it. You just turned to fighting over the fact that the car isn't a fucking plane. Ultimately it still comes down to my point that the name is perfectly valid for the functions it performs. It does not cause an unrealistic expectation of the feature set. The owners are also told how the feature operates, so there would be no confusion over the name vs the functionality. So, again.. You're arguing over the semantics of the name.. because you have no other va
            • by Rob Y. ( 110975 )

              Airplanes don't have to deal with a constant stream of nearby obstacles. So, while it may be appropriate to have an autopilot system in which the pilot needs to remain attentive - their attention is rarely required, and they will almost always be given ample time to get back into the game and avoid a disaster. That is simply not true in a car. So, yeah, it's misleadingly named.

              And regardless, the problem with any self-driving technology is that legal liability only comes into effect when it fails. And t

          • #1 reason why I own a manual transmission and never use cruise control. I don't trust myself. It may be decidedly un-American to have a compact car with few gadgets and gizmos, but I always know how fast I'm going in my car with a manual transmission. When I drive automatics, I speed, and when I use cruise control, I zone out.

            Emergency response features sound useful, but I don't want a car with half-assed automation. If the car is supposed to drive itself, either do it completely and properly, or not a

        • Autopilot requires the driver to be attentive.

          Humans are far worse at being attentive to passive tasks than they are at driving.

          • by Xenx ( 2211586 )
            Driving is supposed to be an active task, even with autopilot. And yes, I'm fully aware not everyone will see it that way. It doesn't make it any less true. The same concerns could be made for brake assist and cruise control. In the end, good drivers will be good drivers with or without autopilot. The same will be true of bad drivers.
            • Driving is supposed to be an active task, even with autopilot.

              People are also supposed to avoid having traffic accidents.

              • by Xenx ( 2211586 )
                Are you trying to counter my point with a somewhat related, but contextually irrelevant statement? I said people should be actively involved in the driving process, and you said they should also avoid having accidents. I can only assume you mean people don't avoid having accidents, and so they will not be an active participant in the driving process with autopilot. However, most drivers do in fact avoid having accidents most of the time. I would also venture that most people that use autopilot are also comp
                • Are you trying to counter my point with a somewhat related, but contextually irrelevant statement? I said people should be actively involved in the driving process, and you said they should also avoid having accidents. I can only assume you mean people don't avoid having accidents, and so they will not be an active participant in the driving process with autopilot. However, most drivers do in fact avoid having accidents most of the time. I would also venture that most people that use autopilot are also competent drivers.

                  I think it's a very relevant statement.

                  Your argument relies on the assumption that people will be just as attentive while supervising the auto-pilot as they are while performing unassisted driving, but that's a false assumption.

                  Asking someone to pay attention while the auto-pilot is driving is a very difficult task, a far more difficult task than driving. If the auto-pilot makes a mistake there is a strong possibility the human will not be paying close enough attention to catch it. Telling them they're supp

                  • by Xenx ( 2211586 )
                    You would think wrong. You made a statement. One that is only related to mine in regards to automobiles and accidents. You don't make any point with it at all, you just made the statement. Further, I say that drivers should be attentive. That is a fact regardless of what features the vehicle has. An inattentive driver with autopilot is safer than an inattentive driver without autopilot.
                    • You would think wrong. You made a statement. One that is only related to mine in regards to automobiles and accidents. You don't make any point with it at all, you just made the statement. Further, I say that drivers should be attentive. That is a fact regardless of what features the vehicle has. An inattentive driver with autopilot is safer than an inattentive driver without autopilot.

                      You keep dodging the point I'm making.

                      A driver is more likely to be inattentive with an autopilot.

                      And that was the point of my statement, that saying the driver is supposed to be attentive is as useless as saying the driver is supposed to not crash. In both cases human error is a prerequisite of a crash, the question is how circumstances change the likelihood of those errors.

                    • by Xenx ( 2211586 )
                      You're starting from a faulty assumption yourself. That a driver that is inattentive with autopilot, regardless of whether it's more likely or not, would be a safe driver otherwise.
                    • You're starting from a faulty assumption yourself. That a driver that is inattentive with autopilot, regardless of whether it's more likely or not, would be a safe driver otherwise.

                      What? I never said nor implied that. All I'm saying is that the auto-pilot leads a driver to be less attentive. I never said that the auto-pilot was necessarily less safe; in fact, I explicitly said:
                      Now, just because the human with the autopilot is less attentive and able to respond to emergencies doesn't necessarily mean they have more accidents. The autopilot could be good enough even with the inattentive human it's still safer, but that's far from an obvious conclusion.

                      I'm just baffled by your resistance

        • It's human nature to be less attentive when automation is doing part of the job for you.

        • Autopilot requires the driver to be attentive.

          Unless you've been stuck in a vacuum your entire life, you'd know that simply specifying something as a requirement does not mean that the requirement is any good.

          Monotony and tedium lead to less attentiveness. We've known this for literally decades from studies of factory workers. Turning on the autopilot can reduce the driver's attentiveness. Blaming the resulting inattentiveness on the driver is stupid.

          • by Xenx ( 2211586 )
            No, blaming the driver for not paying attention to the fucking road is common fucking sense.
            • No, blaming the driver for not paying attention to the fucking road is common fucking sense.

              I really hate it when I accidentally respond to the anti-science crowd. Go read up on the hundreds of studies and trials determining the correlation between engagement and inattentiveness.

              • by Xenx ( 2211586 )
                You're not talking to any kind of anti-science crowd. I'm not denying, anywhere, that inattentiveness is a concern. The point you don't get is that the problem exists for most any car nowadays... Cruise control has been a thing for decades. Pulling your foot away from the gas/brake because of cruise control can also lead to slow reaction times. This just furthers the automation, but the overall driver engagement isn't drastically reduced... unless they choose not to pay attention. A good driver would keep the
        • The AP technology seems basically fine. It'll surely help rational drivers drive safely. The problem, to the extent there is one, would seem to be Tesla's marketing which seems loath to acknowledge that AP is just a collection of simple tools that (usually) make driving a bit safer and are neither intended to, nor capable of, driving the car safely by themselves.

          I suppose that if you are trying to sell an odd, expensive, vehicle to people with more money than sense, you are likely to find that some customer

          • by Xenx ( 2211586 )
            Their own materials state it's not a complete autonomous solution at this time. It just has the hardware to be fully autonomous when (if) the software is ready.
        • by Cederic ( 9623 )

          Absolute worst case, autopilot never detects anything ever and we're no worse off than just having a driver behind the wheel.

          Except that this is not true, and you've stated why yourself:

          Autopilot requires the driver to be attentive.

          We know that the driver is not going to be attentive, so we're already worse off than just having a driver behind the wheel.

          • by Xenx ( 2211586 )
            Obviously I didn't state why myself. Bad drivers will be bad drivers... autopilot has nothing to do with that. Someone not paying attention when using autopilot is the cause of the problem, not the autopilot itself. The problem isn't the autopilot.
            • by Cederic ( 9623 )

              The problem is that the way autopilot is designed, marketed and implemented it's guaranteed to result in inattentive drivers.

              Thus the problem is indeed autopilot.

              Errors the user is forced to make due to shit design are not user error. They're shit design.

              • by Xenx ( 2211586 )
                As are, apparently, cruise control... brake assist.. parking assist.. etc. They facilitate the driver being less attentive to what they're doing behind the wheel and thus deserve every bit as much flak as people like you seem to think autopilot deserves.... and yet they don't get it. Autopilot isn't an exception from any other driving convenience/safety feature. It's more advanced, but mostly just a combination of the other features.
                • by Cederic ( 9623 )

                  Plenty of people refuse to use cruise control exactly because it's dangerous.

                  Combining a number of features without assessing the usage and impact on the driver is shit design. People are dying because of this.

      • A Tesla will happily drive you into a stopped fire truck, or a turning semi trailer, or a freeway divider while you're driving at full speed. So yes, it's actively instigating accidents that humans are pretty good at avoiding. See google.

        Teslas are also pretty good at avoiding those things, but not perfect. Neither are humans. That's why they have those crumple zones on the freeway dividers: humans were crashing into them already. Humans also hit fire trucks, police cars, and ambulances with their lights on or off.

      • If I recall correctly, the turning-semi crash had the brakes fail. The system, or the human, would have resulted in the same accident. The fire-truck problem can be squashed in a patch. The freeway divider was due to incredibly faded lane markings. These things can be fixed easily, unlike humans, who still cause far more fatalities no matter how many 'patches' we apply to them.

    • Unless the autopilot feature is actively instigating accidents, it's impossible for it not to be safer. Anything above and beyond relying solely on driver's response is an improvement, even if only minimally.

      If drivers attempt to use it as "hands free" driving, it probably would be more unsafe than a human driver alone. It is an assist. If huge numbers of drivers are somehow ignoring the training they received when they picked up the car, ignoring the car's warnings about keeping hands on the wheel, and ign

      • by Xenx ( 2211586 )
        Your complaints are pretty much what I would expect to see. They're not without merit, but as you say an attentive driver does cover the weaknesses. In its current stages, as long as the driver is doing what they're supposed to be doing behind the wheel it can only be an improvement. We know people are misusing the feature, but we also know people do a number of dangerous things behind the wheel without it as well.
          What I didn't say, but is buried in there, is how attentive a human will remain for long periods on auto-pilot. Road hypnosis was a thing long before all these assists came about, and people did fall asleep at the wheel. I question if auto-pilot will only bore us more, to the point where inattentive drivers become more common, even if we don't intend it. There are solutions to this too if it is real.

          Speaking for myself, I use it only during a 30 minute commute, it's not really an issue. But for people who are

    • by grumbel ( 592662 )

      The problem with a lot of safety measures is they make people behave less safely. Thus the benefit of the safety measure gets eaten up by people getting into more accidents. In the case of autopilot you don't just augment an attentive driver with additional features, you turn him into an inattentive driver when you give him autopilot abilities. See the recent Uber self-driving death: the driver wasn't paying attention to the road and was fumbling around with the phone. We can blame the driver, but that's ignoring the

      • by Xenx ( 2211586 )
        You don't turn him into an inattentive driver though. Some people are bad drivers, and will try to abuse the system. They would still be a bad driver without autopilot.
    • Unless the autopilot feature is actively instigating accidents, it's impossible for it not to be safer.

      The autopilot is actively encouraging inattentiveness, so yes, it is possible for the addition of autopilot to increase the accident rate.

      • by Xenx ( 2211586 )
        Bad drivers are bad drivers; autopilot has nothing to do with a person's driving habits.
    • by Megol ( 3135005 )

      Did that sound logical to yourself before you posted? It obviously isn't.

      If the presence of autopilot makes the driver worse in any way, even slowing reaction time by on the order of a tenth of a second, it can absolutely be more dangerous.

      And we know from Tesla's own released data that the reaction time in many people is increased, in some to extreme levels (the fellow who decided to sit in the passenger seat while the autopilot controlled the car).

    • Let's see... how about when your hands are on the wheel, and you're paying attention... and it "decides" to do something stupid anyhow??
  • Flaw (Score:2, Insightful)

    by Anonymous Coward

    The system is flawed because it has to rely on a lazy, distracted driver who really doesn't want to drive his own car to begin with.

  • 3.7 times lower (Score:5, Insightful)

    by gumpish ( 682245 ) on Friday May 04, 2018 @09:22PM (#56557148) Journal

    3.7 times lower

    What the hell are they trying to convey here?

    If value A is "2 times less" than value B, does that mean A is 1/2 B? So "3.7 times lower" means a factor of 1/3.7 ?

    While I'm on the subject, there is no unit for "coldness" or "slowness", so please stop saying nonsense like "twice as cold" or "ten times slower" and stick to "half the temperature" and "one-tenth the speed". FFS
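    For what it's worth, the conventional reading of "N times lower" is division by the factor, which would make Tesla's claimed rate about 27% of the average:

```python
# "3.7 times lower than average" read as division by the factor.
average_rate = 1.0                  # normalize the US average rate to 1
tesla_claimed = average_rate / 3.7  # the usual reading of the phrase
print(f"{tesla_claimed:.2f}")       # 0.27 -- about 27% of the average
```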

  • Sorry? (Score:4, Interesting)

    by lucm ( 889690 ) on Friday May 04, 2018 @09:26PM (#56557160)

    "Sorry Elon Musk"? That's the most asswipy headline I've read in a long time. Was this mistakenly submitted to Slashdot instead of Jezebel or Salon?

    • You should see some of silly shit that "Tyler Durden" of Zerohedge constantly spews about Musk and Tesla. The desperation of the shorts seems to be building exponentially.
  • His performance on the earnings call was Enron-like. He needs to step down.
    • by Anonymous Coward

      Bullshit. I listened to that call. He cut off and mocked some bankers whinging about volatility. He's busy getting Model 3 production in hand and doesn't have time to suffer bean counters and day-traders.

  • Sorry for our really fucking lame attempt at writing clickbait headlines..
    I refuse to participate in this discussion because of this shit. What's wrong with "There's No Clear Evidence That Autopilot Saves Lives"? Seriously, I want an answer.
  • Tesla is the most shorted company on the market. About 1/3 of total shares are committed to short positions. This works tremendously to their benefit, as when shorts come due, their holders are forced to buy stock at the prevailing price in order to fulfill their short commitment. And this has previously kited the stock value to its present astronomical value.

    What if Elon Musk were out to create the world's biggest short crunch and further kite the value of Tesla? He might downplay good news, and act like a

  • Stop, for gawd fucking sake, calling the damn thing "autopilot".

    If it can't even *moderately* be relied upon to do the right thing without driver intervention, then it's not "auto" anything... because "auto" means "by itself".

    Call it "driving assist" or something like that... don't put a misleading term in the very name that will suggest to people that it does something it does not.

    While you can go ahead and blame the people for their own foolishness at trusting a technology that by its own admission

  • Just back from Vietnam, I thought to myself that auto pilot would NEVER work there. I also thought that many westerners should head there before getting their license in their home country.
  • Autopilot if followed exactly as Tesla recommends is a system that keeps the car in the lane and brakes when it sees a hazard. It requires the driver to be paying attention. So it doesn't save lives...

    Does that mean that we should remove all the lane assist features and auto braking features of all the other cars as well or is Tesla somehow special in that only Tesla's implementation gets criticised while every other car company gets a free pass?

    Side note: My friend owes his life to his Nissan Qashqai's Int
