Sorry Elon Musk, There's No Clear Evidence Autopilot Saves Lives (arstechnica.com)
Timothy B. Lee writes for Ars Technica: A few days after the Mountain View crash, Tesla published a blog post acknowledging that Autopilot was active at the time of the crash. But the company argued that the technology improved safety overall, pointing to a 2017 report by the National Highway Traffic Safety Administration (NHTSA). "Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40 percent," the company wrote. It was the second time Tesla had cited that study in the context of the Mountain View crash -- another blog post three days earlier had made the same point. Unfortunately, there are some big problems with that finding. Indeed, the flaws are so significant that NHTSA put out a remarkable statement this week distancing itself from its own finding.
"NHTSA's safety defect investigation of MY2014-2016 Tesla Model S and Model X did not assess the effectiveness of this technology," the agency said in an email to Ars on Wednesday afternoon. "NHTSA performed this cursory comparison of the rates before and after installation of the feature to determine whether models equipped with Autosteer were associated with higher crash rates, which could have indicated that further investigation was necessary." Tesla has also claimed that its cars have a crash rate 3.7 times lower than average, but as we'll see there's little reason to think that has anything to do with Autopilot. This week, we've talked to several automotive safety experts, and none has been able to point us to clear evidence that Autopilot's semi-autonomous features improve safety. And that's why news sites like ours haven't written stories "about how autonomous cars are really safe." Maybe that will prove true in the future, but right now the data just isn't there. Musk has promised to publish regular safety reports in the future -- perhaps those will give us the data needed to establish whether Autopilot actually improves safety.
UPDATE (2/16/19): The study's underlying data reveals serious flaws in the methodology that undermine its credibility, according to new analysis from a research and consulting firm.
Errors (Score:5, Informative)
While Timothy normally does excellent articles, his reasoning and logic were severely flawed this time.
First off, the NHTSA report focused on autosteer, not autobraking, so attributing the reduction in accidents to autobraking is bizarre. What NHTSA was actually saying is that it had not examined the entirety of Autopilot (which includes autosteer, autobraking, lane keeping, etc.). Timothy mistakenly thinks the agency was stating that it hadn't verified the effectiveness of autosteer installation in accident reduction (it didn't verify actual usage, but drivers with autosteer installed use it for about 50% of their driving time).
Secondly, Teslas prior to the FSD update already had autobraking, so the 40% reduction in accidents after enabling FSD can't be attributed to the autobraking.
Re:Errors (Score:5, Informative)
Re: (Score:2)
Anybody who read the NHTSA report should clearly understand that the Autopilot safety data comparison was not done to demonstrate the safety of Autopilot, but rather to decide whether there was any indication that AP caused an increase. Also, 2/3 of the cars in the study didn't have any pre-AP data at all. It was entirely useless for the purpose of making any kind of safety claim. NHTSA should not have had to clarify, but too many idiots made stupid claims based on that information. The media in general can be really stupid with statistics.
One thing though,
I thought those numbers were about Tesla autopilot in *almost ideal conditions* vs. people in *all conditions*, no?
Re: (Score:2)
https://static.nhtsa.gov/odi/i... [nhtsa.gov]
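The methodological objection above can be illustrated with a toy before/after comparison. All mileage and crash figures below are invented for illustration; the point is only that when most cars have no pre-installation exposure, the "before" crash rate rests on a small and possibly unrepresentative subsample:

```python
# Toy illustration (all numbers invented) of why a before/after crash-rate
# comparison is fragile when most cars lack "before" data.

cars = [
    # (miles_before, crashes_before, miles_after, crashes_after)
    (0,      0, 50_000, 1),   # delivered with Autosteer: no "before" exposure
    (0,      0, 40_000, 0),
    (0,      0, 60_000, 1),
    (10_000, 1, 30_000, 0),   # only this car contributes to the "before" rate
]

miles_before = sum(c[0] for c in cars)
crash_before = sum(c[1] for c in cars)
miles_after  = sum(c[2] for c in cars)
crash_after  = sum(c[3] for c in cars)

# Crashes per million miles, aggregated across the fleet.
rate_before = crash_before / miles_before * 1_000_000
rate_after  = crash_after / miles_after * 1_000_000

print(f"before: {rate_before:.1f}, after: {rate_after:.1f} crashes per million miles")
# The "before" rate rests entirely on one car's 10,000 miles, so the large
# apparent improvement says little about Autosteer itself.
```

The same arithmetic applied to a fleet where most vehicles shipped with the feature already installed produces a "before" denominator far smaller than the "after" one, which is exactly the imbalance the 2/3-missing-data criticism points at.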
Re: (Score:2)
"While Timothy normally does excellent articles, his reasoning and logic were severely flawed this time. First off the NHTSA report focused on autosteer, not autobraking. Hence his attributing the reduction in accidents to autobraking is bizarre."
No. Not sure what you find bizarre. He simply mentioned that a large part of the reduction could be due to autobraking:
Tesla shipped with Autopilot hardware starting in October 2014. Tesla activated automatic emergency braking and front collision warning in March 2015. Te
How can it not be safer? (Score:3, Insightful)
Re:How can it not be safer? (Score:4, Insightful)
A Tesla will happily drive you into a stopped fire truck, or a turning semi trailer, or a freeway divider while you're driving at full speed. So yes, it's actively instigating accidents that humans are pretty good at avoiding. See google.
I'm a huge fan of Tesla, but their autopilot scheme is a farce.
Re: (Score:3, Funny)
happily drive
What kind of messed up AI have they developed over there?
Re: (Score:3, Insightful)
Re: (Score:1)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But why would the driver stop if he/she bought the car so that the car does it for them?
Why wouldn't you eat your own feces?
Answer to both: Because that would be stupid.
Re: (Score:2)
Re:How can it not be safer? (Score:5, Interesting)
and we're no worse off than just having a driver behind the wheel.
That's demonstrably untrue. Humans are terrible at partial attention. Partial automation (like Tesla's misnamed Autopilot) lulls humans into a state of inattentiveness.
Re: (Score:1)
Re: (Score:2)
more or less does what the feature it's named after does for airplanes
But without the extensive training given to aircrew who operate autopilots. In most cases the autopilot is operated by a person whose job it is to fly the aircraft properly.
Re: (Score:2)
Re: (Score:2)
Hardly, do Tesla require any training at all to use their autopilot?
Re: (Score:2)
Re: How can it not be safer? (Score:1)
Just stop. You're well beyond idiot fan boy now.
If drivers have trouble staying alive because auto pilot does weird and unexpected shit it is Tesla's fault, not the drivers.
The responsibility falls on the seller and creator of this cutesy and horribly misnamed broken technology.
Re: (Score:3)
In some ways, driving with Autopilot (and it will get worse as these systems become more advanced) is like supervising a learner driver. Most of the time (in the limited scenarios it can be fully engaged) it will act like a normal fully-qualified driver but every so often it will fail to react properly or just do something really erratic. If you're the human in charge of the vehicle in those situations, then the role you are fulfilling is more akin to that of a driving instructor than to that of a regular d
Re: (Score:2)
It's not misnamed, regardless of what you want to think. It's very aptly named. As in, it more or less does what the feature it's named after does for airplanes.
You think airplane autopilots need the pilot to take control in a matter of seconds to avoid a crash?
Re: (Score:2)
Re: (Score:2)
Do Tesla's vehicles fly? No? Then it's misnamed.
Re: (Score:2)
Re: (Score:2)
It doesn't matter what auto-pilot means by some exact definition. What really matters is what people perceive it to mean and auto-pilot does not equal lane assist.
Auto, to many people, is short for automatic, and when they hear 'auto-pilot' they think 'automatic driving'. It doesn't matter whether your definition is the more correct one; what matters is that people think the car can drive itself when it clearly can't. Many people will assume that Tesla want them to be alert at all times just because they are
Re: (Score:2)
Re: (Score:2)
Can it fly?
Re: (Score:2)
Re: (Score:2)
No, it's not a plane, it can't fly, so auto-pilot is not the right name for it, and I also stated why it is such a bad name. Nothing pedant-asshole about that.
Re: (Score:2)
Re: (Score:2)
I really don't care, I fully explained why calling it autopilot is a bad idea, I haven't heard any argument debating the points I made.
Re: (Score:2)
Re: (Score:2)
Airplanes don't have to deal with a constant stream of nearby obstacles. So, while it may be appropriate to have an autopilot system in which the pilot needs to remain attentive - their attention is rarely required, and they will almost always be given ample time to get back into the game and avoid a disaster. That is simply not true in a car. So, yeah, it's misleadingly named.
And regardless, the problem with any self-driving technology is that legal liability only comes into effect when it fails. And t
Re: (Score:2)
Re: (Score:3)
#1 reason why I own a manual transmission and never use cruise control. I don't trust myself. It may be decidedly un-American to have a compact car with few gadgets and gizmos, but I always know how fast I'm going in my car with a manual transmission. When I drive automatics, I speed, and when I use cruise control, I zone out.
Emergency response features sound useful, but I don't want a car with half-assed automation. If the car is supposed to drive itself, either do it completely and properly, or not a
Re: How can it not be safer? (Score:1)
Yes, exactly. Agreed. And 99.999% of car buyers have never piloted a plane and know almost nothing about how airplane auto pilot works.
Thank you for inadvertently making the point of all the sane anti-auto pilot people who understand that getting killed by a computer is still getting killed. It is not a glitch we'll fix in the next release. Tesla murdered someone. They -know- auto pilot is fucking broken and kills people and don't care.
Re: (Score:1)
but you'd expect a lane assist feature to work as advertised.
If there's one thing I have learned in the last few years, it's that almost nothing works as advertised. Aside from maybe toilet paper (if you are able to use it right).
If you are lucky it might work as described in an independent review, if the review wasn't bought behind the scenes.
Re: (Score:2)
Autopilot requires the driver to be attentive.
Humans are far worse at being attentive to passive tasks than they are at driving.
Re: (Score:2)
Re: (Score:2)
Driving is supposed to be an active task, even with autopilot.
People are also supposed to avoid having traffic accidents.
Re: (Score:2)
Re: (Score:2)
Are you trying to counter my point with a somewhat related, but contextually irrelevant statement? I said people should be actively involved in the driving process, and you said they should also avoid having accidents. I can only assume you mean people don't avoid having accidents, and so they will not be an active participant in the driving process with autopilot. However, most drivers do in fact avoid having accidents most of the time. I would also venture that most people that use autopilot are also competent drivers.
I think it's a very relevant statement.
Your argument relies on the assumption that people will be just as attentive while supervising the auto-pilot as they are while performing unassisted driving, but that's a false assumption.
Asking someone to pay attention while the auto-pilot is driving is a very difficult task, a far more difficult task than driving. If the auto-pilot makes a mistake there is a strong possibility the human will not be paying close enough attention to catch it. Telling them they're supp
Re: (Score:2)
Re: (Score:2)
You would think wrong. You made a statement. One that is only related to mine in regards to automobiles and accidents. You don't make any point with it at all, you just made the statement. Further, I say that drivers should be attentive. That is a fact regardless of what features the vehicle has. An inattentive driver with autopilot is safer than an inattentive driver without autopilot.
You keep dodging the point I'm making.
A driver is more likely to be inattentive with an autopilot.
And that was the point of my statement, that saying the driver is supposed to be attentive is as useless as saying the driver is supposed to not crash. In both cases human error is a prerequisite of a crash, the question is how circumstances change the likelihood of those errors.
Re: (Score:2)
Re: (Score:2)
You're starting from a faulty assumption yourself. That a driver that is inattentive with autopilot, regardless of whether it's more likely or not, would be a safe driver otherwise.
What? I never said nor implied that. All I'm saying is that the auto-pilot leads a driver to be less attentive. I never said that the auto-pilot was necessarily less safe; in fact, I explicitly said:
Now, just because the human with the autopilot is less attentive and able to respond to emergencies doesn't necessarily mean they have more accidents. The autopilot could be good enough even with the inattentive human it's still safer, but that's far from an obvious conclusion.
I'm just baffled by your resistance
Re: (Score:2)
Its human nature to be less attentive when automation is doing part of the job for you.
Re: (Score:2)
Autopilot requires the driver to be attentive.
Unless you've been stuck in a vacuum your entire life, you'd know that simply specifying something as a requirement does not mean that the requirement is any good.
Monotony and tedium lead to less attentiveness. We've known this for literally decades from studies of factory workers. Turning on the autopilot can reduce the driver's attentiveness. Blaming the resulting inattentiveness on the driver is stupid.
Re: (Score:2)
Re: (Score:2)
No, blaming the driver for not paying attention to the fucking road is common fucking sense.
I really hate it when I accidentally respond to the anti-science crowd. Go read up on the hundreds of studies and trials on the correlation between engagement and inattentiveness.
Re: (Score:2)
Re: (Score:2)
The AP technology seems basically fine. It'll surely help rational drivers drive safely. The problem, to the extent there is one, would seem to be Tesla's marketing which seems loath to acknowledge that AP is just a collection of simple tools that (usually) make driving a bit safer and are neither intended to, nor capable of, driving the car safely by themselves.
I suppose that if you are trying to sell an odd, expensive, vehicle to people with more money than sense, you are likely find that some customer
Re: (Score:2)
Re: (Score:2)
Absolute worst case, autopilot never detects anything ever and we're no worse off than just having a driver behind the wheel.
Except that this is not true, and you've stated why yourself:
Autopilot requires the driver to be attentive.
We know that the driver is not going to be attentive, so we're already worse off than just having a driver behind the wheel.
Re: (Score:2)
Re: (Score:2)
The problem is that the way autopilot is designed, marketed and implemented it's guaranteed to result in inattentive drivers.
Thus the problem is indeed autopilot.
Errors the user is forced to make due to shit design is not user error. It's shit design.
Re: (Score:2)
Re: (Score:2)
Plenty of people refuse to use cruise control exactly because it's dangerous.
Combining a number of features without assessing the usage and impact on the driver is shit design. People are dying because of this.
Re: (Score:2)
A Tesla will happily drive you into a stopped fire truck, or a turning semi trailer, or a freeway divider while you're driving at full speed. So yes, it's actively instigating accidents that humans are pretty good at avoiding. See google.
Teslas are also pretty good at avoiding those things, but not perfect. Neither are humans. That's why they have those crumple zones on the freeway dividers: humans were crashing into them already. Humans also hit fire trucks, police cars, and ambulances with their lights on or off.
Re: (Score:2)
If I recall correctly, the turning-semi crash involved a brake failure; the system, or the human, would have resulted in the same accident. The fire-truck issue can be squashed in a patch. The freeway-divider crash was due to incredibly faded lane markings. These things can be fixed easily, unlike humans, who still cause far more fatalities no matter how many 'patches' we apply to them.
Re: (Score:2)
If used properly
If you used your brain properly you'd understand the fallacy here.
Re: (Score:2)
Unless the autopilot feature is actively instigating accidents, it's impossible for it not to be safer. Anything above and beyond relying solely on driver's response is an improvement, even if only minimally.
If drivers attempt to use it as "hands free" driving, it probably would be more unsafe than a human driver alone. It is an assist. If huge numbers of drivers are somehow ignoring the training they received when they picked up the car, ignoring the car's warnings about keeping hands on the wheel, and ign
Re: (Score:2)
Re: (Score:2)
What I didn't say, but is buried in there, is how attentive a human will remain for long periods on auto-pilot. Road hypnosis was a thing long before all these assists came about, and people did fall asleep at the wheel. I question whether auto-pilot will only bore us more, to the point where inattentive drivers become more common, even if we don't intend it. There are solutions to this too, if it is real.
Speaking for myself, I use it only during a 30 minute commute, it's not really an issue. But for people who are
Re: (Score:3)
The problem with a lot of safety measures is that they make people behave less safely, so the benefit of the safety measure gets eaten up by people getting into more accidents. In the case of autopilot you don't just augment an attentive driver with additional features; you turn him into an inattentive driver when you give him autopilot abilities. See the recent Uber self-driving death: the driver wasn't paying attention to the road and was fumbling around with the phone. We can blame the driver, but that's ignoring the
Re: (Score:2)
Re: (Score:3)
Unless the autopilot feature is actively instigating accidents, it's impossible for it not to be safer.
The autopilot is actively encouraging inattentiveness, so yes, it is possible for the addition of autopilot to increase the accident rate.
Re: (Score:2)
Re: (Score:2)
Did that sound logical to yourself before you posted? It obviously isn't.
If the presence of autopilot makes the driver worse in any way, even slowing reaction time by a tenth of a second, it can absolutely be more dangerous.
And we know from Tesla's own released data that reaction time in many people is increased, in some to extreme levels (like the fellow who decided to sit in the passenger seat while the autopilot controlled the car).
Re: How can it not be safer? (Score:2)
Re: (Score:2)
One car driven by a human swerves to avoid another car being driven by a human and happens to hit a self-driving car. Your point?
Flaw (Score:2, Insightful)
The system is flawed because it has to rely on a lazy, distracted driver who really doesn't want to drive his own car to begin with.
3.7 times lower (Score:5, Insightful)
What the hell are they trying to convey here?
If value A is "2 times less" than value B, does that mean A is 1/2 B? So "3.7 times lower" means a factor of 1/3.7 ?
While I'm on the subject, there is no unit for "coldness" or "slowness", so please stop saying nonsense like "twice as cold" or "ten times slower" and stick to "half the temperature" and "one-tenth the speed". FFS
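The two readings the parent distinguishes can be checked with a trivial sketch; the baseline crash rate below is a made-up figure purely for illustration:

```python
# Checking what "3.7 times lower" actually means under the sensible reading:
# divide the baseline by the factor. The baseline figure here is invented.

average_rate = 1.85  # hypothetical fleet-average crashes per million miles

# Reading: "3.7 times lower" = baseline divided by 3.7.
tesla_rate = average_rate / 3.7

# Expressed unambiguously, as the comment prefers: a fraction of the baseline.
tesla_fraction = tesla_rate / average_rate  # 1/3.7, i.e. roughly 0.27

print(f"Tesla rate: {tesla_rate:.2f} per million miles, "
      f"or {tesla_fraction:.0%} of the average")
```

Phrased as "about 27% of the average crash rate," the claim is unambiguous; "3.7 times lower" forces the reader to guess which operation was meant.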
Sorry? (Score:4, Interesting)
"Sorry Elon Musk"? That's the most asswipy headline I've read in a long time. Was this mistakenly submitted to Slashdot instead of Jezebel or Salon?
Re: Sorry? (Score:2)
Musk needs to step down (Score:1)
Re: (Score:1)
Bullshit. I listened to that call. He cut off and mocked some bankers whinging about volatility. He's busy getting Model 3 production in hand and doesn't have time to suffer bean counters and day-traders.
Re: Musk needs to step down (Score:2)
Sorry Elon Musk? (Score:2)
I refuse to participate in this discussion because of this shit. What's wrong with "There's No Clear Evidence That Autopilot Saves Lives"? Seriously, I want an answer.
Tesla's Current Stock Short Situation (Score:1)
Tesla is the most shorted company on the market; about 1/3 of total shares are committed to short positions. This works tremendously to Tesla's benefit: when shorts come due, their holders are forced to buy stock at the prevailing price to fulfill their short commitments, and this has previously kited the stock to its present astronomical value.
What if Elon Musk were out to create the world's biggest short crunch and further kite the value of Tesla? He might downplay good news, and act like a
Re: (Score:2)
Re: Tesla's Current Stock Short Situation (Score:2)
Re: Tesla's Current Stock Short Situation (Score:2)
If I could say just one thing to Musk, it would be (Score:2)
Stop, for gawd fucking sake, calling the damn thing "autopilot".
If it can't even *moderately* be relied upon to do the right thing without driver intervention, then it's not "auto" anything... because "auto" means "by itself".
Call it "driving assist" or something like that... don't put a misleading term in the very name that will suggest to people that it does something it does not.
While you can go ahead and blame the people for their own foolishness at trusting a technology that by its own admission
Re: (Score:2)
Vietnam (Score:1)
What is autopilot? (Score:2)
Autopilot if followed exactly as Tesla recommends is a system that keeps the car in the lane and brakes when it sees a hazard. It requires the driver to be paying attention. So it doesn't save lives...
Does that mean that we should remove all the lane assist features and auto braking features of all the other cars as well or is Tesla somehow special in that only Tesla's implementation gets criticised while every other car company gets a free pass?
Side note: My friend owes his life to his Nissan Qashqai's Int