Tesla Model S Plows Into a Fire Truck While Using Autopilot (cnbc.com) 345
On Monday, a Tesla Model S plowed into the back of a fire truck on a freeway near Culver City, California. The driver is claiming the car was on Tesla's Autopilot driver assistance system. As a result, the National Transportation Safety Board will be investigating both driver and vehicle factors. CNBC reports: The Culver City Firefighters Association Local 1927 union chapter tweeted out a picture of the crash on Monday afternoon. The firetruck was on the freeway helping after a motorcycle accident, the union said in an Instagram post. The post said there were no injuries. The outcome could have been much worse if firefighters had been standing at the back of the truck, Battalion Chief Ken Powell told the San Jose Mercury News. "Autopilot is intended for use only with a fully attentive driver," Tesla said in a statement sent to CNBC.
Defense: it was drunk (Score:3)
Re:Defense: it was drunk (Score:4, Insightful)
His defense didn't work because: "Autopilot is intended for use only with a fully attentive driver"! Same thing is said in TFS of this article with the fire truck story.
If you think about it, a "fully attentive driver" ready to take control at any time seems to need driving-school-instructor skills: the instructor can take control of the car if the student screws up, and driving school instructors need more skills than a casual driver. It seems to me like being able to take over on the fly at any time might be harder than when you already have control in the first place.
Does driving a Tesla require a driving school instructor license? Maybe it should if it doesn't...
Re:Defense: it was drunk (Score:5, Insightful)
what the hell is the point of having an automatic driving system if you have to sit there waiting for that split second between when you realize the autopilot isn't working and when the accident occurs?
It's not an automatic driving system. It's just Tesla marketing that implies it is. Their disclaimer says it's not.
Re: (Score:3, Informative)
Kind of funny... "The release of Tesla Version 7.1 software continues our improvements to self-driving technology" (from their announcement of version 7.1 of their software). Seems they do in fact call it self driving.
Re: (Score:2)
Approaching a fire truck parked on the highway gives you a LOT more than a split second to apply the brakes.
Re:Defense: it was drunk (Score:4, Insightful)
Approaching a fire truck parked on the highway gives you a LOT more than a split second to apply the brakes.
I think the Automatic Emergency Braking may have kicked in. It doesn't look like the airbags were triggered, and I would have expected a LOT more damage for a collision at 65 mph.
Re:Defense: it was drunk (Score:5, Interesting)
More to the point, I doubt it will turn out that Autopilot was even on. "Autopilot crashed me" is the best excuse bad drivers have ever been given. And people automatically take it at face value, until the logs get examined.
Re: (Score:2)
People drive where they are looking. He was looking at the lights. That's why so many cops get hit on the side of the road.
And yeah, that's not 65mph into a wall crumple. If the bags didn't fire then he wasn't going fast enough to die.
Re: (Score:3)
More to the point, I doubt it will turn out that Autopilot was even on. "Autopilot crashed me" is the best excuse bad drivers have ever been given. And people automatically take it at face value, until the logs get examined.
The problem is that the system is being examined by Tesla, not a third party. Because of the proprietary nature of the system, we're relying on Tesla to tell us Tesla hasn't fucked up. No matter if you like or dislike Tesla, that is a conflict of interest.
However, that is not an issue in this case. The law is clear; all a judge will do is ask:
Judge: Did you turn on the "autopilot" system.
TWAT: Yes.
Judge: Then you were in command of the vehicle, you are responsible for what happened.
It doesn't matter
Re:Defense: it was drunk (Score:5, Funny)
There's no warning against putting your dick in the cigarette lighter either*. Some things are just common sense.
* I haven't checked in California.
Re: (Score:2)
Considering that my lighter socket is deep in a cubby hole..... Damn!
Re: (Score:3)
You can be damn sure your car's owner's manual mentions that you need to remain aware of your surroundings when using cruise control. I suspect there might even be big, bold warnings and exclamation marks in triangles around it.
Re:Defense: it was drunk (Score:5, Insightful)
So this excuses it from being safe?
It isn't clear if it is safe or not. This guy claimed Autopilot was engaged, but I am skeptical. In other Autopilot failures there were explanations, like projections above the cameras' field of view, or a lorry exactly the color of the sky. But in this case it just plowed into a firetruck for no apparent reason. That is a pretty big bug to have gone unnoticed until now.
Re: (Score:2)
I'm thinking the same thing and the owner is just saying so to try and CYA.
Re: (Score:2, Insightful)
But in this case it just plowed into a firetruck for no apparent reason.
It seems like there's a bug, either way.... the Tesla is buggy.... these cars are supposed to have automatic emergency breaking, aren't they?
Explain how crashing into a firetruck at 65 MPH happens without automatic emergency breaking having also failed......
Re: (Score:3)
"automatic emergency breaking" [sic] doesn't mean stopping at 65, it means getting it down to a survivable speed. Notice the air bags didn't fire. That means it was at a survivable speed. That crumple in the picture doesn't look like 65mph into a wall.
Re:Defense: it was drunk (Score:5, Funny)
Oh, it definitely broke.
Re:Defense: it was drunk (Score:5, Insightful)
Seems to be the fundamental issue that everyone is missing. There is no way that the Tesla was going 65 at impact. The crush zone is barely impacted, the fire truck looks barely dented. At most that looks like a 7-10 MPH hit. Which means if the Autopilot was engaged, it was doing its best to stop.
At 65 MPH, that Tesla would have been buried under that red truck up to its A-pillars at a minimum, if not the B-pillars.
Re:Defense: it was drunk (Score:4, Informative)
This isn't the shoulder, this is the carpool lane on the 405. Notice the double yellow lines on the far side and single yellow on the nearside. Somewhere around 33.990053, -118.400939.
Re: (Score:3)
I'd say more rigorous testing, use of mature technology and code review? You know, the things that typically don't happen in consumer grade crapware, but it kind of does work in aerospace, spaceflight and military systems. Mostly anyway.
Should have said the brakes failed (Score:2)
so as not to look as stupid as he/she is
Now the idiot driver wrecked their Tesla and they have to pay for damages to the fire truck
Re: (Score:2)
Now the idiot driver wrecked their Tesla and they have to pay for damages to the fire truck
If autopilot turns out to have been on, then Tesla will probably wind up having to pay for damages to a fire truck.... AND replacing that poor idiot driver's car.
Too much delta-v? (Score:2)
I don't own a Tesla, but I know my own car's auto brake system doesn't gracefully slow to a stop unless the speed difference between my car and whatever is in front of me is less than 30 MPH.
If the delta-v is more than 30 MPH, my car will do a "panic stop" thing to slow the car down, but it'll be too late to avoid a collision.
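For anyone curious, the 30 MPH closing-speed cutoff falls out of simple braking physics. Here is a back-of-the-envelope sketch (all numbers used, the detection range, latency, and deceleration, are illustrative assumptions, not any manufacturer's spec):

```python
import math

MPH_TO_MS = 0.44704

def impact_speed_mph(closing_mph, detect_range_m=50.0,
                     latency_s=0.3, decel_ms2=8.0):
    """Rough impact speed (mph) if the car brakes at max deceleration
    after a short reaction latency. All parameters are assumptions."""
    v = closing_mph * MPH_TO_MS
    remaining = detect_range_m - v * latency_s  # distance left once braking starts
    if remaining <= 0:
        return closing_mph                      # hit before the brakes even apply
    v2_at_impact = v * v - 2.0 * decel_ms2 * remaining
    if v2_at_impact <= 0:
        return 0.0                              # stops in time
    return math.sqrt(v2_at_impact) / MPH_TO_MS

print(round(impact_speed_mph(30), 1))  # low closing speed: full stop
print(round(impact_speed_mph(65), 1))  # high closing speed: still hits, but much slower
```

With these made-up numbers a 30 MPH closing speed stops cleanly, while a 65 MPH one still hits at roughly half the original speed, which is consistent with the "panic stop that's too late" behavior described above.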
Re: (Score:2)
The problem is that people assume that autopilot means they don't need to think of things like delta-v.
Re: (Score:3)
The problem is that people assume that autopilot means they don't need to think of things like delta-v.
The problem is that most people think "delta-v" is about Delta planes flying in a "V" formation.
Re: (Score:2)
I don't own a Tesla, but I know my own car's auto brake system doesn't gracefully slow to a stop unless the speed difference between my car and whatever is in front of me is less than 30 MPH.
If the delta-v is more than 30 MPH, my car will do a "panic stop" thing to slow the car down, but it'll be too late to avoid a collision.
Maybe, but it seems that the people driven cars also on the road were able to avoid that collision.
Re: Too much delta-v? (Score:2)
That's kind of my point (and Tesla's too): the driver is responsible.
On an aircraft, the pilot is still responsible for flying the aircraft. If the autopilot flies the plane into a mountain, it's still labeled pilot error. There's no absolving the guy at the controls.
Re: Too much delta-v? (Score:2)
Apologies for forgetting about slashdot and UTF-8.
Re: (Score:3)
That's kind of my point (and Tesla's too): the driver is responsible.
On an aircraft, the pilot is still responsible for flying the aircraft. If the autopilot flies the plane into a mountain, it's still labeled pilot error. There's no absolving the guy at the controls.
Yes, we can conclude two things from this event;
1) The driver is at fault for not paying attention, and can be held legally responsible
2) Tesla Autopilot is not yet advanced enough to prevent a car from ramming into a parked truck on its own.
What we can also consider is that by enabling drivers to reduce their driving concentration, and even take hands off the wheel, the Tesla Autopilot feature is a contributing factor to the accident. Not in legal terms, but in pure cause analysis terms.
Re: Too much delta-v? (Score:2)
The only question, I think, is whether these autonomous systems compensate for human inability to pay attention *more* than they encourage inattention.
Used properly, it can be very useful. As it stands, I think of the current technology as being like a horse saying "Even I am not *that* dumb".
If nothing else, the big problem is this: marketing requires a very short term to describe a feature. That's how 'net neutrality' became the term for a subject that takes several paragraphs to explain adequately.
Like
Intended use (Score:2)
I'm kind of wondering what the purpose of autopilot is since it's only to be used by a fully attentive driver. What benefit does it add?
And if no benefit, why is it in the car in the first place, since it obviously acts as a lure for those who aren't fully attentive?
Re: (Score:3)
Because you have to walk before you can run. When you buy/enable the feature you opt in to Tesla mining the data so we can someday actually have autonomous cars.
Re: (Score:2)
I'm kind of wondering what the purpose of [blank] is since it's only to be used by [blank]. What benefit does it add?
The answer, as always, is kinky stuff and/or porn.
Re: (Score:2)
I'm kind of wondering what the purpose of autopilot is since it's only to be used by a fully attentive driver. What benefit does it add?
And if no benefit, why is it in the car in the first place, since it obviously acts as a lure for those who aren't fully attentive?
It allows Tesla to claim they have the first self-driving car.
Re:Intended use (Score:4, Insightful)
What benefit does it add?
Its benefit is if the driver treats it like an extra set of eyes, able to take corrective action when the driver fraks up. If the driver thinks it'll drive for him, you're right, it's a bad idea. (If you know anything about an aircraft's autopilot, you know it does not mean the flight crew is playing "I spy [youtu.be]" while the plane does all the work).
* Blind Spots. You wanna change lanes or merge into traffic. So you check your blind spot, and glance away from the road in front for a fraction of a second. Problem is, somebody else just cut you off and stomped on the brakes. (Or somebody cut off the guy in front of you, and he stomped on the brakes.) In either case, the car starts braking before you know there's a problem.
* Blind Spots part II: We aren't paying as much attention as we think we are. The reality is humans suck at paying attention, we have mountains of data to prove it, and that's why we pay big bucks to watch "Magicians" and "Illusionists" perform.
* Blind Spots, part III: We're effectively blind for the fraction of a second while our eyes move from one focus point to another. That matters more than you'd think. The "I didn't see it coming" excuse doesn't even require a distraction... just glance at the road sign for a second.
* Distractions [safestart.com]: A Pennsylvania insurance company found that 62% of accidents were caused by somebody being "lost in thought". Humans suck at paying attention.
* Another one I didn't appreciate until I got a car with a similar system: The car handles the gas pedal, and I cover the brake pedal with my foot. Wild animals (deer, moose), pets, children, and even adults jump in front of cars all the time. My car (not a Tesla) won't react until something is in my lane, so there's a chance I'll react first.
Re: (Score:3)
* Adaptive Cruise Control: One less task to pay close attention to, and it reduces road rage. Teaches proper following distance.
* Rear Assisted Braking / Camera: Get sandwiched between two vans when parking and trying to see out? I don't know how I lived without this. No more inching out and hoping.
* Blind Spot Alert: Nice little yellow light that, if I have my blinker on, beeps at me to let me know there is a car at my 8 o'clock.
* Lane Keeping Assist: So I was checking out the hottie walking the dog. Giv
Re: (Score:2)
All good questions that Elon Musk should be asking himself about now.
Given that he has carefully constructed a very successful brand that is practically synonymous with "Electric Car" for many people despite other popular models existing, why is he adding these useless and potentially harmful anti-features that can only damage this brand's reputation?
I realize there's a strong pro-self-driving-vehicle lobby here on /. but they conveniently ignore the need for and complete lack of software that, despite what
Re: (Score:2)
I think our implicit model of what the brain is doing when we talk about being "fully attentive" is too simplistic. Who hasn't found themselves thinking about something while they drive and at least starting to go to the wrong place out of habit? Clearly when you do that you are not being "fully attentive", and yet your brain is constantly monitoring for dangers and obstacles, adjusting the car, even navigating.
Here's another thing I've noticed. On a long drive, especially in difficult conditions, yo
Re: (Score:2)
Adaptive Cruise Control is something I don't know how I've lived without. My new Outback has it and the five mile drive on a semi-rural two lane road to civilization is now so much better, no more micro road rage. Just set and follow. Best thing since women that bake bread at home. (There is nothing better than a woman that bakes bread for you!)
That and the mild lane assist, just kind of nudges you back in lane, is all I really want from "auto pilot" in a car. I'm getting older and know that I'm not beco
Re:Intended use (Score:5, Insightful)
It's the difference between constant, can't-miss-a-second attention vs check-it-once-a-minute attention.
"Fully attentive" means can't-miss-a-second, not check-it-once-a-minute.
My brain wanders more, I'm able to glance to the side for a few seconds to look at something interesting on the road and I'm not constantly adjusting speed/steering.
In other words, you are part of the dangerous problem: people who don't understand that you need to pay full attention to driving when on autopilot. You should not be on the road, because your disregard of Tesla's instructions means you're not just a danger to yourself, but to everybody else too.
Don't let Tesla off the hook. (Score:2, Insightful)
You are right about what fully attentive means, but there's no reason for Tesla or anyone else to think that that is the behaviour they're encouraging.
1) this "autopilot" idea makes it harder to pay attention because you aren't actually doing anything interactive, and
2) it's just not that easy to suddenly take control of something in a split second when you aren't already in the mindset of controlling the vehicle.
I prefer what some other manufacturers are doing: have the driving assistance leap into action
Re: Intended use (Score:2)
Regardless of blame, Autopilot is probably a net safety improvement for the majority of Tesla owners who use it. They might not be as attentive as they're officially supposed to be when using it, but the car is probably more attentive and adept at avoiding accidents than the nominal human driver would be in real life.
The fact is, being in or near a stopped vehicle on a limited-access road with high-speed freely-flowing traffic is EXTREMELY dangerous under ANY circumstances. Humans don't expect to encounter
Re:Intended use (Score:4, Insightful)
It's the difference between constant, can't-miss-a-second attention vs check-it-once-a-minute attention.
If you have not looked where you are going for a minute then, at 60 MPH, you will have travelled one mile -- you would not have seen that fire truck even if you had tried! Driving like that is what causes accidents like this.
A paradox of safety features like this is that they give drivers more confidence to push the car to its limits; before anti-skid brakes people were much more cautious on wet surfaces than they are today. I remember this being discussed on Radio 4 (England) some 20 years ago; the tongue in cheek comment was that the best way of reducing accidents would be to put a large, sharp spike above the dashboard pointing at the driver's head; the driver would then be careful enough to avoid any accident.
Re: (Score:2)
Driving the old pre-1968 VW Microbus made you feel that way. You never tailgated in those things. Your knees were only about 6 inches from the front of the bumper.
Re: (Score:2)
You are part of the problem.
STOP calling it Autopilot!!!!!! (Score:5, Insightful)
Hey Tesla, how about you STOP calling it autopilot. It's NOT autopilot. You don't get into the car and say "Ok Tesla, let's go to the pharmacy" and then sit back and enjoy the ride while the car drives you there.
Call it "Driver Assist" as in the driver is watching what's going on around them like they should and let the car keep itself within the lane and not bump into other cars while driving.
You set a high expectation with drivers when you keep calling it "Autopilot". Stop it.
Re: (Score:2)
That's OK; there's no need for them to understand what aircraft autopilots do. It's very clear in the Tesla manual that you can't just let the car drive itself. These incidents will just help publicise that fact more. It will be very interesting to see if this driver tries to claim that he didn't know that he had to continue to pay attention with autopilot.
Also, looking at the pictures of the crash, there's no way that Tesla hit at 65mph. It will be interesting also to hear how much it had slowed and wh
Re: (Score:2)
That's as stupid as saying that because a toddler doesn't know what "dead" means, the only reasonable course of action is to call it sleep.
We all have to grow up sometime.
Re: (Score:2)
That sounds nice but the analogy is terrible. We have decided that "pilot" means a person who operates the flying controls of an aircraft. We have also decided that "autopilot" is short for "automatic pilot".
I wouldn't put my child on this [imgur.com] and call him a pilot any more than I would call someone driving a Tesla a pilot.
Meh...
Re: (Score:2)
It's an autopilot like in an aircraft, that still requires a human pilot(s) to be a systems manager.
Airplanes fly in the sky. The sky is generally pretty empty.
Asking the human to start paying attention again because something unexpected is going on is a bit more feasible.
Re: (Score:3, Insightful)
Hey Tesla, how about you STOP calling it autopilot. It's NOT autopilot.
Hey GM, how about you STOP calling it cruise control. It is NOT cruise control. Call it "Speed Assist".
Seriously, this is one of the dumber arguments against the name autopilot I've heard. It is almost exactly equivalent to a plane's autopilot system.
The fire department is sensationalizing it (Score:4, Interesting)
The tweet is on what appears to be an official twitter account. But, it claims the vehicle was traveling at 65 mph when it struck???
Firemen with any experience at all have usually worked a few highway crashes. Anyone with a clue as to what striking a near immovable object (as demonstrated by the mostly superficial damage to the truck) at 65 mph does to a modern vehicle with all sorts of built-in crumple zones can tell at a glance that this collision occurred at a far slower speed than 65 mph. I'd be surprised if it was even 40mph. It does not even appear that any of the Tesla's glass cracked. And the damage to the truck appears to be at a surface level. I wonder if the airbags deployed?
As public officials, these folks need to be much more responsible in what they tweet. Hopefully, responsible officials will correct the record and at least chastise whoever posted the tweet after reviewing the crash data.
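The skepticism about the 65 mph claim is easy to quantify: the energy a crash has to dissipate grows with the square of speed, so a 65 mph impact involves roughly 2.6 times the energy of a 40 mph one. A rough sketch (the Model S mass figure here is an assumption, not an official spec):

```python
MPH_TO_MS = 0.44704

def kinetic_energy_kj(mass_kg, speed_mph):
    """Kinetic energy in kJ. Crash damage tracks the energy that must be
    dissipated, which scales with the square of the impact speed."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v / 1000.0

# ~2,100 kg is an assumed curb weight for a Model S.
for mph in (20, 40, 65):
    print(mph, "mph ->", round(kinetic_energy_kj(2100, mph)), "kJ")
```

The v-squared scaling is why superficial damage to the truck and an intact crush zone argue for a much lower impact speed than 65 mph.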
Always thought it was weird that (Score:5, Insightful)
In order to take instantaneous control if needed
Just my 2 cents
In Addition, A Red Stopped Vehicle (Score:3, Insightful)
Pretty bad oops, in their self driving code! And their product in general!
Let me see, what was the first Tesla death? The Tesla mistook the white side of a semi trailer for the sky?
Just my 2 cents
Re: (Score:3)
My Outback with EyeSight would have avoided both. Of course, it also beeps at you if you take your hand off the wheel for more than about 10 seconds. Because it's, you know, a driver assistance package.
Re: (Score:3)
Actually something like this reminds me of an even earlier Tesla accident where the investigation went something like:
Driver: "It was on autopilot!"
Investigator: "Tesla, was it on autopilot?"
Tesla: "No."
Driver: "Ok I lied, wait, how did you even know about that? Help help I'm being oppressed".
Misleading Headline (Score:2)
Poor Choice of Name (Score:2)
Autopilot is intended for use only with a fully attentive driver
Then why the hell are they calling it Autopilot?
Because that's what Autopilot means? (Score:2)
Autopilot still not working? (Score:2)
I would have thought they would have worked all the bugs out of Autopilot by now. After all they've been working on it for 40+ years as seen in this documentary clip:
https://www.youtube.com/watch?... [youtube.com]
Devil's Advocate (Score:2)
I don't think it can be realistically expected that drivers running autopilot will immediately react to poor choices being made by the car. Continuously monitoring an automated system that works well almost all the time is massively boring. There's just no way that anyone but the most OCD is going to continuously maintain the level of attention that they would deliver if they were actually driving.
Such an autopilot is totally useless. (Score:3)
Think of locomotive engineers!
All they have to do is watch the speed, grade and signals.
And the number one problem for them? Boredom. They fall asleep. They nod off. Railroads have been adding more and more devices to check the alertness of the drivers. The dead man's treadle is what, a hundred years old? Now with computers they are thinking of creating a challenge and response system to avoid drivers responding mechanically.
If the autopilot is going to steer and the documentation says "driver must be fully attentive", it is time they add a dead man's treadle and a host of devices to make sure there is a fully attentive driver there.
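The challenge-and-response idea the comment describes can be sketched as a simple state machine. The thresholds below are invented for illustration and are not Tesla's (or any railroad's) actual values:

```python
# Toy sketch of a driver-alertness watchdog: nag after a period with no
# driver input, then disengage assistance if the driver still doesn't
# respond. All thresholds are made-up illustrative values.
NAG_AFTER_S = 10.0
DISENGAGE_AFTER_S = 25.0

def watchdog_state(seconds_since_input):
    """Return the watchdog's action given seconds since the last
    steering-wheel input."""
    if seconds_since_input >= DISENGAGE_AFTER_S:
        return "disengage"   # e.g. slow down, hazards on, hand back control
    if seconds_since_input >= NAG_AFTER_S:
        return "nag"         # e.g. chime plus a dashboard warning
    return "ok"

print(watchdog_state(3))    # ok
print(watchdog_state(12))   # nag
print(watchdog_state(30))   # disengage
```

A real system would also escalate the challenges (torque on the wheel, visual prompts, audible alarms) rather than rely on a single timer, precisely to avoid the mechanical, reflexive responses the comment warns about.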
Re: (Score:3)
I own a 1960 Chrysler with one of the very first cruise controls. It pushes up from under the gas pedal. You have to keep your foot on it.
That's a conservative design. WTF happened? Believing your own BS is a trap.
Does Tesla Autopilot improve at lower speeds? (Score:2)
I was wondering if at lower speeds Tesla Autopilot is able to make better decisions...
Re: (Score:2)
If it was like the idiot in Kansas or Oklahoma, more likely the driver was driving and texting or doing something else you're not supposed to do while driving.
Re: (Score:2)
Because it isn't actually what most people would consider an autopilot and doesn't handle all situations. The truck was stopped and the car was driving at highway speed, and Tesla is probably not handling coming up on a fully stopped vehicle at that speed.
Re: Well... was the driver lying? (Score:2)
Except all/most auto braking implementations guarantee stopping before impact ONLY if the velocity difference is small enough. Tesla or any other system cannot make a full stop from highway speeds in time.
Re: (Score:2)
I would think that at highway speeds it would slow down sooner: basically, start braking as soon as a stationary (or nearly stationary) obstacle is detected directly ahead, while the speed at which the vehicle is going still leaves a safe stopping distance.
If you are a safe following distance behind a car and it suddenly stops, why on earth would it not be able to detect that? If a moose or bear happens to be crossing the road, are you saying it would not see it and try to slow down as quickly as possible? It's obvious
Re: (Score:2)
The problem is that by the time you realize the autopilot isn't going to stop in time, it's too late for you to stop as well.
Re: (Score:3)
I've noticed this with auto-pilot. It's good at following cars that are moving, but if I'm driving down a road and there's a car stopped in front of me (for instance, at a red light), it seems to continue at full speed way past the point where I'd start slowing down when driving manually.
I've not let it do its thing yet; instead I'll take over.
It seems they've calibrated it to function like most adaptive cruise controls in that it's great at matching the speed of a car it's already tracking in front of you. But it isn't
Re: (Score:3)
If no defects were found in the autopilot system, then why did the car crash?
Ignoring the fact that drivers are supposed to be paying attention while autopilot is engaged, I can see no reason, if the driver is telling the truth, that the car would not have slowed to a stop instead of hitting the truck at full speed.
I think it is more likely that the person is either lying (or mistaken) about autopilot being engaged or they were doing something else to override the autopilot's normal function.
The Tesla owner's manual says that Traffic-Aware Cruise Control, which is part of the Autopilot system, "cannot detect all objects and may not brake/decelerate for stationary vehicles"
Re: (Score:3)
Ah.... well that's a major pitfall right there. If it can't brake or detect for stationary vehicles, will it also fail to brake or detect someone, perhaps a child, who runs out in front of traffic?
I was under the impression that all modern collision avoidance systems are more than capable of handling this... if Tesla's cannot even manage this detail then their so-called autopilot is, to put it bluntly, a piece of shit.
Re: (Score:2)
Ah.... so it's untrustworthy when it's also the most likely to be deadly.
Good to know.
I thought stuff like this was supposed to *save* lives...?
Re:Well... was the driver lying? (Score:5, Interesting)
If no defects were found in the autopilot system, then why did the car crash?
The "no defects were found" is from the fatal crash a couple of years ago, and there were several contributing factors, outside of the autopilot.
That said: I don't drive a Tesla, but my car has a similar adaptive cruise control and auto-braking system. On my (non-tesla), I can easily see how somebody not familiar with it would think "Oh, I have the system engaged, the car will stop."
The reality is that it'll only stop if the difference in speed between my car and the object in front of me is less than 30 MPH. Drivers must go to the effort of learning the car's systems in order to know that. (And the learning comes from the manufacturer's YouTube videos, The Fine Manual, and the dealership's guy whose only job is to teach customers about it, and who said it at least a dozen times...)
I've been in more than a few situations where I can see traffic is stopped ahead, but my car continues accelerating towards them -- I'm accelerating past 50 MPH, while they're at a dead stop, 50 meters ahead.
Honestly, it feels like my brain is breaking every time: "Why isn't the car slowing down? Oh yeah, dummy! I gotta do it this time!"
So with my experience in a similar system on an entirely different make/model, I'm willing to bet the guy could have had autopilot engaged, but he didn't learn (for whatever reason) its limitations.
Re:Well... was the driver lying? (Score:4, Insightful)
That is a strikingly severe limitation... one that I had not heard about previously. Is this actually deliberate? I cannot fathom how it could be the best we can do technologically.
Re:Well... was the driver lying? (Score:4, Interesting)
My Outback with EyeSight will do its damnedest to brake in that situation, because the engineers understand that slower is better than doing nothing. I've tested up to about 45mph with cardboard boxes. Very strange feeling. Amazing what they can do with two cameras, even in PNW rain.
Re: (Score:2)
I know exactly what you mean. But I think that is more to do with the specs of how far the EyeSight can reliably discern an object without producing false positives. But hey, at least it stops!
The only false positive I've had going forward is a slight braking (and beeping) when a plastic shopping bag flew in front of the windshield, but that was just for a moment. Better that than mistake a kid.
Re: Well... was the driver lying? (Score:2)
Well, if you can work out how a computer vision system can easily tell the difference between a car in a different lane on a bending road, versus a car stopped in the lane in front of you, you're a better developer than I.
Google's best can't tell the difference between primates and humans with a similar skin color, so I do not feel too bad.
I'm willing to bet they have the limitation to avoid unnecessary slamming on the brakes for no reason.
(My car has a much simpler system, and doesn't claim otherwise. It's
Re: (Score:2)
Drivers must go to the effort of learning the car's systems in order to know that.
Yikes - maybe we need a new class of license for people to drive a car with advanced but not fully autonomous features like these.
I always cringe when I see drivers not looking at the road for too long. This often happens in movies, but all too often it happens in real life as well. If I have to watch the road all adaptive cruise and auto-braking would do for me is give my foot a rest. That's where conventional cruise control is great (giving my foot a rest) on long trips but I still have to pay attent
Re: Well... was the driver lying? (Score:2)
Forget what Tesla does. Think of used car salesmen.
Re: (Score:2)
If no defects were found in the autopilot system, then why did the car crash?
If Volkswagen can cheat at environmental tests for about 8-10 years, then I'm not confident that complex autopilot systems are thoroughly examined. As far as I know they aren't even using formal verification [wikipedia.org] to prove the correctness of the autopilot software or even its subsystems.
Re:Well... was the driver lying? (Score:5, Interesting)
One, the other big notable accident was also with a vehicle with high ground clearance. At the time it was suggested that the system's sensors were basically counting on something relatively close to the ground, and would miss things as they approach 'decapitation level'.
I will say I am highly skeptical that the car slammed in at full 65 mph into a stopped fire truck. I got rear ended while I was going about 15 mph (traffic jam) by a car that was going about 60, and there were injuries and both cars were in much worse shape than the Tesla pictured (both cars totaled, frames bent so bad that no doors able to open without prybars), and that's with both cars having crumple zones, whereas the fire truck didn't yield much at all and the Tesla had to take the vast majority of the energy of the impact. Also, the Model S is a pretty heavy car, so there had to be a lot of energy in that collision.
Re: (Score:2)
NOTE: I fully understand that Tesla's 'autopilot' feature isn't a full-on self-driving car.
I really wish more people would understand this. Pretty much everyone I talk to, who only casually observes Tesla headlines, assumes that it is full-on self-driving. I'm sick and tired of having to constantly explain to them that it isn't.