Transportation Software United States Hardware Technology

Tesla Model S Plows Into a Fire Truck While Using Autopilot (cnbc.com) 345

On Monday, a Tesla Model S plowed into the back of a fire truck on a freeway near Culver City, California. The driver claims the car was on Tesla's Autopilot driver assistance system. As a result, the National Transportation Safety Board will be investigating both driver and vehicle factors. CNBC reports: The Culver City Firefighters Association Local 1927 union chapter tweeted out a picture of the crash on Monday afternoon. The fire truck was on the freeway helping after a motorcycle accident, the union said in an Instagram post. The post said there were no injuries. The outcome could have been much worse if firefighters had been standing at the back of the truck, Battalion Chief Ken Powell told the San Jose Mercury News. "Autopilot is intended for use only with a fully attentive driver," Tesla said in a statement sent to CNBC.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by xxxJonBoyxxx ( 565205 ) on Tuesday January 23, 2018 @06:22PM (#55989535)
    https://tech.slashdot.org/story/18/01/22/2311225/tesla-owner-attempts-autopilot-defense-during-dui-stop
    • by ls671 ( 1122017 ) on Tuesday January 23, 2018 @06:55PM (#55989737) Homepage

      His defense didn't work because: "Autopilot is intended for use only with a fully attentive driver"! Same thing is said in TFS of this article with the fire truck story.

      If you think about it, a "fully attentive driver" ready to take control at any time needs something like driving-school-instructor skills: the instructor can take control of the car if the student screws up, and that takes more skill than casual driving. It seems to me that taking over on the fly at any moment might be harder than having control in the first place.

      Does driving a Tesla require a driving school instructor license? Maybe it should if it doesn't...

      • It's one thing if Autopilot beeps and says 'please take manual control' a comfortable time before the accident occurs. But what the hell is the point of having an automatic driving system if you have to sit there waiting for that split second between when you realize the autopilot isn't working and when the accident occurs?
        • by viperidaenz ( 2515578 ) on Tuesday January 23, 2018 @07:16PM (#55989897)

          what the hell is the point of having an automatic driving system if you have to sit there waiting for that split second between when you realize the autopilot isn't working and when the accident occurs?

          It's not an automatic driving system. It's just Tesla marketing that implies it is. Their disclaimer says it's not.

          • Re: (Score:3, Informative)

            by Anonymous Coward

            Kind of funny... "The release of Tesla Version 7.1 software continues our improvements to self-driving technology" (from their announcement of version 7.1 of their software). Seems they do in fact call it self driving.

        • by sjames ( 1099 )

          Approaching a fire truck parked on the highway gives you a LOT more than a split second to apply the brakes.

          • by haruchai ( 17472 ) on Tuesday January 23, 2018 @07:34PM (#55990015)

            Approaching a fire truck parked on the highway gives you a LOT more than a split second to apply the brakes.

            I think the Auto Emergency Braking may have kicked in. It doesn't look like the airbags were triggered, and I would have expected a LOT more damage for a collision at 65 mph.

            • by Rei ( 128717 ) on Tuesday January 23, 2018 @07:53PM (#55990141) Homepage

              More to the point, I doubt it will turn out that Autopilot was even on. "Autopilot crashed me" is the best excuse bad drivers have ever been given. And people automatically take it at face value, until the logs get examined.

              • by Nethead ( 1563 )

                People drive where they are looking. He was looking at the lights. That's why so many cops get hit on the side of the road.
                And yeah, that's not a 65 mph into-a-wall crumple. If the bags didn't fire then he wasn't going fast enough to die.

              • by mjwx ( 966435 )

                More to the point, I doubt it will turn out that Autopilot was even on. "Autopilot crashed me" is the best excuse bad drivers have ever been given. And people automatically take it at face value, until the logs get examined.

                The problem is that the system is being examined by Tesla, not a third party. Because of the proprietary nature of the system, we're relying on Tesla to tell us Tesla hasn't fucked up. No matter if you like or dislike Tesla, that is a conflict of interest.

                However that is not an issue in this case. The law is clear, all a judge will do is ask:
                Judge: Did you turn on the "autopilot" system?
                TWAT: Yes.
                Judge: Then you were in command of the vehicle, you are responsible for what happened.

                It doesn't matter

  • so as not to look as stupid as he/she is

    Now the idiot driver wrecked their Tesla and they have to pay for damages to the fire truck

    • by mysidia ( 191772 )

      Now the idiot driver wrecked their Tesla and they have to pay for damages to the fire truck

      If autopilot turns out to have been on, then Tesla will probably wind up having to pay for damages to the fire truck... AND replacing that poor idiot driver's car.

  • I don't own a Tesla, but I know my own car's auto brake system doesn't gracefully slow to a stop unless the speed difference between my car and whatever is in front of me is less than 30 MPH.

    If the delta-v is more than 30 MPH, my car will do a "panic stop" thing to slow the car down, but it'll be too late to avoid a collision.
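
The ~30 mph auto-brake limit described above can be sanity-checked with a quick stopping-distance sketch. All figures here are illustrative assumptions (a maximum braking deceleration of about 0.8 g on dry pavement and a 0.3 s system reaction delay), not any manufacturer's spec:

```python
# Rough sketch: distance needed to shed a given closing speed (delta-v),
# counting travel during the system's reaction delay plus braking distance.
# decel_g and reaction_s are assumed illustrative values, not Tesla specs.

G = 9.81          # m/s^2
MPH_TO_MS = 0.44704

def distance_to_stop(delta_v_mph, decel_g=0.8, reaction_s=0.3):
    """Distance in meters to close a delta_v gap: reaction travel + braking."""
    v = delta_v_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_g * G)

for dv in (20, 30, 45, 65):
    print(f"{dv:3d} mph closing speed -> {distance_to_stop(dv):5.1f} m to stop")
```

Because braking distance grows with the square of the closing speed, a system tuned to react within a following-distance sensing range at 30 mph delta-v can be hopeless at 65 mph delta-v against a stopped truck.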

    • The problem is that people assume that autopilot means they don't need to think of things like delta-v.

      • The problem is that people assume that autopilot means they don't need to think of things like delta-v.

        The problem is that most people think "delta-v" is about Delta planes flying in a "V" formation.

    • I don't own a Tesla, but I know my own car's auto brake system doesn't gracefully slow to a stop unless the speed difference between my car and whatever is in front of me is less than 30 MPH.

      If the delta-v is more than 30 MPH, my car will do a "panic stop" thing to slow the car down, but it'll be too late to avoid a collision.

      Maybe, but it seems the human-driven cars also on the road were able to avoid that collision.

      • That's kind of my point (and Tesla's too): the driver is responsible.

        On an aircraft, the pilot is still responsible for flying the aircraft. If the autopilot flies the plane into a mountain, it's still labeled pilot error. There's no absolving the guy at the controls.

        • Apologies for forgetting about slashdot and UTF-8.

        • That's kind of my point (and Tesla's too): the driver is responsible.

          On an aircraft, the pilot is still responsible for flying the aircraft. If the autopilot flies the plane into a mountain, it's still labeled pilot error. There's no absolving the guy at the controls.

          Yes, we can conclude two things from this event:
          1) The driver is at fault for not paying attention, and can be held legally responsible
          2) Tesla Autopilot is not yet advanced enough to prevent a car from ramming into a parked truck on its own.

          What we can also consider is that by enabling drivers to reduce their driving concentration, and even take hands off the wheel, the Tesla Autopilot feature is a contributing factor to the accident. Not in legal terms, but in pure cause analysis terms.

          • The only question, I think, is whether these autonomous systems compensate for human inability to pay attention *more* than they encourage inattention.

            Used properly, it can be very useful. As it stands, I think of the current technology as being like a horse saying "Even I am not *that* dumb".

            If nothing else, the big problem is this: marketing requires a very short term to describe a feature. That's how 'net neutrality' became the term for a subject that takes several paragraphs to explain adequately.

            Like

  • I'm kind of wondering what the purpose of autopilot is since it's only to be used by a fully attentive driver. What benefit does it add?
    And if no benefit, why is it in the car in the first place, since it obviously acts as a lure for those who aren't fully attentive?

    • Because you have to walk before you can run. When you buy/enable the feature you opt in to Tesla mining the data so we can someday actually have autonomous cars.

    • I'm kind of wondering what the purpose of [blank] is since it's only to be used by [blank]. What benefit does it add?

      The answer, as always, is kinky stuff and/or porn.

    • I'm kind of wondering what the purpose of autopilot is since it's only to be used by a fully attentive driver. What benefit does it add?
      And if no benefit, why is it in the car in the first place, since it obviously acts as a lure for those who aren't fully attentive?

      It allows Tesla to claim they have the first self-driving car.

    • Re: (Score:2, Insightful)

      by Anonymous Coward
      It is meant to be an adjunct to driving and nothing else. If you are too tired, inebriated, or otherwise do not possess the capacity to drive a car under normal conditions, the autopilot system is also unsuited for use. It was designed with the idea that the person behind the wheel was otherwise capable of driving and reacting under normal conditions without the aid of the autopilot system. It comes into its own when you consider things like driving fatigue and issues that come with old age or imperfect health
    • Re:Intended use (Score:4, Insightful)

      by sl3xd ( 111641 ) on Tuesday January 23, 2018 @07:33PM (#55990011) Journal

      What benefit does it add?

      Its benefit comes when the driver treats it like an extra set of eyes that can take corrective action when the driver fraks up. If the driver thinks it'll drive for him, you're right, it's a bad idea. (If you know anything about an aircraft's autopilot, you know it does not mean the flight crew is playing "I spy [youtu.be]" while the plane does all the work).

      * Blind Spots. You wanna change lanes or merge into traffic. So you check your blind spot, and glance away from the road in front for a fraction of a second. Problem is, somebody else just cut you off and stomped on the brakes. (Or somebody cut off the guy in front of you, and he stomped on the brakes.) In either case, the car starts braking before you know there's a problem.

      * Blind Spots part II: We aren't paying as much attention as we think we are. The reality is humans suck at paying attention, we have mountains of data to prove it, and that's why we pay big bucks to watch "Magicians" and "Illusionists" perform.

      * Blind Spots, part III: We're effectively blind for the fraction of a second while our eyes move from one focus point to another. That matters more than you'd think. The "I didn't see it coming" excuse doesn't even require a distraction... just glance at the road sign for a second.

      * Distractions [safestart.com]: A Pennsylvania insurance company found that 62% of accidents were caused by somebody being "lost in thought". Humans suck at paying attention.

      * Another one I didn't appreciate until I got a car with a similar system: The car handles the gas pedal, and I cover the brake pedal with my foot. Wild animals (deer, moose), pets, children, and even adults jump in front of cars all the time. My car (not a Tesla) won't react until something is in my lane, so there's a chance I'll react first.

      • by Nethead ( 1563 )

        * Adaptive Cruise Control: One less task to pay high attention to, and it reduces road rage. Teaches proper following distance.

        * Rear Assisted Braking / Camera: Get sandwiched between two vans when parking and trying to see out? I don't know how I lived without this. No more inching out and hoping.

        * Blind Spot Alert: Nice little yellow light that, if I have my blinker on, beeps at me to let me know there is a car at my 8 o'clock.

        * Lane Keeping Assist: So I was checking out the hottie walking the dog. Giv

    • by Trogre ( 513942 )

      All good questions that Elon Musk should be asking himself about now.

      Given that he has carefully constructed a very successful brand that is practically synonymous with "Electric Car" for many people despite other popular models existing, why is he adding these useless and potentially harmful anti-features that can only damage this brand's reputation?

      I realize there's a strong pro-self-driving-vehicle lobby here on /. but they conveniently ignore the need for and complete lack of software that, despite what

    • by hey! ( 33014 )

      I think our implicit model of what the brain is doing when we talking about being "fully attentive" is too simplistic. Who hasn't found themselves thinking about something while they drive and at least starting to go to the wrong place out of habit? Clearly when you do that you are not being "fully attentive", and yet your brain is constantly monitoring for dangers and obstacles, adjusting the car, even navigating.

      Here's another thing I've noticed. On a long drive, especially in difficult conditions, yo

      • by Nethead ( 1563 )

        Adaptive Cruise Control is something I don't know how I've lived without. My new Outback has it and the five mile drive on a semi-rural two lane road to civilization is now so much better, no more micro road rage. Just set and follow. Best thing since women that bake bread at home. (There is nothing better than a woman that bakes bread for you!)

        That and the mild lane assist, just kind of nudges you back in lane, is all I really want from "auto pilot" in a car. I'm getting older and know that I'm not beco

  • by Anonymous Coward on Tuesday January 23, 2018 @06:39PM (#55989631)

    Hey Tesla, how about you STOP calling it autopilot. It's NOT autopilot. You don't get into the car and say "Ok Tesla, let's go to the pharmacy" and then sit back and enjoy the ride while the car drives you there.

    Call it "Driver Assist" as in the driver is watching what's going on around them like they should and let the car keep itself within the lane and not bump into other cars while driving.

    You set a high expectation with drivers when you keep calling it "Autopilot". Stop it.

    • by b0s0z0ku ( 752509 ) on Tuesday January 23, 2018 @06:41PM (#55989641)
      It's an autopilot like in an aircraft, one that still requires a human pilot to act as a systems manager.
      • by alvinrod ( 889928 ) on Tuesday January 23, 2018 @07:06PM (#55989827)
        But the common man on the street doesn't know that's what autopilot means and is likely to think it means the plane flying itself because they've never been in a cockpit or have any real idea what pilots do beyond vague notions of flying the plane.
        • by jaa101 ( 627731 )

          That's OK; there's no need for them to understand what aircraft autopilots do. It's very clear in the Tesla manual that you can't just let the car drive itself. These incidents will just help publicise that fact more. It will be very interesting to see if this driver tries to claim that he didn't know that he had to continue to pay attention with autopilot.

          Also, looking at the pictures of the crash, there's no way that Tesla hit at 65mph. It will be interesting also to hear how much it had slowed and wh

        • by sl3xd ( 111641 )

          That's as stupid as saying that because a toddler doesn't know what "dead" means, the only reasonable course of action is to call it sleep.

          We all have to grow up sometime.

          • That's as stupid as saying that because a toddler doesn't know what "dead" means, the only reasonable course of action is to call it sleep.

            We all have to grow up sometime.

            That sounds nice but the analogy is terrible. We have decided that "pilot" means a person who operates the flying controls of an aircraft. We have also decided that "autopilot" is short for "automatic pilot".

            I wouldn't put my child on this [imgur.com] and call him a pilot any more than I would call someone driving a Tesla a pilot.

            Meh...

      • It's an autopilot like in an aircraft, that still requires a human pilot(s) to be a systems manager.

        Airplanes fly in the sky. The sky is generally pretty empty.

        Asking the human to start paying attention again because something unexpected is going on is a bit more feasible.

    • Re: (Score:3, Insightful)

      by bgarcia ( 33222 )

      Hey Tesla, how about you STOP calling it autopilot. It's NOT autopilot.

      Hey GM, how about you STOP calling it cruise control. It is NOT cruise control. Call it "Speed Assist".

      Seriously, this is one of the dumber arguments against the name autopilot I've heard. It is almost exactly equivalent to a plane's autopilot system.

  • by RhettLivingston ( 544140 ) on Tuesday January 23, 2018 @07:00PM (#55989785) Journal

    The tweet is on what appears to be an official twitter account. But, it claims the vehicle was traveling at 65 mph when it struck???

    Firemen with any experience at all have usually worked a few highway crashes. Anyone with a clue as to what striking a near-immovable object at 65 mph does to a modern vehicle with all sorts of built-in crumple zones can tell at a glance that this collision occurred at a far slower speed, as demonstrated by the mostly superficial damage to the truck. I'd be surprised if it was even 40 mph. It does not even appear that any of the Tesla's glass cracked, and the damage to the truck appears to be at a surface level. I wonder if the airbags deployed?

    As public officials, these folks need to be much more responsible in what they tweet. Hopefully, responsible officials will correct the record and at least chastise whoever posted the tweet after reviewing the crash data.
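The crumple-zone argument above boils down to kinetic energy scaling with the square of speed, so a 40 mph impact carries well under half the energy of a 65 mph one. A quick sketch (the ~2100 kg mass is an assumed rough figure for a Model S, for illustration only):

```python
# Back-of-the-envelope: crash energy scales with v^2, so the damage expected
# at 65 mph dwarfs that at 40 mph. Mass is an assumed illustrative figure.

MPH_TO_MS = 0.44704

def kinetic_energy_kj(mass_kg, speed_mph):
    """Kinetic energy in kilojoules at the given speed."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v / 1000.0

e65 = kinetic_energy_kj(2100, 65)
e40 = kinetic_energy_kj(2100, 40)
print(f"65 mph: {e65:.0f} kJ, 40 mph: {e40:.0f} kJ, ratio {e65 / e40:.2f}")
```

The ratio is (65/40)^2 ≈ 2.64 regardless of the vehicle's mass, which is why the superficial damage in the photos argues for a much lower impact speed.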

  • by oldgraybeard ( 2939809 ) on Tuesday January 23, 2018 @07:02PM (#55989799)
    While using Tesla Autopilot the driver is supposed to keep hands over the wheel and stay aware of the complete environment around them.

    In order to take instantaneous control if needed ;) Heck if that is the case you may as well be driving yourself ;)

    Just my 2 cents ;)
  • by oldgraybeard ( 2939809 ) on Tuesday January 23, 2018 @07:13PM (#55989881)
    OK, so you have a large red stopped vehicle, and the Tesla's sensor system failed so badly that it didn't detect it!

    Pretty bad oops, in their self driving code! And their product in general!

    Let me see, what was the first Tesla death? The Tesla mistook the white side of a semi trailer for the sky? ;) So was there a reddish sky ;)

    Just my 2 cents ;)
    • by Nethead ( 1563 )

      My Outback with EyeSight would have avoided both. Of course, it also beeps at you if you take your hand off the wheel for more than about 10 seconds. Because it's, you know, a driver-assistance package.

    • Actually something like this reminds me of an even earlier Tesla accident where the investigation went something like:
      Driver: "It was on autopilot!"
      Investigator: "Tesla, was it on autopilot?"
      Tesla: "No."
      Driver: "Ok I lied, wait, how did you even know about that? Help help I'm being oppressed".

  • It should be: A person driving a Tesla plows into a fire truck. The person, not the car. The car was not at fault, it just happened to sustain the damage.
  • Autopilot is intended for use only with a fully attentive driver

    Then why the hell are they calling it Autopilot?

  • I would have thought they would have worked all the bugs out of Autopilot by now. After all they've been working on it for 40+ years as seen in this documentary clip:

    https://www.youtube.com/watch?... [youtube.com]

  • I don't think it can be realistically expected that drivers running autopilot will immediately react to poor choices being made by the car. Continuously monitoring an automated system that works well almost all the time is massively boring. There's just no way that anyone but the most OCD is going to continuously maintain the level of attention that they would deliver if they were actually driving.

  • by 140Mandak262Jamuna ( 970587 ) on Tuesday January 23, 2018 @08:38PM (#55990401) Journal
    There is already a large number of drivers who haven't had to steer their vehicles for the last 150 years.

    The locomotive engineers!

    All they have to do is to watch the speed, grade and signals.

    And the number one problem for them? Boredom. They fall asleep. They nod off. Railroads have been adding more and more devices to check the alertness of the drivers. The dead man's treadle is what, a hundred years old? Now with computers they are thinking of creating a challenge-and-response to avoid them responding mechanically.

    If the autopilot is going to steer and the documentation says, "driver must be fully attentive", it is time they add deadman's treadle and a host of devices to make sure there is a fully attentive driver there.
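The escalation the comment describes (a dead man's treadle plus challenge-and-response) can be sketched as a simple timer state machine. All thresholds and event names here are made up for illustration; real alertness systems are far richer:

```python
# Minimal sketch of a dead-man-style attention monitor: if the driver gives
# no input for challenge_after seconds, demand a response; if still nothing
# by escalate_after seconds, escalate (e.g. bring the vehicle to a safe stop).
# Thresholds and event kinds are illustrative assumptions.

def attention_monitor(events, challenge_after=10, escalate_after=15):
    """events: list of (t_seconds, kind) driver inputs, e.g. ('hands_on',).
    Returns the monitor's state at each whole second up to the last event."""
    events = sorted(events)
    states = []
    last_input = 0.0
    horizon = int(max(t for t, _ in events)) + 1
    idx = 0
    for t in range(horizon):
        # Consume any driver inputs that have occurred by time t.
        while idx < len(events) and events[idx][0] <= t:
            last_input = events[idx][0]
            idx += 1
        idle = t - last_input
        if idle >= escalate_after:
            states.append('disengage')   # stop the vehicle safely
        elif idle >= challenge_after:
            states.append('challenge')   # beep and demand a response
        else:
            states.append('ok')
    return states

# Driver touches the wheel at t=0, answers a challenge at t=12, then nothing
# until t=30 -- so the monitor challenges, resets, then eventually escalates.
print(attention_monitor([(0, 'hands_on'), (12, 'response'), (30, 'hands_on')]))
```

The point of the challenge step is exactly what the comment raises: a plain treadle can be held down mechanically by a dozing driver, while a randomized prompt requires actual attention to answer.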

    • I own a 1960 Chrysler with one of the very first cruise controls. It pushes up from under the gas pedal. You have to keep your foot on it.

      That's a conservative design. WTF happened? Believing your own BS is a trap.

  • I was wondering if at lower speeds Tesla Autopilot is able to make better decisions...
