
Driver Killed In a Tesla Crash Using Autopilot Ignored At Least 7 Safety Warnings (usatoday.com) 516

An anonymous reader quotes a report from USA Today: U.S. investigators said a driver who was killed while using Tesla's partially self-driving car ignored repeated warnings to put his hands on the wheel. In a 538-page report providing new details of the May 2016 highway crash in Florida that killed Ohio resident Joshua Brown, the National Transportation Safety Board described the scene of the grisly incident and the minutes leading up to it. The agency, which opened an investigation to explore the possibility that Tesla's Autopilot system was faulty, said it had drawn "no conclusions about how or why the crash occurred." The NTSB report appears to deliver no conflicting information. The agency said the driver was traveling at 74 miles per hour, above the 65 mph limit on the road, when he collided with the truck. The driver used the vehicle's self-driving system for 37.5 minutes of the 41 minutes of his trip, according to NTSB. During the time the self-driving system was activated, he had his hands on the wheel for a total of only about half a minute, investigators concluded. NTSB said the driver received seven visual warnings on the instrument panel, which displayed "Hold Steering Wheel," followed by six audible warnings.
  • Simple question (Score:4, Insightful)

    by Anonymous Coward on Tuesday June 20, 2017 @06:05PM (#54656933)

    Why would the car continue to operate for 37.5 minutes of the trip if the driver didn't have his hands on the steering wheel? If that's a requirement, why didn't the car just pull over and shut off? It seems like Tesla failed to implement some common sense safety protocols here.

    • Re:Simple question (Score:5, Informative)

      by hawguy ( 1600213 ) on Tuesday June 20, 2017 @06:11PM (#54656989)

      Why would the car continue to operate for 37.5 minutes of the trip if the driver didn't have his hands on the steering wheel? If that's a requirement, why didn't the car just pull over and shut off? It seems like Tesla failed to implement some common sense safety protocols here.

      Because they trusted that the owner of an $80,000 car had at least some minimal intelligence, and that even if the driver had blind trust in the car, he would listen when the car says "put your hands on the wheel and pay attention".

      Yet this driver has demonstrated that people are about as dumb as you think they can be, so now they've implemented a 3 strikes policy that disables autopilot after 3 reminders.
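
(As a rough illustration of how such a three-strikes, hands-on-wheel policy can be enforced, here is a minimal sketch in Python. The thresholds, names, and lock-out behavior are assumptions for illustration, not Tesla's actual implementation.)

```python
# Hypothetical sketch of a "three strikes" hands-on-wheel policy.
# Thresholds and behavior are assumptions, not Tesla's actual logic.

HANDS_OFF_WARNING_S = 60      # warn after this many seconds hands-off (assumed)
MAX_IGNORED_WARNINGS = 3      # lock out Autopilot after this many ignored warnings

class HandsOnWheelMonitor:
    def __init__(self):
        self.hands_off_time = 0.0
        self.ignored_warnings = 0
        self.locked_out = False

    def update(self, dt, hands_on_wheel):
        """Call once per control cycle; returns an action string."""
        if self.locked_out:
            return "AUTOPILOT_UNAVAILABLE"        # e.g. until the car is parked
        if hands_on_wheel:
            self.hands_off_time = 0.0
            return "OK"
        self.hands_off_time += dt
        if self.hands_off_time >= HANDS_OFF_WARNING_S:
            self.hands_off_time = 0.0
            self.ignored_warnings += 1
            if self.ignored_warnings >= MAX_IGNORED_WARNINGS:
                self.locked_out = True
                return "DISENGAGE_AND_LOCK_OUT"   # hand control back, refuse re-engagement
            return "WARN_HOLD_STEERING_WHEEL"
        return "OK"
```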

      • by Jhon ( 241832 )

        "Yet this driver has demonstrated that people are about as dumb as you think they can be, so now they've implemented a 3 strikes policy that disabled autopilot after 3 reminders."

        Maybe the driver was asleep? Deep sleep? Was that ruled out?

      • What if the driver was incapacitated and couldn't put his hands on the wheel?
      • Re:Simple question (Score:5, Insightful)

        by quantaman ( 517394 ) on Tuesday June 20, 2017 @06:43PM (#54657201)

        Why would the car continue to operate for 37.5 minutes of the trip if the driver didn't have his hands on the steering wheel? If that's a requirement, why didn't the car just pull over and shut off? It seems like Tesla failed to implement some common sense safety protocols here.

        Because they trusted that the owner of an $80,000 car had at least some minimal intelligence, and that even if the driver had blind trust in the car, he would listen when the car says "put your hands on the wheel and pay attention".

        Yet this driver has demonstrated that people are about as dumb as you think they can be, so now they've implemented a 3 strikes policy that disables autopilot after 3 reminders.

        For the first few days people will be extremely cautious letting the autopilot do anything.

        For the first few weeks they'll give it more leeway, but be very attuned to any warnings it gives.

        After a few months, if they haven't had any real scares, they'll assume the auto-pilot knows what it's doing and generally ignore warnings.

        Some people will be more cautious, but as a software developer this is exactly what I expect to happen with a significant portion of people. Everyone knows the right thing to do: back up our data rigorously, always use good unique passwords, follow the proper procedures, etc. But that's not how people work. If it's not part of a routine, and it doesn't have an immediate payoff, then people won't do it.

        Give people a car that can self-drive in some situations and they will inevitably let it self-drive in every situation they can.

        • Re:Simple question (Score:5, Interesting)

          by Mr D from 63 ( 3395377 ) on Tuesday June 20, 2017 @07:39PM (#54657591)

          For the first few days people will be extremely cautious letting the autopilot do anything.

          For the first few weeks they'll give it more leeway, but be very attuned to any warnings it gives.

          After a few months, if they haven't had any real scares, they'll assume the auto-pilot knows what it's doing and generally ignore warnings.

          Some people will be more cautious, but as a software developer this is exactly what I expect to happen with a significant portion of people. Everyone knows the right thing to do: back up our data rigorously, always use good unique passwords, follow the proper procedures, etc. But that's not how people work. If it's not part of a routine, and it doesn't have an immediate payoff, then people won't do it.

          Give people a car that can self-drive in some situations and they will inevitably let it self-drive in every situation they can.

          It's human nature. Bad driving habits reinforced over time. A certain percentage of people will grow unsafely confident and increase their risk. Just like texting while driving.

      • by OrangeTide ( 124937 ) on Tuesday June 20, 2017 @06:50PM (#54657257) Homepage Journal

        Because they trusted that the owner of an $80,000 car had at least some minimal intelligence and even if the driver had blind trust in the car, that when the car says "put your hands on the wheel and pay attention", that the driver would listen.

        What if I go unconscious in my $80,000 car? Perhaps I have a heart attack, stroke, or simply had an unexplained fainting spell. So the car just beeps at me uselessly until it drives into something? Might as well use a traditional car, set it to cruise at 75 mph and take a nap because the end result will be the same.

        • So why is it Tesla's fault it can't handle something no other car can handle?
          • Because it named its feature "Autopilot", which means it's supposed to be able to pilot itself.

            Cue the idiots who will come in and say airline autopilot is incredibly limited and that people should similarly expect Tesla's Autopilot features to be extremely limited as well. The term autopilot has a meaning. It doesn't matter if you think you should apply airline autopilot features (which are much, much more involved than what Tesla offers) to what people should expect from a feature named "Autopilot". Wh

            • Because it named its feature "Autopilot", which means it's supposed to be able to pilot itself.

              I'll make you a wager. You get on any airplane, turn on autopilot and walk away. Come talk to me after the plane lands itself when it's low on fuel and collect a million dollars.

          • Other cars (like the Honda lane assist/adaptive cruise) warn you and then FORCE you to not be a fucking idiot, by turning the system off before you get the stupid idea to completely rely on it. All that this Tesla system does differently than that is... give you a nice thick layer of overconfidence until the moment that the system completely fails you and careens you into a semi. Tesla deserves every ounce of bad press from this, they designed a system that failed in just the right way as to carry a man t

        • What if I go unconscious in my $80,000 car? Perhaps I have a heart attack, stroke, or simply had an unexplained fainting spell. So the car just beeps at me uselessly until it drives into something? Might as well use a traditional car, set it to cruise at 75 mph and take a nap because the end result will be the same.

          Try it, and let me know how you get on when you meet the first corner. ZOMG... what happens if the same thing happens to you when you cross a street? You could get run over and be dead twice.

        • Car price is irrelevant.
          You can go unconscious in a Bugatti Veyron - which would get you killed immediately.
          At least with a Tesla, you have a much higher chance of actually arriving at your destination than if you went unconscious in another car.

      • that the prompts were just there to make Tesla blameless and they implemented the policy after realizing that you can't be blameless when you kill people who can afford an $80k sportscar.
    • Re:Simple question (Score:5, Insightful)

      by wonkey_monkey ( 2592601 ) on Tuesday June 20, 2017 @06:30PM (#54657125) Homepage

      If that's a requirement, why didn't the car just pull over and shut off?

      Because the car isn't smart enough to do that. It can keep you between the lines on the road; it can't take you out of the lanes and park you up. That's actually a harder thing to do.

      • And they say we are so close to fully autonomous driving
        • by lgw ( 121541 )

          We are quite close. Just ignore the Tesla hype and look to serious companies that will make their big marketing splash after they have a working product. Volvo plans self-driving cars (of the sort that pull over safely when something's wrong) by 2020. It will be interesting to see if they make it, but they're gathering test mileage already.

      • If that's a requirement, why didn't the car just pull over and shut off?

        Because the car isn't smart enough to do that. It can keep you between the lines on the road; it can't take you out of the lanes and park you up. That's actually a harder thing to do.

        Why didn't the car put its hazards on and let go of the accelerator until it slowed to a crawl while staying between the lines on the road? Yes, there's a possibility he could get rear-ended by some yahoo behind him who wasn't paying attention, but that's better than plowing into a truck in front of him.

        Basically, the failure mode for autopilot doesn't have to be "pull over and park safely".
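
(A minimal sketch of that kind of degraded failure mode: stay in the lane, put the hazards on, and bleed off speed. All thresholds and the deceleration rate are made-up numbers, not any manufacturer's actual behavior.)

```python
# Hypothetical in-lane fallback when the driver stays unresponsive.
# Returns the target speed the lane-keeping controller should track,
# plus whether the hazard lights should be on.

CRAWL_SPEED_MPH = 15

def fallback_target(seconds_unresponsive, cruise_speed_mph):
    if seconds_unresponsive < 30:
        # still in the warning phase: keep cruising, keep nagging the driver
        return cruise_speed_mph, False
    if seconds_unresponsive < 180:
        # hazards on, coast down toward a crawl as if the foot left the gas
        decel_mph_per_s = 0.5
        target = cruise_speed_mph - decel_mph_per_s * (seconds_unresponsive - 30)
        return max(target, CRAWL_SPEED_MPH), True
    # still no response after several minutes: come to a controlled stop in-lane
    return 0, True

# e.g. 74 mph cruise, driver unresponsive for 2 minutes -> crawl down with hazards on
print(fallback_target(120, 74))   # (29.0, True)
```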

    • "The driver used the vehicle's self-driving system for 37.5 minutes of the 41 minutes of his trip, according to NTSB. During the time the self-driving system was activated, he had his hands on the wheel for a total of only about half a minute, investigators concluded.

      It's fairly common to use a cruise control-type system for extended periods. As the article stated, he only had his hands on the wheel for a short period. The system seems to have alerted him a good amount in such a short time. Sadly, given that someone lost their life, I think this is a big case of user error.

      • I think this is a big case of user error.

        It usually is. PJ O'Rourke wrote a great piece about the NTSB in (I think) "Parliament of Whores".

        They were investigating unexplained crashes of Volvos. It turns out the type of person who was buying Volvos at the time (in the US, anyway) was often a poor driver who stamped on the gas instead of the brake.

    • If the car is not smart enough to see a truck by itself, it may not be smart enough to find a safe space to pull over and shut itself off.

      In any case, I agree with you. Maintaining the 74 MPH speed by that point is insane. At the very least, it should begin to decelerate and flash its warning lights to show other cars that something is wrong.

      • If the car is not smart enough to see a truck by itself,

        This may also not have been a problem of smarts, but a problem of the sensors not registering the truck in this peculiar situation.
        (e.g.: unlike other cars, Teslas only have a forward-facing radar and a 2D video camera, no LIDAR, nor a 3D pair of stereo cameras.)

        it may not be smart enough to find a safe space to pull over and shut itself off.

        It's definitely not as much a problem of smarts as it is of sensors:
        whereas some other cars feature rear-facing video cameras under the side mirrors (that can see if there's a car in the next lane much further back),
        Teslas are among the cars that exclusively

    • Poor naive A/C, I see two problems here. One, what if the driver was mentally unbalanced? Two, if you think your idea is so scary average, then why not patent it?

      On consideration, Florida would make a good test site. It's common for drivers to be cited for going too slow, and there are 'gators on the side of the road that are as long as the car you're driving.
    • It seems like Tesla failed to implement some common sense safety protocols here.

      Like going 74 mph on a road with a 65 mph limit? If the car is not even going to observe basic traffic laws the game is over before you start.

      • Rich dumbass ignores safety warnings whilst speeding and it is the manufacturer's fault?
        Personal responsibility does not apply?

        I notice that his family is not suing, and in the litigious USA that almost means they agree.
      • Tesla updated the Autopilot feature to have a five-mile-over-the-limit maximum after this crash.

        The US is not like Europe vis-a-vis speed limits. Driving at or under the speed limit here is an uncommon and sometimes dangerous behavior. That's really stupid, but that's the way it has been ever since they set the interstate speed limits nationally at 55 mph.

        (Yes, I know that speed limits are higher than 55mph now, but that's when it started.)

    • No, the driver failed to implement some common sense safety protocols. Stop believing manufacturers have to take responsibility for end users' unwillingness to take any responsibility for themselves.
    • If that's a requirement, why didn't the car just pull over and shut off?

      The main reason is that in this exact type of car (Tesla Model S) pulling over - specifically, the "change lane" part of it - can't be automated 100% yet.
      The car senses cars in neighboring lanes (BLIS - Blind Spot Information System) using its ultrasound sonars. Basically a type of souped-up parking assistance with a little more range.
      This is good at detecting whether the space right next to the Tesla is free, but it isn't that good at detecting incoming cars in the lane.
      (i.e.: if the Tesla only trust

    • Because Autopilot is nothing more than an expensive party trick.
    • That was my post :D

      On a related note, if I understood the news on TV correctly: Germany today passed laws to allow self-driving cars on the roads.

    • Re:Simple question (Score:5, Insightful)

      by Strider- ( 39683 ) on Tuesday June 20, 2017 @08:13PM (#54657759)

      Why would the car continue to operate for 37.5 minutes of the trip if the driver didn't have his hands on the steering wheel? If that's a requirement, why didn't the car just pull over and shut off? It seems like Tesla failed to implement some common sense safety protocols here.

      The real problem is that Tesla's "Autopilot" is the worst possible solution to the problem. One of the realities of the human brain is that shifting your attention is hard... To put it in computational terms, the context switch is expensive. Even when the car is doing the driving, it theoretically requires the driver to be auditing it, and be paying attention to their surroundings in case things go sideways. Concentrating on something that you're not actively participating in can be quite difficult, as we already know. If the driver is actually, well, driving, they are already hopefully paying attention to the road and their surroundings.

      Until we get self-driving cars that can run without even having a steering wheel, it is a bad idea to have the computer half in control. The better solution is to work the other way. The driver does most of the driving, and the computer only takes over in emergency/collision type situations (avoiding that box on the highway, the idiot merging into your blind spot, etc...)
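
(That "computer as emergency backstop" design is essentially what automatic emergency braking does today. A toy sketch of the core check, with assumed thresholds; real systems are far more involved.)

```python
# Toy time-to-collision check for an "intervene only in emergencies" design.
# Sensor inputs and thresholds are assumptions for illustration.

TTC_BRAKE_S = 1.5    # brake hard if collision is this close (assumed)
TTC_WARN_S = 3.0     # warn the driver a bit earlier (assumed)

def emergency_action(range_m, closing_speed_mps):
    """Decide whether to stay out of the way, warn, or brake."""
    if closing_speed_mps <= 0:            # gap is opening, nothing to do
        return "NONE"
    ttc = range_m / closing_speed_mps     # seconds until impact at current rates
    if ttc < TTC_BRAKE_S:
        return "EMERGENCY_BRAKE"
    if ttc < TTC_WARN_S:
        return "FORWARD_COLLISION_WARNING"
    return "NONE"

# e.g. 74 mph is about 33 m/s; closing on a stopped truck 40 m ahead gives ttc of roughly 1.2 s
print(emergency_action(40, 33))   # EMERGENCY_BRAKE
```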

  • Two Things (Score:5, Insightful)

    by Mr D from 63 ( 3395377 ) on Tuesday June 20, 2017 @06:05PM (#54656943)
    What we know from this incident:

    1) The driver was responsible for the accident because he didn't maintain control
    2) Tesla Autopilot was not good enough on its own to prevent the car from driving into the truck.
    • Re: (Score:2, Insightful)

      > 2) Tesla Autopilot was not good enough on its own to prevent the car from driving into the truck.

      ^ THIS.

      That is the more troubling question that needs to be asked and answered. Why weren't there more fail safes such as ... ?

      * Why didn't the car slow down if you have your hands off the wheel for more than 5 minutes?
      * Why didn't the car's sensor detect the impending crash?
      * Why didn't the car pull over to the side of the road after 15 minutes of hands-free driving?

        There are some human interface issues that remain a challenge for these 'partially autonomous' driving modes. This article mentions some of the challenges with the handoff between autonomous steering and manual (not Tesla specific):

        https://www.washingtonpost.com... [washingtonpost.com]

        But while autonomous or semiautonomous driving technology could help reduce collisions in general, questions about how and when to draw the line between manual and autonomous mode have yet to be fully resolved by engineers and researchers. Last year, for example, a study by Stanford University found that drivers often had trouble taking the wheel again after letting a computer drive, even momentarily. Drivers commonly over- or undercorrected with the steering wheel, even when they knew the handoff was coming, the research found. The effects were more pronounced if driving conditions had changed substantially since the last time the drivers were in control, according to a Stanford release.

        • by sl3xd ( 111641 )

          There are some human interface issues that remain a challenge for these 'partially autonomous' driving modes. This article mentions some of the challenges with the handoff between autonomous steering and manual (not Tesla specific).

          This.

          I don't have a Tesla, but my car has its own less-capable partially autonomous features. It's kind of hard to describe, other than there are corner cases which aren't handled very smoothly.

          For example: a car directly to the side of me starts to drift into my lane - I notice, and start to move to avoid him, but the car decides I'm drifting out of my lane and applies force to steer me back into the center of the lane -- and into the other car.

          In that situation the car is applying its own steering force.

      • Re: Two Things (Score:5, Informative)

        by guruevi ( 827432 ) on Tuesday June 20, 2017 @06:27PM (#54657097)

        There are various assumptions to make before pulling over the side of the road which may or may not be safer than simply having the car continue. You have to make sure there is an unobstructed emergency strip and you're not just careening the vehicle down a cliff. If your car makes the decision to go on a shoulder and something happens (or the shoulder doesn't exist), at that point the liability shifts because you've gone from passive "cruise control with intelligent lane following" to active intervention in a situation.

        • Yes! How are slashdotters missing this?

          There's a huge difference between a TOOL ("stay in the lanes", essentially a next-gen cruise control), and an autonomous machine capable of making DECISIONS that override the driver.

          A Tesla is NOT a self-driving car. It's not autonomous. It's a CAR that has a next-gen cruise control. That's it.

          There's no difference between falling asleep with the cruise control on in your Lexus and eventually driving off the road, and falling asleep with the Tesla next-gen cruis

        * Why didn't the car slow down if you have your hands off the wheel for more than 5 minutes?

        Because that would kill more people than what it does now.

        Why didn't the car's sensor detect the impending crash?

        Because a white truck sitting completely across the road is something the system probably interpreted as something like fog or clouds. There are not supposed to be big white boxes totally across the road... which is also why it warns you after too long to at least hold the wheel.

        Why didn't the car pull over the

        • by Ichijo ( 607641 )

          Why didn't the car's sensor detect the impending crash?

          Because a white truck sitting completely across the road is something the system probably interpreted as something like fog or clouds.

          You are required by law to slow down when visibility is poor. So the question remains, why didn't it slow down?

      • Re:Two Things (Score:5, Insightful)

        by wonkey_monkey ( 2592601 ) on Tuesday June 20, 2017 @06:36PM (#54657155) Homepage

        * Why didn't the car slow down if you have your hands off the wheel for more than 5 minutes?

        Slow down how much? And if nothing changes, then what? Come to a stop in the middle of the road?

        * Why didn't the car pull over to the side of the road after 15 minutes of hands-free driving?

        Because that's a much harder thing to do than simply keeping a car between standardised lane markings and away from relatively slow-moving traffic (not that it fully succeeded in the latter in this case).

        • Re:Two Things (Score:4, Insightful)

          by Trogre ( 513942 ) on Tuesday June 20, 2017 @08:51PM (#54657929) Homepage

          Slow down how much? And if nothing changes, then what? Come to a stop in the middle of the road?

          Of course, yes.

          If you ever find yourself in a situation where you are no longer able to safely drive your vehicle for any reason your first course of action is to stop it from moving, and of course activate your hazard lights.

          If the AI was not able to pull over then the correct action would be for it to just stop.

      • I wish I could remember enough details to do a proper search for a proper citation but I recall seeing on TV something about an airplane crash because the pilots ignored one too many warnings. The story goes something like this....

        A couple of pilots were flying a commercial jet with only one other crew member aboard. I'm not sure why the plane was empty but it gave the pilots some freedom to pull a stunt for bragging rights. This happened in the 1960s I believe, it was fairly early in the history of comm

      * Why didn't the car slow down if you have your hands off the wheel for more than 5 minutes?

        Slowing down brings a slight risk of getting rear-ended, especially in the middle of otherwise fluid traffic.
        That's why I suspect that the engineers at Tesla erred on the side of not stopping.

        * Why didn't the car's sensor detect the impending crash?

        Teslas in particular lack LIDAR and/or a 3D pair of stereo video cameras.
        They only have a forward-facing 2D video camera and a radar.
        These might get slightly confused by distance and size, and might confuse the truck with a street sign much further away.
        (Highway signs are rather reflective and might seem closer on the radar tha
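
(To make the sign-vs-truck ambiguity concrete: if the range estimate is right, the elevation angle to the object's bottom edge tells you whether you can drive under it. A toy sketch with made-up mounting heights and thresholds; this is not a description of how Tesla's system actually works.)

```python
import math

# Toy disambiguation between an overhead sign and an obstacle in the lane,
# using a range estimate plus the camera's elevation angle to the object's bottom edge.
# Mounting height and clearance threshold are assumptions for illustration.

CAMERA_HEIGHT_M = 1.3   # camera height above the road (assumed)
CLEARANCE_M = 4.5       # anything whose bottom edge is higher than this can be driven under

def is_overhead_structure(range_m, elevation_deg):
    """True if the object's bottom edge is high enough to pass under."""
    bottom_height = CAMERA_HEIGHT_M + range_m * math.tan(math.radians(elevation_deg))
    return bottom_height > CLEARANCE_M

# A trailer 40 m ahead whose bottom edge sits ~1.2 m off the road appears at
# roughly -0.14 degrees elevation: not overhead, so it is brake-worthy.
print(is_overhead_structure(40, -0.14))   # False: treat as obstacle
# A sign gantry 150 m away with its lower edge at ~5.5 m appears at roughly +1.6 degrees.
print(is_overhead_structure(150, 1.6))    # True: ignore
```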

  • but what could have been done to prevent such a tragedy?
    • Re:but... /s (Score:5, Insightful)

      by KGIII ( 973947 ) <uninvolved@outlook.com> on Tuesday June 20, 2017 @06:10PM (#54656979) Journal

      Nothing. Not a damned thing. Someone this stupid was going to take themselves out of the gene pool sooner or later. I'm sometimes baffled that we've managed to keep ourselves from going extinct.

    • by Guspaz ( 556486 )

      - Semi trailers could have been mandated to have crash bars under their trailer such that a vehicle will react more like a head-on impact instead of having its top sliced off
      - Autopilot could have been improved to handle this scenario (this was implemented by Tesla after the crash)
      - The driver could have actually paid attention and maintained control of the vehicle
      - The vehicle could have been more aggressive in refusing to continue on autopilot without a driver response, such as slowly decelera

    • - add a forward-facing LIDAR and/or a 3D pair of stereo cameras (instead of a single 2D camera), to supplement the current radar and 2D camera.
      This would help the car disambiguate between a truck just in front and a highway sign much further away in the same direction (both can look similar to the current set of sensors).

      - add backward-facing cameras under the rear-view mirror:
      this would help the car change lanes safely.
      Means that it could cover the scenario "user doesn't touch the wheel for the past 30min despite alarms" =>

  • by dlleigh ( 313922 ) on Tuesday June 20, 2017 @06:07PM (#54656953)

    Of course he was going to ignore a warning that said, "Hold steering wheel."

    Instead, the car should have said:

    "What the hell are you doing with your hands off the wheel, you idiot???! Are you trying to crash? Do you want to die? Do you want to make your kids orphans?"

    The warnings could get increasingly forceful as the car complains that its own safety is being jeopardized.

    "I don't want to go to a body shop. They use hammers! Kill yourself if you want, but leave me out of it."

    The accident was therefore Tesla's fault.

    • by ledow ( 319597 )

      Fuck his kids. They're already doomed.

      I'm much more concerned about other people who aren't doing stupid things.

  • You will never know, because you can't ask the guy the car killed.
  • by gurps_npc ( 621217 ) on Tuesday June 20, 2017 @06:24PM (#54657079) Homepage

    With current technology, people are trained to ignore meaningless errors.

    Proper error handling for anything important requires you to take action, especially if the error repeats.

    That is, if they want people to pay attention to a "keep hands on wheel" warning, the speed should drop significantly. Not as if the brake was applied, but as if the foot was taken off the gas (even if they tried to floor it). Oh, and the brake light should flash to let people behind know you are slowing, even though no brake is applied.
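
(A minimal sketch of that idea: every ignored warning lowers the speed cap and flashes the brake lights, regardless of what the driver does with the accelerator. The numbers are invented for illustration.)

```python
# Hypothetical enforcement of a "keep hands on wheel" warning:
# each ignored warning lowers the allowed speed, as if the foot came off
# the gas, even if the driver floors the accelerator.

def enforce_warning(ignored_warnings, requested_speed_mph, cruise_speed_mph):
    """Return (allowed_speed_mph, flash_brake_lights)."""
    if ignored_warnings == 0:
        return min(requested_speed_mph, cruise_speed_mph), False
    speed_cap = max(cruise_speed_mph - 15 * ignored_warnings, 25)
    return min(requested_speed_mph, speed_cap), True

# e.g. cruising at 74 mph with 2 ignored warnings: capped to 44 mph, brake lights flashing
print(enforce_warning(2, 74, 74))   # (44, True)
```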

  • by s1d3track3D ( 1504503 ) on Tuesday June 20, 2017 @06:24PM (#54657083)

    Tesla's partially self-driving car

    A partially self-driving car is like being sorta pregnant: the car is either self-driving or not, any grey area = not.

    This seems like Tesla getting the public to do QA for them until they have a fully self-driving car; it's clear the public does not know what "partially" means...

    • by chrisautrey ( 1960196 ) on Tuesday June 20, 2017 @06:51PM (#54657263)

      the car is either self-driving or not

      Tesla did not do themselves any favors with the name. 'Enhanced cruise control' or something similar as a name would have gone a long way.

      • by Tom ( 822 )

        I've driven a bunch of cars with adaptive cruise control, automatic lane keeping and such things. Top-of-the-line cars, current models, like the new BMW 5.

        None of them would be able to drive anywhere near the 30+ minutes of the Tesla by themselves. What Tesla has there is much more than enhanced cruise control.

    • by ccguy ( 1116865 )

      A partially self-driving car is like being sorta pregnant: the car is either self-driving or not, any grey area = not.

      You can actually be "sorta pregnant", such as having the fertilized egg stuck in the fallopian tubes (this will give a positive in all tests as the pregnancy hormones are produced, even if the pregnancy won't get anywhere and in fact you need to remove it immediately to prevent major issues).

      Maybe things are not binary even when they seem to be.

    • by Hentes ( 2461350 )

      I take it you prefer pump braking to ABS, then. Automation is not magic, completely self-driving vehicles are not going to appear from nothing in a day. Instead, manufacturers keep automating the mundane parts of driving, letting the human focus on the parts that actually need his attention.

  • "The driver used the vehicle's self-driving system for 37.5 minutes of the 41 minutes of his trip"
    "Driver Killed In a Tesla Crash Using Autopilot Ignored At Least 7 Safety Warnings"

    If your autopilot misses a truck, how can you say for sure the driver didn't die 30 minutes before the crash?

    https://en.wikipedia.org/wiki/... [wikipedia.org]
  • I wonder how many of these messages the driver had heard and learned to consider "false alerts"?
  • Buying a car with autopilot seems like buying a computer that can only work with three digit numbers.
  • Too soon (Score:3, Interesting)

    by duke_cheetah2003 ( 862933 ) on Tuesday June 20, 2017 @07:54PM (#54657681) Homepage

    I think this AI-assisted driving should be taken out of the cars. This technology is not ready for the real world. It's just too tempting to turn this gadget on and doze off or engage in some other totally stupid behavior, thinking the car can deal with driving on its own.

    Having this incomplete technology in service could hamper efforts to convince government entities that self-driving cars can and will be safe, when an immature technology is turning out to be not so safe. And trust me, regulators are looking at this and saying to themselves, "If this can happen, this technology is not safe."

    Trust me, I want a self-driving car like yesterday, but the technology needs to mature; more testing, in more situations, needs to be done before this is ready for the end user who's going to take a nap while his/her car drives itself.

  • by 140Mandak262Jamuna ( 970587 ) on Tuesday June 20, 2017 @08:08PM (#54657733) Journal
    It is basic safety design, implemented even in steam engines. Trains are a good analog of autopilot because locomotive drivers don't have to steer; they only control the speed manually. The biggest safety issue in locomotives is boredom: eyes glaze over looking at the endless track, drivers ignore signals, and their reaction time becomes very slow. The dead man's treadle was introduced to make sure the loco drivers tread on it once every so many minutes, else the train stops. Modern diesels constantly issue alerts, and if the driver does not react, the train stops. Japan's bullet trains have their own methods to keep drivers alert: the drivers always point at whatever they should be looking at, look at it, and say it aloud.

    Given the history of how to handle inattentive drivers of machines that require very infrequent action, they should have designed the autopilot with random reaction-testing alerts and challenges.
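
(A sketch of what a railway-style alertness check could look like in a car: challenge the driver at random intervals and begin a controlled stop if there is no response. The intervals, names, and stop action are assumptions for illustration, not a description of any real system.)

```python
import random

# Hypothetical "dead man's switch" adapted as random alertness challenges.
# All intervals and actions are made-up numbers for illustration.

class AlertnessChallenger:
    def __init__(self, min_gap_s=120, max_gap_s=300, response_window_s=15):
        self.min_gap_s = min_gap_s
        self.max_gap_s = max_gap_s
        self.response_window_s = response_window_s
        self.next_challenge_in = random.uniform(min_gap_s, max_gap_s)
        self.awaiting_response_for = None

    def update(self, dt, driver_responded):
        """Call every control cycle; returns the action to take this cycle."""
        if self.awaiting_response_for is not None:
            if driver_responded:
                # driver acknowledged (touched the wheel, pressed a button, etc.)
                self.awaiting_response_for = None
                self.next_challenge_in = random.uniform(self.min_gap_s, self.max_gap_s)
                return "OK"
            self.awaiting_response_for += dt
            if self.awaiting_response_for > self.response_window_s:
                return "BEGIN_CONTROLLED_STOP"   # the train-style response: stop the vehicle
            return "AWAITING_RESPONSE"
        self.next_challenge_in -= dt
        if self.next_challenge_in <= 0:
            self.awaiting_response_for = 0.0
            return "ISSUE_CHALLENGE"             # chime plus a prompt to touch the wheel
        return "OK"
```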
