California Says Autonomous Cars Don't Need Human Drivers (bloomberg.com)
California law currently requires that all testing of self-driving cars be done with a human behind the wheel who can take control if necessary. While California has been fairly strict about how self-driving cars may be used in the state, it now appears to be relaxing several of those rules. "The state's Department of Motor Vehicles released proposed regulations Friday for autonomous vehicles, dropping an earlier requirement that a human driver had to be present while testing on public roads," reports Bloomberg. "The DMV also backed down on a previous rule that vehicles needed a steering wheel and pedals for the operator to take back control." From the report: "When we think of driverless vehicles they can either have conventional controls, which are steering wheels, pedals, things like that, or they cannot," said California DMV Chief Counsel Brian Soublet during a conference call with reporters. If companies test vehicles without conventional controls, they have to show the California DMV that they have approval from the National Highway Traffic Safety Administration, he added. NHTSA said in early 2016 that self-driving software systems, not just humans, can be considered drivers. "If California was going to keep that level of development activity in the state, what they did was necessary and timely," said Eric Noble, president of The CarLab, an automotive consulting firm. "They kind of had to do it because at some point manufacturers can't move autonomous vehicles forward without getting controls out of cars." The proposed regulations have a 45-day public comment period that ends April 24. That will be followed by a public hearing. During Friday's conference call, the California DMV said the rules should be completed by the end of the year.
Good (Score:5, Insightful)
This seems like an important next step. Expecting even a trained human to take over with only a few seconds (or less) leeway is crazy and cannot work.
I expect that these regulations will evolve a bit as we see which self-driving car developers can handle this and which ones cannot. There will likely be a few accidents, hopefully none serious. But since these cars have no egos and no temper, they're likely to drive far more safely than the average human.
Re: (Score:3, Insightful)
Yeah, what could possibly go wrong, right?
Re: (Score:3)
Most? I think it's not that most are terrible. Some are terrible, some are bad, some are so-so, some are pretty fair, some are good and some are excellent. Having a self-driving car suddenly lose its ability to judge where the center of the road is would be equivalent to a stone-drunk human, for example. As long as it happens less often than you get a wasted driver on a per-mile basis, I'd guess it's safer. Still, we're talking about a "testing" program here. If I'm a company that wants to develop and bu
Re: (Score:2)
If you are just applying labels to a bell curve then, sure, that works for everything, but it is meaningless. You've not contradicted the idea that most human drivers, by the standards we would like for operating a 2- or 3-ton vehicle, are terrible. Most will get into several significant accidents in their lifetime.
You are also completely wrong about a self-driving car not finding the center line being like a stone-cold drunk. That car may have little clue where the line is, but it will STILL not just run over a ped
Re:Good (Score:5, Insightful)
A machine is likely to be safer at driving than humans in very little time. I have been driving for more than 30 years and can attest that for at least the first 5 years I took too many risks, and for the last five I probably have not been quick-thinking enough for dynamic traffic situations. Society does not give a toss how safe you think you can drive; all society cares about is the total cost of road traffic accidents. Automatic driving is so close you can smell it.
Re: (Score:2)
I really think it's about that close too. What surprises me is that there isn't more movement to eliminating the pilot in commercial aircraft. It's to the point that the biggest threat to passenger safety is the pilot. We've seen at least two major catastrophes that were basically suicidal actions and that's leaving out the 9/11 attacks. Then there are the ones that are overtired or over medicated or intoxicated. I have the illusion of safety when in an automobile because I have control. If I die it's
Re: (Score:2)
What surprises me is that there isn't more movement to eliminating the pilot in commercial aircraft. It's to the point that the biggest threat to passenger safety is the pilot.
Problem is that the autopilot is still not good enough to replace pilots in all circumstances. We've had Airbus crashes where the autopilot overrode the pilots, resulting in hundreds of deaths. We've had situations where the pilot has saved hundreds of lives, such as the one that landed in the Hudson River or the Gimli glider. Perhaps the autopilot would have landed safely in the river, but in the case of the Gimli glider, officially the plane could not glide (luckily the captain was a gliding enthusiast) and the
Re: (Score:2)
I think a few more suicidal pilots and the idea of computer control might become more attractive.
Re:Good (Score:4, Insightful)
... at any given moment. The problem is that every driver is terrible at least some of the time. People get distracted (both externally and internally), fatigued, can look in only one direction at a time, can't see in certain directions from the driver's seat, etc., all of which can effectively mimic impaired reaction time just as easily as drunk driving, albeit for a shorter period of time. Fortunately, most of the time, you don't need fast reaction time to drive safely.
Re:Good (Score:5, Insightful)
Expecting even a trained human to take over with only a few seconds (or less) leeway is crazy and cannot work.
True, but there are plenty of circumstances where there is more time for the human to intervene. Suppose the road is blocked somewhere, and there's somebody directing traffic and letting people drive over the sidewalk, or in the wrong lane, or explaining how to make a detour, or telling the driver to wait for the pilot car. Plenty of situations are too difficult for an autonomous car to handle but not imminently dangerous, assuming that the self-driving car is smart enough to stop when it notices the road is blocked.
Re: (Score:3, Insightful)
Even Tesla, the self proclaimed leader in this technology, is struggling to get the simplest things to work reliably; http://bgr.com/2017/03/02/tesl... [bgr.com]
Re: (Score:2)
I expect self driving cars to be hitting the road well before such technology is adopted. It also takes time to set it up and program it with the proper information, in case a road is blocked suddenly.
Re:Good (Score:4, Insightful)
Fortunately, while the new regulation is going to say that there doesn't have to be a human driver in the car, it doesn't actually prevent him from being actually present.
But once the rules allow an autonomous car to have no controls for a human, it doesn't much matter whether a human is allowed to be present.
The first time a cop tries to direct traffic around an accident, and there either isn't a driver in the car, or there aren't controls in the car for the driver to operate, this will be challenged. If it led to severe delays for important people, I expect it to not survive.
Re: (Score:2)
'I'm sorry Dave, I can't do that.'
Re: (Score:2, Insightful)
"Expecting even a trained human to take over with only a few seconds (or less) leeway is crazy"
Let me see if I have this straight. An autonomous car is doing something nutty, like following the vehicle ahead of it into a gas station at excessive speed? And you think the way to handle that situation is to trust the car?
Re: for whom? (Score:5, Insightful)
Sure because once you have told the automated car where to go there is no possible way of stopping it along the way or changing its destination.
Also I would venture to guess that the majority of driving is commuting to work for a lot of people. And many of them would welcome this.
Re: (Score:2)
What Really Saves Gas? [edmunds.com]
Test #3 Use Cruise Control
Result: Surprisingly effective way to save gas
Cold Hard Facts: Up to 14-percent savings, average savings of 7 percent
Recommendation: If you've got it, use it.
Re: (Score:2)
As the article says,
One thing that's important to note: if you are in a mountainous area you should turn off cruise. It will try to keep you up to the speed you've set and will use a lot of extra gas downshifting to lower gears to accomplish this.
Which is why I hardly use cruise control and wonder about a self driving car.
Re: (Score:2)
Parent is probably wondering about it because results in that area should be applicable to non-self-driving cars, so why aren't we seeing vehicles coming out now that already handle hills decently? (or.. are we seeing just that in newer cars, just without fanfare?)
Re: (Score:2)
Depends on their goal. Efficiency or legality. I hate it when my vehicle tries to maintain the legal speed limit on the roads around here. Then there are the advisory speed limits (yellow signs before a corner), which are usually too low, at least in good conditions.
Slashcensorship hard at work (Score:2)
Subject says it all. Rational criticism not allowed when SJWs and social engineers are around with mod points.
Re: (Score:2)
What freedom? I think most people drive a car to get them to a certain destination when they want to. An automated drive can do that.
Many people also want to be in control, at least most of the time. Automated drive may or may not allow for that. If the driver can control the automated drive it will be a welcome thing in the traffic jam or on a boring part of a long trip. It can even give more freedom: it can allow the driver to take the car while drunk, when suffering from occasional epileptic seizures (wh
Re: (Score:2)
I belong to that minority who loves to drive and has a reasonable set of skills and even I wouldn't mind an autopilot in a traffic jam. I wouldn't think yet about that far future where cars won't have controls. I can imagine areas though that enforce autopilots.
Mostly though I think autopilot is a limited concept. The larger concept is that control happens at a higher level, even if it's just monitoring. We're rapidly moving towards a situation where every move is being monitored and where the second you ex
Its too early IMO (Score:3, Insightful)
I think it's too early for autonomous cars to drive around without drivers. Imagine what happens when an accident occurs. Then the technology will be demonized. That would be horrible. Only allow autonomous cars to drive around without drivers once you are certain they are not just better than the average driver, but better than 95% of all human drivers.
But I guess it's like with most people who have a risky driving style: they say "who cares," until something horrible happens due to that carelessness, and then they are either unable to say anything any more, or are terribly sad.
It depends... (Score:3, Insightful)
If it is a 4,000 lb. passenger vehicle with human occupants going at 70 MPH, I agree. For a vehicle that will be carrying humans *anyway*, I don't see any need to remove the controls in the near future, even if they are not going to be used much. Maybe one day if you have unaccompanied humans who cannot be trusted with that option, but I think it's too early for that.
If vehicle + payload is less than a couple hundred pounds, about the size of a scooter, and doesn't go more than 35 MPH or so, then I think
Re: (Score:2)
We allow people on bicycles that can do about the same damage without requiring they prove themselves.
You can also regulate the design of it such that it's unlikely to do much damage to a typical pedestrian, even if things go very wrong. Think about how cars today in Europe are required to give more clearance to allow for pedestrian safety. Then amplify that further, since you don't need to trade off against human visibility.
Re: (Score:2)
If vehicle + payload is less than a couple hundred pounds, about the size of a scooter, and doesn't go more than 35 MPH or so,
I explicitly said that in the beginning. I don't picture the future of delivery being giant trucks, but more scooter sized things in the scenario of humanless vehicles. It takes a lot of energy and fuel to move the giant trucks, so I imagine the balance for home delivered packages to be smaller than a passenger car, and the giant trucks reserved for large packages and similar special cases, and human occupancy for the foreseeable future for such vehicles.
Re: (Score:2)
You have people stealing packages today (they get left out a lot). A vehicle could conceivably be constantly streaming location, cameras streaming a security feed out, and so on. Sounding an alarm and notifying police is possible, keeping about the same level of risk/reward as stealing packages today. A vehicle capable of driving itself should be able to notice if something isn't right, indicating break-in, being forced to stop, or being moved, and alert human operators to scrutinize that particular unit
Re: (Score:2)
We allow people on bicycles that can do about the same damage without requiring they prove themselves.
False logic. We actually have a very long history of bike riding and know the safety and risks very well. There are even rules and laws established to manage these risks. And on top of that, there is the human self-preservation instinct that helps. Not so for autonomous vehicles.
Re: (Score:3)
"So if a vehicle carrying packages runs over a pedestrian that's ok?"
Hey, this is America. If we allow people to take precedence over commerce, we will lose the freedoms our forefathers fought for. Right?
Re: (Score:2)
Fully agree. But when it has the size of a scooter, I wouldn't call it a "car" any more. That's a different category of vehicle.
Re: (Score:2)
At least you're giving some thought to the subject. I think you are wrong about autonomous taxis, but not because they don't make sense. The problem is that to navigate the surface streets, parking lots, and driveways where taxis travel, you're going to need to sense a very wide variety of things including non-standard road signage, parades, snow, traffic signals directly between the vehicle and the sun, construction zones ... That's going to entail following 50,000 rules or, for all you and I know, mayb
Re: (Score:3)
I have a friend who is a truck driver who worries about driverless trucks one day putting him out of work. I laughed and told him that the first time some driverless 80,000-pound semi has a software glitch and piles full speed into a busload of kids, his future employment will be secured forever.
Re:Its too early IMO (Score:5, Insightful)
I wonder, though, how much of what is currently done with big trucks will shift to smaller but more numerous vehicles. One of the big motivators for piling tons of stuff onto one truck is that each vehicle needs an expensive driver.
Now there are other motivators, but in scenarios where the big truck is used only because you need to amortize the large expense of a human driver, you'll probably see smaller things on the road when/if autonomous cargo transport happens.
Re: (Score:2)
"One of the big motivators for piling tons of stuff onto one truck is because each vehicle needs an expensive driver."
No, it's because freight containers come in two standard sizes, with road trucks being sized accordingly.
Re: (Score:2)
I said *one* of the big motivators. I know a lot of road truck trailers never touch a train or boat. Some definitely do. Some are going very directly from point A to B with a large amount of cargo that gets economies of scale and it makes sense to have a single big weighted thing. Some are driving convoluted delivery routes because it's cheaper than concurrently operating a lot of vehicles, and the convoluted delivery route with many stops becomes a burden compared to a hypothetical fleet of lightweight
Re: (Score:2)
"No, it's because freight containers come in two standard sizes, with road trucks being sized accordingly."
Some ARE like that. Some aren't. Surely you don't think that Walmart truck up ahead of you on a rural state highway is delivering 1300 34-inch flat screen TVs to the Walmart in North Hellandgone, Idaho.
Re: (Score:2)
There's a balance.
FedEx/UPS trucks driving over a long route may lose out to more point-to-point operation of small vehicles. A large chunk of the manufacturing capacity of a nation going to a particular port is another thing.
Re: (Score:2)
Well, airplanes still have pilots even though they pretty much fly themselves. An airline pilot told me that the reason he is in the plane is mostly for passengers to feel safe.
They are just starting to make conductorless trains, even though in many cases the conductor does nothing but push a button. There is value in having a human on board that goes beyond operating the vehicle.
Re: (Score:2)
Truck drivers will still be needed for the last mile delivery. I don't think shippers will come up with a robot nimble and clever enough to deliver appliances and fitness equipment into homes (and unpack and install them when contracted to).
However this reminds me of our heading toward a 60% unemployment rate due to automation (store checkouts, stock clerks, fast food jobs, etc. are gradually going away. This is evident in many NYC pharmacy/convenience stores, for example) - what are we going to do about th
Re: (Score:2)
Fair enough but then you're increasing fuel and vehicle costs and creating a new logistics problem.
Re:Its too early IMO (Score:5, Insightful)
I have a friend who is a truck driver who worries about driverless trucks one day putting him out of work. I laughed and told him that the first time some driverless 80,000-pound semi has a software glitch and piles full speed into a busload of kids, his future employment will be secured forever.
If that was true human drivers would already be out of a job, since about 5000 people are killed every year in trucking accidents. While it's certainly likely for a new crop of Luddites to fly off the rails at every autonomous accident, I don't share your pessimism about what the outcome would be of their crackpot protests.
Re: (Score:2)
If that was true human drivers would already be out of a job, since about 5000 people are killed every year in trucking accidents.
Let me rephrase the original poster a little bit:
"the first time some driverless 80,000-pound semi has been hacked remotely and piles full speed into a busload of kids..."
An accident is one thing, but human drivers cannot be hacked.
A student of mine just had to delay his application to a cybersecurity PhD program because the website had been defaced and disabled.
Re: (Score:2)
While it's certainly likely for a new crop of Luddites to fly off the rails at every autonomous accident, I don't share your pessimism about what the outcome would be of their crackpot protests.
You just watch how fast legislation gets passed when CNN and every other media outlet does a week straight of coverage of the horrific tragedy caused by the first driverless semi to cause a major deadly crash. The first question every Congressman will be answering for weeks in every interview and press conference will be "What are you going to do to stop this from happening again?" Yes, people have accidents every day too. But Americans have come to accept that. But when a robo-truck kills, everyone and the
Re: (Score:2)
"I agree, we need standardized testing to actually license the software and hardware."
Maybe. I don't think the Silicon Valley folks understand how liability works. If they start killing folks and destroying property, they're going to find out that license agreements are a dubious protection from predatory lawyers. Auto company programmers understand liability, but I think they have even less understanding of the complexity of a fully autonomous vehicle system than the kids working along El Camino Real do
Re: (Score:2)
OMG, you mean if the technology doesn't work and people give it a bad reputation because it doesn't work that's somehow "horrible"?
No, of course the reasoning is sound, but it would be horrible overall if the technology were demonized and all development on it abandoned before it even had the chance to mature, because some people thought it was already mature enough to go completely driverless.
Re: (Score:2)
The law is saying that the car itself can be considered a driver, so unless you're spiking the gas tank I don't think it'll get a DUI.
Re: (Score:2)
Dunno - E15 gas is up to 15% ethanol - 30 proof.
A couple of gallons and .... wipeout.
"Great idea!" said the CIA (Score:5, Interesting)
In today's news, a North Korean businessman who was visiting the U.S. on a tourist visa was killed in a freak accident involving a driverless car in Los Angeles.
Either this backfires big time ... (Score:2)
... or California is going to be the first place to make a huge technological leap.
I'm sort of hoping for the second option.
Re: (Score:2)
No technology is perfect and no doubt people will still die in crashes but I think that overall the death total should drop over time as self drivers become the norm. When you consider around one third of highway deaths are alcohol related and then factor in things like cell phone usage and such you can see where autonomous vehicles should have an advantage. The problem is you can have drunks kill 10,000 people and the outcry from a few deaths due to sensor glitches in autonomous cars will drown that all
The irony of safety concerns. (Score:5, Funny)
"I don't trust these newfangled vehicles! Those computers will crash and cause accidents!"
...says the human ranting on social media, texting behind the wheel while driving on the freeway...
Standards are needed (Score:2)
Humans are required to pass a standards test. Machines (so far) are not.
Why is there not a standard being established by the self-driving industry with stakeholders from government and the public?
The standard needs to be there to set minimum guidelines for:
- Software vulnerability
- Computer redundancy (three computers checking each other - like the airplane industry)
- Obstacle detection
- Rule downloads/updates by
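The redundancy item in the list above is essentially triple modular redundancy, as used in avionics. A minimal sketch of the idea in Python, assuming three independent computers each produce a command and a 2-out-of-3 voter arbitrates (all names here are hypothetical, for illustration only):

```python
def majority_vote(a, b, c):
    """2-out-of-3 voter: return the value at least two of the three
    redundant computers agree on, or None to signal a fault that
    should trigger a safe stop."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    return None  # total disagreement: no trustworthy output

# Three redundant computers compute a steering angle independently.
assert majority_vote(12.5, 12.5, 12.5) == 12.5  # all agree
assert majority_vote(12.5, 12.5, 99.0) == 12.5  # one faulty unit is outvoted
assert majority_vote(1.0, 2.0, 3.0) is None     # unrecoverable disagreement
```

Real systems vote on many signals continuously and use dissimilar hardware and software in each channel, but the arbitration principle is the same.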
Re: (Score:2)
Good list, but I'd add one. There needs to be a remote manual override... if for some reason a police officer wants a car to stop, it needs to stop. If emergency vehicles need a clear path, the car needs to get out of the way.
Yes, this will mean there will be a minimum level of exploit possible by malicious individuals, but it's necessary to make autonomous vehicles behave appropriately on public roads.
Reason is clear (Score:2)
Premature given their disengagement rates (Score:3)
Re: (Score:2)
I would also want to make sure those tests reflect a wide variety of real-world driving conditions. An AI driving 1,000 miles on the relatively standardized interstate is WAY different than it driving 1,000 miles on poorly-marked country backroads in Bumfuck, Montana.
No Reason to Trust the Programming (Score:2)
Re: (Score:2)
The specific instance or the software platform?
If it's each and every car, then you've overwhelmed the DMV, multiplying their load by maybe 10-fold.
Otherwise, the NHTSA approval is presumably pretty much that.
Although, the same standards don't necessarily work. Driver training and tests take a lot of human capability for granted. If you had something that could *just* pass the test but otherwise had zero human capability, it could still be a very dangerous vehicle operator.
Re: (Score:2)
I'd like to know where you got that number from. Exactly how many self-driving cars do you think each existing driver is going to buy?
Re: (Score:3)
Think about commercial fleets/motor pools/rentals. Replaced every 2-3 years, in some scenarios many cars for relatively few drivers. Generally speaking, every driver gets *one* road test for their entire lifetime. Saying that they will operate 10 vehicles in their lifetime is probably *conservative*.
Re: (Score:2)
"Exactly how many self-driving cars do you think each existing driver is going to buy?"
In the long run, zero. They will all be in fleets. Buying your own car will be like buying your own airplane, a specialty for the few.
Re: (Score:2)
They don't even test each class of car to pass a driving test. Let alone each version of the software of each class of car.
NHTSA approval is no such thing; it's a set of guidelines to carmakers.
The car does not pass a driving test, yet it's allowed to drive. Not by maker, not by model, not by version number. Not at all, in any way does it need to pass a basic test of driving competency.
You will end up with something like a program the FAA uses - the manufacturer certifies the craft to perform under certain conditions. You are required to maintain it to a certain spec and ensure that it is operated according to spec. This will be 'easy' to do with commercial fleets, a tad more interesting for private vehicles.
You may well see most autonomous vehicles rented / leased instead of just owned by individuals. Of course, that is the general direction society is heading these days anyways. You
Re: (Score:2)
> A person cannot drive on the road without a driving test, so why should a car?
They're doing the driving tests now. The next step (allowing the drivers who passed to drive without a veteran driver present) is what this is about. Currently autonomous vehicles are essentially on their learner's permits. Once they pass the driving test and get their license, why not let them drive without the veteran driver's presence?
You started on a good analogy and then veered off the road with it. ;)
Re: (Score:2, Insightful)
A perfectly valid reason.
What people call "lazy" is what drives humanity forward. After all, farmers are just lazy guys who keep their food sources close by instead of hunting and gathering like real men.
Re: (Score:2)
Is every parking spot in the area going to be reporting whether it is empty or not?
That's certainly an option, yes. I've already seen parking garages where an overhead device checks for presence of the car in each spot. It would be easy to wire them together and allow 3rd parties to access the database. Also, the new NarrowBand IoT standard would allow simple standalone devices to be developed that could be installed in the floor of each parking spot and report every time a car enters or leaves.
Re: (Score:2)
Once it has all the information, it would be easy to add up fuel cost and parking fees for all available spots, and pick the cheapest.
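As a rough sketch of that cost comparison, assuming some hypothetical occupancy feed supplies each open spot's distance and hourly fee (all names and figures below are illustrative, not from any real parking API):

```python
def cheapest_spot(spots, fuel_cost_per_mile, hours_parked):
    """Pick the open spot that minimizes fuel to reach it plus parking fees.

    `spots` is a list of (name, distance_miles, fee_per_hour) tuples --
    hypothetical fields, standing in for whatever the occupancy feed provides.
    """
    best = min(
        spots,
        key=lambda s: s[1] * fuel_cost_per_mile + s[2] * hours_parked,
    )
    return best[0]

spots = [
    ("garage_a", 0.5, 3.00),  # close but pricey
    ("street_b", 2.0, 0.50),  # farther away but cheap
]
# For a 2-hour stay at $0.15/mile, the cheap distant spot wins.
assert cheapest_spot(spots, 0.15, 2) == "street_b"
```

A real planner would also weigh walking distance and time, but the "add up the costs and take the minimum" step is this simple.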
Re: (Score:2)
Call me lazy, but one reason I want an autonomous vehicle is so I can get "valet-like" service at any restaurant. My town gets really busy on the weekends and finding a parking space near a restaurant is difficult. If I could stop in front, get out, say "go find a parking spot" then when I am done phone the car to come pick me up. That would be fabulous.
I see the biggest impact of autonomous vehicles, long term, as being the elimination of parking associated with destinations. The one-third of each US city devoted to parking lots will become available for other uses. Apartment blocks will now have parks. Strip malls will grow from today's L's and U's around parking acreage to fat O's that have a public center space for uses like outside seating. Shopping malls will become the cores of entire civic centers that will grow around them.
Autonomous cars will nee
Re: (Score:2)
Since cars would only be parked as demand slacks off from commuting peaks, and nowhere near where people are, none.
Re: (Score:2)
Other fools think insurance companies will be happy to pay for human drivers that aren't absolutely perfect and infallible. It's a ridiculous idea.
Re: (Score:2)
It is one thing for human drivers to pay for the faults of human drivers, and quite another for passengers to pay for the faults of their automated driver, which they have no control over. It's like getting insurance to ride a train.
The insurance for riding a train is included in the fare, of course. For a self driving car, there's no reason why the owner couldn't be required to get an insurance policy. Alternatively, the manufacturer of the car could get a policy for you, and charge you a monthly (or per mile) cost for riding.
Re: (Score:2)
The owner is the primary person enjoying the benefits of the car, so it makes sense they pay the policy for possible accidents, either directly by taking out a policy themselves or indirectly in the cost of the car, or a monthly fee. This also allows them to choose their own coverage options.
If you don't accept, then don't buy a self driving car, and pay a bit more.
Re: (Score:2)
I'm not really sure why you think people will be happy to pay for accidents that aren't their fault.
It's the most practical thing to do, because they get the benefit of riding in a car without having to drive themselves. And they'll be paying lower insurance premiums too, so I expect them to be happier than when they have to drive themselves, get involved in more accidents, and pay higher premiums.
if a manufacturer isn't held liable for automation, then how can they be held liable for any other kind of defect?
Same as it is now. When a car has manufacturing defects, or there is a case of gross negligence by the manufacturer, the insurance company will sue them.