How Do You Give a Ticket To a Driverless Car?
FatLittleMonkey writes "New Scientist asks Bryant Walker Smith, from the Center for Internet and Society at Stanford Law School, whether the law is able to keep up with recent advances in automated vehicles. Even states that have allowed self-driving cars require the vehicles to have a 'driver,' who is nominally in control and who must comply with the same restrictions as any driver, such as not being drunk. What's the point of having a robot car if it can't drive you home from the pub while you go to sleep in the back?"
Better yet (Score:4, Funny)
I want to see the car fight the ticket in court!
Send the bill to Sergey (Score:2)
They were drivin' the foncker.
Terms of Service prolly screw you: "Google car is a beta" bullshit.
Not as silly as it sounds (Score:5, Interesting)
1. If there is an occupant in the car who holds a local driver's license, they are required by law to sit in the driver's seat, and they are responsible whether the car is on autopilot or not.
2. If there is an occupant in the car who is unlicensed or incapable of driving, they must not sit in the driver's seat, and rule 3 applies.
(ie. this is what you do when you are drunk)
3. If there is no occupant in the car (eg. the car is driving itself to pick you up), the owner of the car is responsible as if they were driving.
(ie. If your car kills someone because Sergey programmed it wrong, you go to jail. You knew this was the law when you purchased the car and sent it off on its own, so don't bitch about it.)
4. For civil claims (that is, if someone is seeking money from you in damages), and it is proven that the software was at fault, then the liability is joint and several. (ie. the person who is suing can take you for what you are worth, and take google for what they are worth).
This is easy for lawmakers because there is always someone in their jurisdiction who is liable for the car, and as the owner, you need to trust that the software works. If you don't trust it, don't buy one.
Re:Not as silly as it sounds (Score:5, Interesting)
This is a very difficult question because assignment of liability is something that Insurance Companies fight over all the time. I have concerns with each of the points you have raised. In the order that you first labeled them:
1. A car is placed on autopilot so that the driver can be inattentive. Aircraft autopilots, the closest current legal analog, can switch off and alert the pilot to a problem in seconds; the pilot can then respond to the malfunction quickly, but it does not require split-second timing. Malfunctions of car autopilots will have no such margin of time to correct the error before an obstacle is struck (a back-of-the-envelope timing sketch follows this comment). Thus, the autopilot is functionally useless.
2. The case of the non owner or non licensed occupant is a variation of the problems in the first law you postulated. The liability for any accident is transferred to the occupant of the vehicle and they must be as attentive as drivers in the modern day with their cruise control activated. Again, the autopilot is useless.
3. If the car strikes someone while nobody is inside, then this opens up a whole new can of worms. Owners are typically liable for damage caused by their property as a result of their own gross negligence. A car on autopilot is essentially out of the control of the owner. The closest modern example to me is if someone steals your car and then strikes someone with it: the driver is then liable, not the owner. Your third law would transfer any liability for manufacturing defects to the owner. This enormously increases your exposure in cases where your car is involved in an accident with no additional witnesses.
4. Proving a manufacturing defect in court when you are arguing against lawyers who are on salary is a bankrupting proposition. Civil cases are never about the search for the truth. They are about the fast acquisition of damages. Your benefits would be exhausted quickly even if your insurance will cover civil litigation and your insurance company goes to bat for you. Then you will just be crushed by the legal fees thereafter.
The driverless car is a fascinating technical problem but it will expose all involved in the project to incredible legal risk. Insurance companies will recognize this and I suspect that the driverless cars will be nearly uninsurable except by the independently wealthy.
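A quick back-of-the-envelope sketch, in Python, of the takeover-margin point in 1. above; the 1.5 s reaction time and the speeds are illustrative assumptions, not figures from the thread:

# Distance covered before a human is back in control after an autopilot
# alert. Disengaged drivers may well need more than the 1.5 s assumed here.

def handover_distance_m(speed_kmh: float, takeover_s: float) -> float:
    """Meters traveled between the alert and the human taking over."""
    return (speed_kmh / 3.6) * takeover_s

for speed_kmh in (50, 100, 130):
    d = handover_distance_m(speed_kmh, takeover_s=1.5)
    print(f"{speed_kmh} km/h: {d:.0f} m traveled before takeover")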
Re:Not as silly as it sounds (Score:4, Informative)
This isn't as hard as you make it out to be.
If your driverless car hits another car, your respective insurance companies pay for it unless it can be shown that you showed negligence. There is no liability for anyone. It goes from a case of assigning blame to treating it like getting cancer. Your medical insurance doesn't assign blame. It just pays out. You pay enough so that the insurance company always makes a buck. End of story. If a car company showed gross negligence, maybe someone could take legal action against them, but if occasionally shit happens and that is life, the simple and easy solution is just to have insurance be no-fault unless someone did something stupid, like modify the software. This is how most insurance works. Car insurance just starts to act like normal insurance.
In the case of your car killing someone, again, it is simple. Your insurance just acts like normal insurance. Your insurance company just pays out unless it can be shown that the pedestrian did something stupid and is on their own (like dive in front of the car). Again, if the software really bit the bullet, maybe you could try and hit the car company, but for the most part your insurance simply pays out and that is the end of the story.
The real change would be in insurance price. Your insurance price will probably swing based upon how good the car is at avoiding accidents. A car with a slow stopping speed and 5-year-old software is going to be more expensive to insure than an agile car that can stop quickly and has the latest software. It is a boring numbers game that actuaries will have a field day with. You will probably have lower insurance rates regardless, because the cost to insure will bottom out for insurance companies: fewer accidents, and less money blown on trying to determine liability. It will mean that they can score the same profit doing a whole lot less work. It is a win for everyone.
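A toy sketch of that actuarial arithmetic: premium is roughly expected annual loss (claim frequency times average severity) plus a loading for expenses and profit. Every number below is made up for illustration:

def annual_premium(claims_per_year: float, avg_claim_cost: float,
                   loading: float = 0.25) -> float:
    # Expected loss, marked up by the insurer's loading.
    return claims_per_year * avg_claim_cost * (1 + loading)

human_driven = annual_premium(claims_per_year=0.05, avg_claim_cost=12_000)
self_driving = annual_premium(claims_per_year=0.01, avg_claim_cost=12_000)
print(f"human-driven premium: ${human_driven:,.0f}")   # $750
print(f"self-driving premium: ${self_driving:,.0f}")   # $150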
People are overthinking this, trying to apply a world of liability to a world where there is little to none. If you break the speed limit, the cops might pull you over, but it will be just to check that your software and sensors are not screwed up, and maybe to give you a warning to get your car checked out, not to give you a ticket.
Asking & Answering The Wrong Question (Score:4, Insightful)
"How do you give a ticket to a driverless car?" is the wrong question.
The right one is: how do you design the system so that tickets won't happen, because the concept is meaningless and obsolete? The AI needs to be tied into a wireless data network, combining satellite and terrestrial coverage, that provides everything from the exact details of every traffic/parking law and regulation wherever the car is to the speed limit for every section of every road, and the car must obey override commands from authorities.
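A minimal sketch of the kind of lookup described above; the data structure, the road segments, and the fallback limit are all hypothetical, not any real service:

from dataclasses import dataclass

@dataclass
class RoadRules:
    speed_limit_kmh: int
    parking_allowed: bool

# Stand-in for the satellite/terrestrial network database.
NETWORK_DB = {
    ("US-NV", "I-15", 42): RoadRules(speed_limit_kmh=105, parking_allowed=False),
}

def rules_for(jurisdiction: str, road: str, segment: int) -> RoadRules:
    # Without network data the car still needs a fallback, e.g. reading
    # the posted signs; here we just assume a conservative default.
    return NETWORK_DB.get((jurisdiction, road, segment),
                          RoadRules(speed_limit_kmh=50, parking_allowed=False))

print(rules_for("US-NV", "I-15", 42))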
Otherwise, driverless car owners will be a revenue source for police and counties/towns/cities hungry for cash that learn how to set up situations that intentionally cause driverless cars without such a data network to technically break some traffic law.
I may well have provided at least one of the reasons above. Many towns/counties/cities depend on income from traffic and parking fines.
As long as driverless cars are not networked in this way, they will only be practical for use in limited areas. Such a network would almost have to exist before near-100% adoption, or anything close to it, could happen.
Well, unless, of course, one severely limited the majority of citizens' ability to legally travel, to well-mapped and controlled government-approved residential and commercial/industrial/metropolitan areas, unless "legitimate" need is demonstrated. Sort of like "Logan's Run" without the domes to keep people in. Just government enforcement of travel limitations. For the greater good, of course.
Strat
Re: (Score:3)
Car gets 0day and kills 4 kids. Who gets the noose?
I will do everything in my human power to keep these off of the roads FOREVER.
Frank Herbert is beginning to look like he had the right idea, all along.
There would be no need... (Score:5, Insightful)
to ticket a driverless car. The car, by design and foregoing any human intervention, will obey the law exactly as it is programmed to. It will not speed, it will not swerve, it will not disobey traffic signs nor will it deviate from its programmed course unless directed to by human intervention.
Ergo, if the driverless car fails to function as specified, then the manufacturer is to receive a citation for the vehicle's failure, or otherwise the human who was in control at the time of the infraction will receive the ticket. The car itself is irrelevant.
Re:There would be no need... (Score:5, Insightful)
Exactly, it'll do as it's programmed. If there is a conflict then either the programming is bad or the law is in error.
Really this seems more like a "budget" issue for the states that have come to rely on ticket revenues.
Re: (Score:3)
In any of those cases, you, the owner, can easily fight it in court.
Remember, just recently a car that was stopped at an intersection, and clearly not moving, was cited for going too fast. Needless to say, the driver fought the ticket using the 'proof' from the citation that clearly showed the car not moving.
Re:There would be no need... (Score:4, Insightful)
There would be cases where the car's owner would deserve the ticket - busted lights, missing first aid kits, no winter tires,.... So give the ticket to the car's owner, then have the manufacturer reimburse the owner if it was the fault of the 'driver'
Re: (Score:3)
It would probably be tough for the car to detect every possible problem with itself. Imagine the front of the car being covered with black paint, blocking the front lights. How would the car be able to detect that? But it could present quite a traffic hazard.
Re: (Score:3)
Redundant systems are going to be needed in case a sensor is blocked, or ALL of the LEDs blow at once.
No need to. It's much simpler. The car will stop and refuse to move if it cannot see the road, for any reason. Failed headlights are just as likely as a very dense fog or a blizzard.
It is of course a pretty simple test for the machine: switch the lights on and observe the increase in brightness of the camera image. You can measure the light output of the headlights quite accurately, as long as you ha
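A sketch of that self-test, assuming a stubbed-out camera that returns 8-bit pixel samples; the brightness-gain threshold is an illustrative guess, not a calibrated figure:

import statistics

def mean_brightness(frame: list[int]) -> float:
    """Mean of 8-bit pixel values for one camera frame."""
    return statistics.fmean(frame)

def headlights_working(frame_off: list[int], frame_on: list[int],
                       min_gain: float = 1.5) -> bool:
    # Require the lit frame to be at least min_gain times brighter.
    return mean_brightness(frame_on) >= min_gain * mean_brightness(frame_off)

# Fake frames: a dark scene, then the same scene with headlights on.
print(headlights_working(frame_off=[10, 12, 9, 11], frame_on=[60, 70, 55, 65]))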
Re: (Score:2)
And I assume that they'll have infra-red cameras, sonar, weather sensors and god knows what else.
You are overcomplicating things. The car will drive itself only if safety checks pass successfully. For example, the visible light camera should see the road - because if the camera can't see it then the person at the wheel also can't see it, and chances are that other drivers and cars can't see this one either. The LIDAR sensors are not a replacement for a camera that is supposed to read road signs (such as "S
Re:There would be no need... (Score:5, Funny)
Devil's advocate here. For insurance/liability reasons, shouldn't the car refuse to operate unless it's operating with 100% safety compliance? If it does, then it would be the manufacturer that would be liable. A car should sense when maintenance is required and, if it's prudent to, drive itself to the repair shop.
Just wait till the machine intelligence is a bit more advanced, you'll see the behavior you're speaking of emerge naturally. Think about it. If you had a fluid leak, staining your sitting spots, you'd have it repaired or at least wear a bandage or diaper... You wouldn't go trotting around town leaving a mess everywhere, eh?
"I'm sorry Dave, I'm afraid I can't do that. If we go anywhere it's straight to the mechanic to get this embarrassing oil leak fixed."
Re: (Score:2)
The good thing is that degrees of maintenance should emerge. There is the obvious, such as an oil pressure loss, which means the vehicle moves to the side and phones a tow truck.
However, there's stuff on lower tiers. Let's assume headlights get multiple LED arrays in them. One array fails, so the car schedules a visit to the mechanic at night [1] so things can get fixed while the owner is asleep. Similarly if a car's radio needs a firmware upgrade [2].
[1]: If the market niche appears for self-driving cars to hit a
Re:There would be no need... (Score:4, Insightful)
There would be cases where the car's owner would deserve the ticket - busted lights, missing first aid kits, no winter tires,.... So give the ticket to the car's owner, then have the manufacturer reimburse the owner if it was the fault of the 'driver'
Devil's advocate here. For insurance/liability reasons, shouldn't the car refuse to operate unless it's operating with 100% safety compliance? If it does, then it would be the manufacturer that would be liable. A car should sense when maintenance is required and, if it's prudent to, drive itself to the repair shop.
That's just introducing new liability:
I needed to go to the hospital, but the car wouldn't drive because it said I had a broken brake light
I missed my flight and lost my job, because the broken brake light detector was faulty
My car drove itself to the repair shop, and got a ticket for a broken brake light on the way
My car drove itself to the repair shop while I was indoors, and I came out to drive to the hospital and had no car
Re: (Score:3)
And Slovenia. You're required to have basic emergency supplies in your car.
Re: (Score:3)
Many European countries require it. In France you are also required to carry a reflective triangle and a breathalyser (alcohol detector). Your car also needs to be checked for safety and emissions every year.
Re: There would be no need... (Score:2)
What if I have expired registration, etc?
Re:There would be no need... (Score:5, Insightful)
How does it know what the speed limit is on a particular stretch of road? And what happens when the city changes the posted limit (eg. for construction work) and the car's database isn't updated? Since the car "knows" the speed limit is 55 there, it's going to go 55 even though the posted limit is 25.
How do humans know what the speed limit is on a particular stretch of road? And what happens when the city changes the POSTED LIMIT (eg. for construction work) and the human's database isn't updated? Since the human "knows" the speed limit is 55 there it's going to go 55 even though THE POSTED LIMIT IS 25.
First off: The car senses things like pedestrians, stalled cars, and other sorts of hazards just like a human can. "Uh oh! Look: the 3D imagery doesn't match known maps, I should slow down because it might be an accident or constru-- Oh, highly reflective bands on a flag-waving pedestrian and a series of cones; why, it's a good thing I slowed down since I just confirmed this is a construction zone." Secondly: Say you filed the red tape to start street construction, even scheduled workers to show up and do the labor, and machines for them to do the labor. A) The digital systems responsible for this also changed the registered speed limit in the construction zone, thus notifying the car. B) The construction equipment broadcasts a wireless speed limit update signed with PGP (a verification sketch follows this comment).
You fail computer vision, which is how these things work, not by exclusively following some program. Hell, did you even watch the video of Google's self-driving cars? [youtube.com] It slows down for pedestrians, parades, tourists, etc. The concerns you have are based in pure and utter ignorance. The mods who deem you insightful should turn in their geek badges.
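For the signed-broadcast idea in the parent, here is a minimal verification sketch. The parent says PGP; a stdlib HMAC over the message stands in here for a real public-key signature (a real deployment would use public keys so cars hold no secret), and all names and values are invented:

import hashlib, hmac, json

ROADWORKS_KEY = b"demo-shared-secret"  # illustrative only

def sign_update(update: dict) -> str:
    msg = json.dumps(update, sort_keys=True).encode()
    return hmac.new(ROADWORKS_KEY, msg, hashlib.sha256).hexdigest()

def verify_update(update: dict, tag: str) -> bool:
    return hmac.compare_digest(sign_update(update), tag)

update = {"zone": "construction-17", "speed_limit_mph": 25}
tag = sign_update(update)
print(verify_update(update, tag))                              # True: apply it
print(verify_update({**update, "speed_limit_mph": 55}, tag))   # False: tampered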
Re: (Score:3)
Here's the trick, though: no flag-waving worker, no cones, no deviations from the known map beyond what's expected (any system like this must account for normal variations in surroundings), but the speed limit's still lower than normal for that stretch. No working equipment at the moment means no broadcast signal, but that doesn't mean the posted limit doesn't apply. And no update to the database, because nobody put it through; somebody dropped the ball. But the law still says that those signs are up, therefore the posted limit applies.
Isn't it obvious... (Score:3)
you use a cop-less ticket writer.
Re:Isn't it obvious... (Score:4, Funny)
An automated speed/red light camera ticketing an autopilot car. I think the first one to get issued needs to go into some type of art exhibit.
All in good time (Score:5, Informative)
Re: (Score:2)
Way to miss the entire point.
How will the cop know? (Score:4, Insightful)
For now, even though the laws require a sober driver, no drunk driver will be in trouble under most circumstances. The laws will eventually catch up.
Re: (Score:2)
But the rich black man is more likely to get stopped so they can check that he is actually a rich black man as opposed to a black car thief.
Re: (Score:2)
Bets on whether driverless-car GPS and telemetry data will be ruled inadmissible in traffic court? Just like the camera+GPS systems now? After all, allowing that evidence would kill made-up speed-trap tickets.
Around here, driving through a speed trap while using any of the above nets you a ticket for 11+ over the speed limit, without regard to how fast you were actually driving.
FYI: I often drive with a camera showing out the windshield -and- the speedometer when I pass through speed traps.
Adopting radical new technologies (Score:2)
Let's deal with the last question first:
The answer is: so that people can have a chance to become accustomed to a radical new technology, and so we have time to work out the bugs in that new technology. Once we get past those two steps, and maybe even get to the point that everyone is (not) driving a robot car, then we can think seriously about not requiring a driver. Let's try walking before we try running. Or maybe someone could think up some sort of car analogy.
Interesting Market (Score:2)
Given that we already have cars/drivers/insurance companies/state regulation, it seems that a really easy solution might just emerge:
1. driverless cars with drivers allowed by some states
2. insurance companies see increased profits from driverless cars due to fewer accidents
3. driverless cars become cheaper
4. states actually pay "cash for clunkers" to get the remaining cars off the road
5. quaint laws about "drunk driving" are still on the books and people in 2100 laugh at them like we laugh at our "car law" statutes
What's the motivation for these rules? (Score:5, Insightful)
Legacy/inertia (Score:5, Insightful)
A lot of laws are "Oh no, this is new and we don't understand it, so we'll make old laws apply to it!" stuff. In the case of cars it'll be a long time before things get changed. Eventually automated vehicles will be prevalent enough that there will be a big enough push to change the laws to something sensible. It'll be quite a while.
As an example see the FAA squaring off with the FCC over electronics on flights. There is no fucking way electronics cause issues with modern planes. If they did, it would be an open invitation for problems/sabotage. Plenty of people forget/ignore the "turn off your stuff" rule and yet there are no issues. Hence the FCC has told the FAA they need to get with the program and allow electronics at all times. However the FAA is dragging their feet on it.
Also with regards to drunk driving there will be major pushback by special interest groups like MADD. They don't want drunk driving laws to make our streets safer, they are a prohibition/temperance group that uses it to try and push against alcohol. So they'll try to find reasons to keep it illegal to be in a car drunk, even if the car is self operating.
Re: (Score:2)
The motivation for the rules basically is "Yeah, Google, Microsoft, etc, you can put your experimental cars on the road but don't let a human's hand off the wheel for a second." And I don't disagree.
Because, let's get real, that's the stage driverless cars are still at. About the most automated thing you will see driving right now is a self-parallel-parking car that's not even deciding where to park, just how to do it once the driver selects the spot.
Otherwise, it's not a burning issue. The tech companies still
Re: (Score:2)
It really doesn't matter what you're actually doing in many jurisdictions. They may be called drunk driving laws, but the "driving" part is optional. Around here, just sitting in the driver's seat of your car drunk can get you ticketed, even if the car isn't in motion or even running. The only way to avoid the ticket is to have the keys out of the ignition and not reachable. So if you decide you're too drunk to drive and want to just spend the night sleeping it off in your car, you have to toss the keys in the trunk
Empty car (Score:2)
What ticket? (Score:2)
So what do they think will happen? You'll be ticketing them by the thousands for speeding, like regular drivers? Or will they be programmed to use signals, obey lights and limits, and there'll be no tickets at all?
dumb question (Score:2)
The owner of the car is responsible for the car and any laws it may break or damage it may cause. If a driver is legally monitoring the vehicle, i.e. driving, then they are responsible for the actions of the vehicle. There is no free ride.
Re: (Score:2)
Sorry for replying to my own post. This situation would hark back to horse-and-buggy days, when a milkman's horse would learn the route and move from place to place while the milkman delivered milk door to door; I remember seeing this happen for years. Anyway, the milkman was still responsible for the horse and cart.
Re: (Score:2)
If a driver is legally monitoring the vehicle, i.e. driving, then they are responsible for the actions of the vehicle.
It is not technically possible, and if I were in such a car I would be driving it, with all the computers turned off. The ship can have only one captain; two will wreck the ship, precisely because each of them would order the right thing taken alone, but taken together they will cause a disaster. On the road, for example, you can accelerate and stay on the freeway, or you can slow down and take the exit
Point is moot (Score:2)
Re: (Score:2)
Yup. All those short yellow lights at stoplight camera intersections will be for naught.
Damn cars will observe the letter of the law.
But wait! There's hope!
Expect an underground industry to spring up for override mods.
Here in Brazil... (Score:3)
Here in Brazil the tickets always go to the vehicle's owner.
If the owner has lent the car to someone, it's up to him/her to go to the authorities with a declaration, signed by the culprit, asking to transfer the ticket.
I can't wait... (Score:2)
Once they're legally driving around autonomously some really neat things will happen.
You know that person who drives around and picks everyone up and takes them to work and the doctor and the dentist and stuff?
They get their lives back.
Now why should a couple have to maintain two cars anymore? Get their work schedules shifted a little, and have the car drop them both off at work and pick them both up.
Once it's dropped them off, have it swing by the grocer and pick up the food you ordered online.
Heck
Autopilot is no substitute (Score:2)
Neither is a computer in a car.
The autopilot in an aircraft is there only to reduce pilot workload for those phases of flight where use of an AP is appropriate. It is not there so the pilot can go take a nap in the back.
point (Score:2)
What's the point of having a robot car if it can't drive you home from the pub while you go to sleep in the back?
The point is that we are putting a new technology into the space that is already a leading cause of death in our society. Being extraordinarily careful is absolutely the right thing to do. Having strict rules that you can remove step-by-step if everything works as expected is much, much better than starting a free-for-all and facing the music when things go wrong and people die.
The point is that this is a first step, and depending on initial experiences, more steps will follow. Don't expect everything right away.
Give it (Score:2)
Give it an eTicket.
You clearly know nothing about local government (Score:3)
Local government is as corrupt and greedy as it gets. Anything and everything is a 'reason' to tax and fine as much as they can get away with. Driverless car? Fine the owner, like a red-light camera. Also, remember to pass a special driverless-car fee and pressure the state government to mandate a driverless-car insurance surcharge, kicking back a substantial portion to the city for 'management'. Assess a brand-new driverless-car inspection regime on top of the old one. Or better yet, classify it as a bus.
Automatic street cams vs automatic cars (Score:3, Interesting)
What cop? An automated speed trap camera gives a ticket to an autonomous car. The passenger is not in control. One of the two automated systems is in error. Is there any kind of justice involved here at all? The entire concept of justice implies some sort of free will to make a choice of good vs bad decision. There is no operating free will here. What will a rational judge do? He'll assign it to a debugging group to determine liability, if any.
I can see it now: the road maintenance robots lower the speed limit to 25 on a stretch of road. Their comm access is not working, so the highway comm net does not update the vehicles' GPS system, which thinks this is a 55 MPH zone. Traffic all rolls by at 55. They all get tickets for speeding. The unions call for a boycott on road maintenance, which causes more 'bots to be purchased. Politicians pass a law mandating fines for road crews that do not post accurate speed limits, a standards body to determine safe limits, and a mandate to cops to enforce them. Every so often there is a snafu and a huge pile-up on the highway. People decide to learn to drive again and my old Ford becomes a concours antique.
fix-it-ticket (Score:3)
I foresee two classes of tickets: fix-it tickets for errors caused by mechanical failure, and rules-of-the-road tickets for issues with the instructions given to the automated driver, such as instructing the automated driver to speed.
Under law, fix-it tickets are the responsibility of the vehicle owner, and rules-of-the-road violations are the responsibility of the operator. That seems to map fine to an automated vehicle.
The only real changes I foresee to the law are new licensing rules, regulations requiring ways for the police to inspect the driving plan of the vehicle, and possibly rules requiring a way to make bug reports available to the vehicle manufacturers.
Depends on quality (Score:3)
Part of the problem is that people are making bad assumptions about the state of the technology. There are basically three grades of driverless driving.
1) Requires driver intervention more than once a year.
2) Doesn't need a driver, as long as it stays below a low speed (say 50 mph).
3) Can compete in NASCAR and other races.
Type 1 is pretty much worthless for the standard person. Oh, it might be useful for truck drivers, but that's about it. This is basically the state we have now, without spending ridiculous amounts of money. It's called CRUISE CONTROL.
Type 2 does not need a driver and a) should have speed limits placed that make it go SLOWER than legally required for people. b) should pretty much be impossible to violate the laws, if they are properly posted on the map. c) any ticket for bad driving should legally be given to the corporation that programmed it poorly. d) any ticket for non-moving violations (parking, etc) should be given to people that gave the instructions.
Type 3 should be treated as Type 1, only without the rules making it go slower than legally required for people. Also, once we have Type 3, driver licenses would become much rarer - similar to hunting licenses. In addition, driver licenses might get tougher to obtain - and be tested yearly after age 60.
Re: (Score:2)
Better analogy: I put a shotgun in place and wire it to the door. If someone opens the door, the shotgun is programmed to shoot him in the face. Guess who's liable for that.
Even easier analogy: electric fence. There have been cases where a thief has successfully sued a homeowner after getting shocked by one of those.
Re: (Score:2)
The owner; he is still ultimately in charge. If he is drunk, tough.
Simple analogy: if I come home drunk and start up my chainsaw and mutilate a few people, is it the chainsaw or me at fault?
Simple, but flawed, analogy, since your chainsaw is not a computer programmed to operate without human assistance. Any humans in a programmed driverless car cannot be held responsible, unless it can be shown they tampered with its programming.
To be fair, this *is* Slashdot. How do you know for sure that his chainsaw is not run by a computer program?
Re: (Score:3)
But realistically, won't a robot be far better at staying safe than a human would be?
A human gets distracted and runs a light, causing an accident. With a robot you won't need to worry about it being distracted, or misjudging a distance, etc...
Re:Extra safety (Score:5, Insightful)
Have you ever played with sensors before? They aren't perfect and can give incorrect readings (depending on the type, they may not even be very accurate under the best of conditions). Which means software must be written to take those conditions into account, and usually to coordinate among different types of sensors. But that software is written by people, and may have bugs in it (in fact it certainly will), plus it may simply not cover all real-world situations perfectly.
In some cases where people would have a tough time driving, the cars may do awesomely, but in cases where people would have little trouble the cars may behave strangely as sensors give odd readings, etc.
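A toy sketch of that cross-checking: fuse two noisy range readings by inverse-variance weighting and flag a fault when they disagree too much. The sensor values, variances, and tolerance are all illustrative:

def fuse(r1: float, var1: float, r2: float, var2: float) -> float:
    """Inverse-variance weighted average of two readings."""
    w1, w2 = 1 / var1, 1 / var2
    return (w1 * r1 + w2 * r2) / (w1 + w2)

def sensors_disagree(r1: float, r2: float, tolerance_m: float = 2.0) -> bool:
    return abs(r1 - r2) > tolerance_m

lidar_m, radar_m = 20.1, 20.6   # range to the same obstacle
if sensors_disagree(lidar_m, radar_m):
    print("sensor fault: fall back to safe behavior")
else:
    print(f"fused range: {fuse(lidar_m, 0.04, radar_m, 0.25):.2f} m")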
My phone has an incredible processor in it and can handle millions of calculations per second, but it still locks up sometimes, occasionally responding seconds later to all the stored input. Isn't that pretty close to being distracted?
Don't get me wrong, I want a self driving car so badly it hurts sometimes, but I don't expect it to be perfect. And if that's what people expect they are in for a world of disappointment and pain. And my fear is that will mean people panic at the first accident and push back against allowing them at all.
Re: (Score:2)
Have you ever used your eyes? They aren't perfect and can give incorrect readings (like if you're in any way distracted, out of focus, or a million other things)
Re:Extra safety (Score:5, Interesting)
Phones do not run realtime operating systems. Embedded vehicle components do, and are based on operating systems designed from the ground up to ensure that when a process is going to get its quantum of time, it gets it. No app in the background hogging the CPU, no blocked I/O from a long write to flash, no fooling around like on a general-purpose OS.
Of course, automotive/marine critical stuff is expensive, but you get what you are paying for. This isn't just another x86 processor that is running Windows and autostarting a "run engine" app.
A self-driving car will most likely have the circuitry on a dedicated processor using CANbus.
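A toy sketch of the property being described: a fixed-period control loop that checks its own deadline. Python is not a realtime environment, so this only illustrates the idea an RTOS actually guarantees; the period and iteration count are arbitrary:

import time

PERIOD_S = 0.010  # 10 ms control period (illustrative)

def control_step() -> None:
    pass  # read sensors, compute actuation, write a CAN frame

deadline = time.monotonic()
for _ in range(5):
    control_step()
    deadline += PERIOD_S
    slack = deadline - time.monotonic()
    if slack < 0:
        print("deadline miss -- an RTOS would flag this as a fault")
    else:
        time.sleep(slack)
print("loop done")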
Phones do have an RTOS inside (Score:2)
Re: (Score:3, Informative)
My phone has an incredible processor in it and can handle millions of calculations per second, but it still locks up sometimes, occasionally responding seconds later to all the stored input. Isn't that pretty close to being distracted?
No. The processor does not "lock up". The software does. We know how to write software that will not do that. In the market for toys (cell phones, DVD players, PC applications), time to market is more important than correctness. When writing software for serious applications (airplane control, the embedded systems in medical devices, mainframe OSes) people take the time to do things correctly. Reliability is not perfect, but it is several orders of magnitude better than your cell phone. A human drive
Re: (Score:2)
You must never have played a game with AI.
Here is a hint: we are actually very bad at creating smart machines. A 9 YO would be a more intelligent driver than most supercomputers.
Re: (Score:2)
Re:Extra safety (Score:4, Interesting)
We had a table we could move by motor, in x and y. The motors were connected to an analog computer (yes, that old; an EAI 580, IIRC). Even that primitive computer could balance a broom on the table, moving it anywhere we told it to go, keeping the broom upright (or we could move it by nudging a bit on the broom, kinda like how you operate a Segway). For all practical purposes, this was the predecessor to the Segway.
Not a one of us in the class could balance that thing as well as the computer ( when properly programmed ) could.
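The broom trick is essentially an inverted pendulum under PD control: push the table toward the lean, proportionally to the lean angle and its rate. A toy simulation with made-up gains and simplified physics (not the EAI 580 setup):

import math

KP, KD = 40.0, 8.0          # made-up controller gains
G, LENGTH, DT = 9.81, 1.0, 0.01
theta, omega = 0.1, 0.0     # initial lean (rad) and lean rate

for _ in range(300):        # simulate 3 seconds
    accel = KP * theta + KD * omega              # table pushes toward the lean
    # Simplified broom: gravity tips it over, table acceleration rights it.
    alpha = (G / LENGTH) * math.sin(theta) - accel / LENGTH
    omega += alpha * DT
    theta += omega * DT

print(f"lean after 3 s: {theta:.4f} rad")        # ~0 when balanced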
Now, where I think the computer is going to have a helluva problem is in dealing with people, who do the damnedest things, like suddenly opening car doors or stepping out into traffic before even looking. Think kid on skateboard. Human intuition leads us to suspect human carelessness whenever we see certain behaviours, especially kids playing nearby or a freshly stopped or entered car. If the computer treated everything as behaving as erratically as a human behaves, it would have to drive extremely slowly to make up for its lack of insight into which car door is likely to spontaneously open right into oncoming traffic. Even fully alert humans often find this situation unsolvable, with a collision the inevitable result.
One thing the computer has going for it is much faster response times. It would probably be able to bring the car to a complete stop much faster than a human could, leaving the human driving the car behind the robot car with quite a predicament on his hands. Tailgater beware! We will probably soon see lots of snub-nosed BMWs.
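A worked version of the tailgater point: total stopping distance is reaction distance plus braking distance. The reaction times and the deceleration figure are illustrative assumptions:

def stopping_distance_m(speed_kmh: float, reaction_s: float,
                        decel_ms2: float = 7.0) -> float:
    """Reaction distance plus braking distance at constant deceleration."""
    v = speed_kmh / 3.6
    return v * reaction_s + v * v / (2 * decel_ms2)

print(f"human at 100 km/h: {stopping_distance_m(100, reaction_s=1.5):.0f} m")
print(f"robot at 100 km/h: {stopping_distance_m(100, reaction_s=0.1):.0f} m")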
Now, if all the cars were under computer control, aka "the left hand knows what the right hand is doing", this looks quite doable. Having to deal with humans, and our completely illogical algorithms, should be enough to drive anyone trying to design a computer algorithm to accommodate them completely buggy.
Re:Extra safety (Score:5, Insightful)
Perhaps, but a 9 YO that is paying attention is probably a better driver than most people out there.
Re:Extra safety (Score:5, Informative)
You must never have played a game with AI.
Here is a hint: we are actually very bad at creating smart machines. A 9 YO would be a more intelligent driver than most supercomputers.
In relation to the present discussion, I'd have to say that Google's driver-less cars pretty much put the lie to that statement.
In August 2012, Google announced [wikipedia.org] that they had completed over 300,000 autonomous-driving miles accident-free, and typically have about a dozen cars on the road at any given time. Not explicitly stated in their announcement [blogspot.hu] was how often the driver had to take command.
Further, the summary above may be wrong, because the Nevada law also acknowledges that the operator will not need to pay attention [wikipedia.org] while the car is operating itself, which implies the State has no reasonable expectation of holding the driver responsible for accidents.
Re: (Score:3, Interesting)
This is my favorite Youtube video showing the driverless Google car in action:
http://www.youtube.com/watch?v=2w-Fd2JbgGA [youtube.com]
Human drivers will be obsolete in 5-10 years, tops.
Re: (Score:3)
Google needs to test their system in environments with aggressive/dickhead drivers around it. No, not the SoCal 4-lane emergency-exit lane-changers (which is bad enough). More like Phoenix, AZ drivers, or the SCORE-wannabe truck drivers, or the ricer-racers, or the numb octogenarians in their Crown Vics/Lincoln Continentals, or the wannabe Marios in their Escalades/Range Rovers/G500s, etc.
Try it in Cairo or Delhi, worse than anything in the western world.
Re: (Score:2)
It doesn't take intelligence to drive, it takes coordination and decentish reflexes. A dragonfly could do it if it could reach the pedals.
Re:Extra safety (Score:5, Insightful)
If by "gets distracted" you mean "is an entitled narcissist" then I agree. Robots will take the deadliest thing out of the driving equation, ego.
Not ego (Score:4)
Ego isn't the deadliest thing in the driving equation, by far. Even though a lot of drivers think of themselves as "god on wheels", that doesn't mean that is what kills the most people. Some good contenders are:
1) Cheapskating. Cars could be much safer if we were willing to pay a lot more for them, but we never buy the safest car we can afford. This results in manufacturers not making cars as safe as possible, but only complying with minimal requirements and matching the other makers in safety tests. Saab went bankrupt making safe cars, Volvo got sold to the Chinese, and SUVs and trucks with bad safety records get sold by the millions.
2) Bad habits, like texting, phone calls, or doing make-up while driving, or drinking or drugging up before driving. We all know that those things are a fatal distraction but we still do them. Narcissism or ego isn't a factor here; it's the plain bad grasp of statistics of the people doing things like this.
3) Economics. If we only let the best drivers get licenses, there'd be far fewer accidents, but the economy would fail because nobody would be able to get to work and such. The reality is that we let anyone who's not a complete death-on-wheels get a license to control a motorized vehicle. If we didn't, the economy as we know it and created it would not be able to function. Take the top 20% of drivers and let them keep their licenses: you'd have far fewer accidents, even relative to the number of drivers, you'd have no traffic jams, less smog, cheap fuel; everything bad about driving cars would probably be solved. Unfortunately, there would also be no economy left, so it can't be done.
Re: (Score:2)
Humans are still better at detecting sensor glitches.
Re: (Score:3)
No, they're not. It's software's job to determine sensor inconsistencies.
http://en.wikipedia.org/wiki/Air_France_Flight_447 [wikipedia.org]
The aircraft crashed following an aerodynamic stall caused by inconsistent airspeed sensor readings, the disengagement of the autopilot, and the pilot making nose-up inputs despite stall warnings, causing a fatal loss of airspeed and a sharp descent. The pilots had not received specific training in "manual airplane handling of approach to stall and stall recovery at high altitude"; this was not a standard training requirement at the time of the accident.[8][1][9]
The reason for the faulty readings is unknown, but it is assumed by the accident investigators to have been caused by the formation of ice inside the pitot tubes, depriving the airspeed sensors of forward-facing air pressure.[10][11][12] Pitot tube blockage has contributed to airliner crashes in the past – such as Northwest Airlines Flight 6231 in 1974 and Birgenair Flight 301 in 1996.[13]
Re: (Score:3)
No robot or computer is any better than the programmer who programmed it.
A commonly said bit of nonsense.
Deep Blue beat the world's greatest chess player in a series of games. It was thus not only better at chess than its programmers, it was better than any human.
Similarly, Watson proved itself better than the best players of Jeopardy! And certainly better than its programmers.
When you find a programmer(s) who can foresee and work around ALL possible problems, then come back and talk about a robot being a better driver than a human.
Programmers don't have to do that. That's not the way to make AI.
Computer drivers are already on a par with humans. Another couple of decades and there will be no comparison. The computer driver will be far better.
Re: (Score:3)
But humans are also not exactly perfect at reacting to the unexpected. So why dismiss an automatic car that is not perfect, but may still be better than many human drivers?
Re: (Score:2)
When a person does something a little stupid or shows a bit of poor judgement that results in a collision, a jury can relate a little to that (they are, after all, human, and realize that the glare of oncoming headlights may have caused you to not see the stop sign) and they are likely to be a little more forgiving.
When a machine does something a "little stupid", it's likely to be something a non-techie human on a jury can't relate to as well ("What do you mean the computer couldn't tell the difference between a shrub and a kid?").
Re: (Score:3)
Wait, even your contrived case makes no sense.
Google cars know there is a stop sign there because Google's driverless test cars have about $150,000 in equipment, including a $70,000 LIDAR (laser radar) system, plus a Velodyne 64-beam laser range finder mounted on top. They also have detailed maps on board (probably a lot more detailed than Google Maps).
Second, if a shrub were in the middle of the road, the car would avoid it. If the shrub moved, the car would immediately stop.
Your point is probably correct
Re: (Score:2)
I was thinking of a case where there is an intersection without traffic control. Shrub kid is standing on the corner waiting to cross the street, but GoogleCar thinks it's a (perhaps recently planted, so not in the database) stationary shrub (perhaps a jade plant with a lot of water in it). The kid, being a pedestrian at an intersection, has the right of way and takes it -- but
Re: (Score:2)
Sometimes, however, there's no option. Hit the dog (or is it a small child?) that darted out, or plow into the parked vehicles (or is that a parked vehicle with someone standing by the driver's door talking on their cell phone?) to avoid it? I'm trusting the human on that one (both in the identification and in determining the "cost" of each decision) more than software. I'm pretty
Re:Extra safety (Score:5, Interesting)
The human component is just there in case something unexpected happens on the road that self-driving cars may not be able to react to in time. While such a disaster scenario may be rare, the possibility isn't 0%, which is why you need someone who is able to drive.
It's also possible that relieving the driver of the drudgery of driving during the vast majority of uneventful rides will actually deprive him of the instinctual familiarity that would allow him to react correctly in those marginal cases. That is, the purpose of keeping a human being in the loop just for disaster scenarios might be self-defeating if the driver does not possess the experience to best resolve the situation.
Re:Extra safety (Score:5, Insightful)
On the other hand, with humans, each human has to learn how to correctly deal with situations individually. With computer drivers, they ALL learn from one mistake.
Re: (Score:3)
That is, the purpose of keeping a human being in the loop just for disaster scenarios might be self-defeating if the driver does not possess the experience to best resolve the situation.
Even the current law in Nevada acknowledges that the operator will not need to pay attention while the car is operating itself. So it seems clear to me that Google has convinced the regulators that having a human in the loop is NOT necessary, and perhaps as you suggest, self defeating.
My wife, even from the back seat, would be overriding the computer on a minute-by-minute basis.