
How Do You Give a Ticket To a Driverless Car? 337

FatLittleMonkey writes "New Scientist asks Bryant Walker Smith, from the Center for Internet and Society at Stanford Law School, whether the law is able to keep up with recent advances in automated vehicles. Even states which have allowed self-driving cars require the vehicles to have a 'driver,' who is nominally in control and who must comply with the same restrictions as any driver such as not being drunk. What's the point of having a robot car if it can't drive you home from the pub while you go to sleep in the back?"
  • by Anonymous Coward on Monday December 24, 2012 @08:20PM (#42384765)
    So what you're saying is, cars don't commit DUI's... car drivers do. Bizarre thinking.
  • by SternisheFan ( 2529412 ) on Monday December 24, 2012 @08:39PM (#42384837)

    There would be cases where the car's owner would deserve the ticket: busted lights, missing first-aid kit, no winter tires, and so on. So give the ticket to the car's owner, then have the manufacturer reimburse the owner if the fault lay with the 'driver'.

    Devil's advocate here. For insurance/liability reasons, shouldn't the car refuse to operate unless it's in 100% safety compliance? If it does, then it would be the manufacturer that would be liable. A car should sense when maintenance is required and, if it's prudent to, drive itself to the repair shop.

  • by Todd Knarr ( 15451 ) on Monday December 24, 2012 @08:52PM (#42384885) Homepage

    How does it know what the speed limit is on a particular stretch of road? And what happens when the city changes the posted limit (e.g. for construction work) and the car's database isn't updated? Since the car "knows" the speed limit is 55 there, it's going to go 55 even though the posted limit is 25.
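One way to frame the conflict in that scenario: the car has to reconcile its stored map limit with whatever sign its camera actually reads. A minimal sketch, with all names hypothetical:

```python
# Sketch of the stale-database problem: prefer a freshly observed posted
# sign over the map database, which may not reflect temporary limits.
# All function and parameter names here are hypothetical.

def effective_speed_limit(map_limit_mph, observed_sign_mph=None):
    """Return the limit the car should obey: a sign the camera just read
    wins over the (possibly stale) stored map value."""
    if observed_sign_mph is not None:
        return observed_sign_mph
    return map_limit_mph

# Database says 55, but a construction sign posts 25:
assert effective_speed_limit(55, observed_sign_mph=25) == 25
# No sign visible: fall back on the database value.
assert effective_speed_limit(55) == 55
```

Of course, the hard part in practice is reliably reading the sign at all; this only shows which source should win once both are available.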

  • Re:Extra safety (Score:5, Interesting)

    by Wrath0fb0b ( 302444 ) on Monday December 24, 2012 @09:04PM (#42384947)

    The human component is just there in case something unexpected happens on the road that self-driving cars may not be able to react to in time. While such disaster scenarios may be rare, the possibility isn't 0%, which is why you need someone who is able to drive.

    It's also possible that relieving the driver of the drudgery of driving during the vast majority of uneventful rides will actually deprive him of the instinctual familiarity that would allow him to react correctly in those marginal cases. That is, the purpose of keeping a human being in the loop just for disaster scenarios might be self-defeating if the driver does not possess the experience to best resolve the situation.

  • Re:Extra safety (Score:5, Interesting)

    by Anonymous Coward on Monday December 24, 2012 @10:44PM (#42385285)

    Phones do not run realtime operating systems. Embedded vehicle components do, and are based on operating systems designed from the ground up to ensure that when a process is due its quantum of time, it gets it. No app in the background hogging the CPU, no blocked I/O from a long write to flash, no fooling around like on a general-purpose OS.

    Of course, automotive/marine critical stuff is expensive, but you get what you are paying for. This isn't just another x86 processor that is running Windows and autostarting a "run engine" app.

    A self-driving car will most likely have its control circuitry on a dedicated processor communicating over a CAN bus.
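As a toy illustration of that determinism, here is a fixed-period control loop with absolute deadlines. This is a minimal sketch only; a real ECU would use a preemptive RTOS, not Python, and all names here are hypothetical.

```python
# Toy sketch of a fixed-period control task with absolute deadlines.
# A real ECU would enforce this with RTOS priorities, not cooperative sleeps.
import time

def run_control_loop(step_fn, period_s, n_cycles):
    """Run step_fn once per fixed period, sleeping away only the slack
    remaining before the next absolute deadline. Using absolute deadlines
    (rather than sleeping a fixed amount) keeps timing error from
    accumulating across cycles."""
    deadlines_missed = 0
    next_deadline = time.monotonic()
    for _ in range(n_cycles):
        step_fn()
        next_deadline += period_s
        slack = next_deadline - time.monotonic()
        if slack > 0:
            time.sleep(slack)
        else:
            deadlines_missed += 1  # an RTOS would flag this as a fault
    return deadlines_missed

# Trivial no-op task at a 10 ms period:
missed = run_control_loop(lambda: None, period_s=0.01, n_cycles=10)
```

On a general-purpose OS, `missed` can be nonzero under load, which is exactly the grandparent's point about why this work goes on dedicated hardware.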

  • Re:Extra safety (Score:4, Interesting)

    by anubi ( 640541 ) on Monday December 24, 2012 @11:00PM (#42385341) Journal
    I remember a control systems class at university.

    We had a table we could move by motor, x and y. The motors were connected to an analog computer (yes, that old: an EAI 580, IIRC). Even that primitive computer could balance a broom on the table, moving it anywhere we told it to go while keeping the broom upright (or we could steer it by nudging the broom a bit, kind of like how you operate a Segway). For all practical purposes, this was the predecessor to the Segway.

    Not a one of us in the class could balance that thing as well as the computer (when properly programmed) could.
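The balancing act described above is, at heart, feedback control. A minimal sketch, assuming a linearized pendulum model and illustrative PD gains rather than anything from the actual EAI 580 setup:

```python
# PD controller stabilizing a linearized inverted pendulum ("broom") on a
# moving base. Constants and gains are illustrative assumptions only.

g, L = 9.81, 1.0        # gravity (m/s^2), broom length (m)
dt = 0.01               # integration step (s)
kp, kd = 30.0, 8.0      # PD gains on the tilt angle

theta, omega = 0.1, 0.0  # initial tilt (rad) and angular rate (rad/s)
for _ in range(2000):    # simulate 20 seconds
    # Base acceleration commanded by the controller (the "motor nudge").
    a = kp * theta + kd * omega
    # Linearized dynamics: gravity tips the broom over; moving the base
    # underneath it fights the tilt.
    alpha = (g * theta - a) / L
    omega += alpha * dt
    theta += omega * dt

print(abs(theta))  # tilt decays to essentially zero
```

With kp above g, the closed loop is stable and the tilt dies out exponentially, which is why even a 1970s analog computer could out-balance every student in the room.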

    Now, where I think the computer is going to have a helluva problem is in dealing with people, who do the damndest things, like suddenly opening car doors or stepping out into traffic before even looking. Think kid on skateboard. Human intuition leads us to suspect human carelessness whenever we see certain behaviours, especially kids playing nearby or a car that has just stopped or been entered. If the computer treated everything as behaving as erratically as a human does, it would have to drive extremely slowly to make up for its lack of insight into which car door is likely to spontaneously open into oncoming traffic. Even fully alert humans often find this situation unsolvable, with a collision the inevitable result.

    One thing the computer has going for it is much faster response times. It would probably be able to bring the car to a complete stop much faster than a human could, leaving the human driving the car behind the robot car with quite a predicament on his hands. Tailgater beware! We will probably shortly see lots of snub-nosed BMWs.
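The response-time advantage is easy to put numbers on. A rough sketch, assuming a 1.5 s human perception-reaction time and a 0.1 s sensor-to-brake latency for the robot (both illustrative figures, not measured ones):

```python
# Rough stopping-distance arithmetic behind the tailgating point:
# d = v * t_react + v^2 / (2a). The braking term is identical for both
# cars; only the reaction-time term differs.

v = 30.0                   # speed, m/s (~67 mph)
a = 0.7 * 9.81             # assumed braking deceleration, m/s^2
braking = v**2 / (2 * a)   # distance covered while actually braking

def stopping_distance(reaction_s):
    return v * reaction_s + braking

human = stopping_distance(1.5)   # assumed human perception-reaction time
robot = stopping_distance(0.1)   # assumed sensor-to-brake latency
print(round(human - robot, 1))   # extra meters the human needs: 42.0
```

At highway speed, the reaction-time gap alone is about ten car lengths, which is the tailgater's predicament in a nutshell.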

    Now, if all the cars were under computer control, aka "the left hand knows what the right hand is doing", this looks quite doable. Having to deal with humans, and our completely illogical algorithms, should be enough to drive anyone trying to design a computer algorithm to accommodate it completely buggy.
  • by xQx ( 5744 ) on Tuesday December 25, 2012 @12:11AM (#42385565)
    I really don't know why this is a difficult question, I see a simple law that solves the problem:

    1. If there is an occupant in the car who holds a local driver's license, they are required by law to sit in the driver's seat, and they are responsible whether or not the car is on autopilot.

    2. If there is an occupant in the car who is unlicensed or incapable of driving, they must not sit in the driver's seat, and rule 3 applies.
    (i.e. this is what you do when you are drunk)

    3. If there is no occupant in the car (e.g. the car is driving itself to pick you up), the owner of the car is responsible as if they were driving.
    (i.e. if your car kills someone because Sergey programmed it wrong, you go to jail. You knew this was the law when you purchased the car and sent it off on its own, so don't bitch about it.)

    4. For civil claims (that is, if someone is seeking money from you in damages), and it is proven that the software was at fault, then the liability is joint and several. (i.e. the person who is suing can take you for what you are worth, and take Google for what they are worth)

    This is easy for lawmakers because there is always someone in their jurisdiction who is liable for the car, and as the owner, you need to trust that the software works. If you don't trust it, don't buy one.
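For what it's worth, the four rules above are mechanical enough to encode as a decision procedure. A sketch, where the categories and labels are my own shorthand rather than legal terminology:

```python
# Encoding the parent's four rules as a decision procedure.
# Labels like "owner" are informal shorthand, not legal terms.

def liable_party(occupant_present, occupant_licensed, software_at_fault=False):
    """Return (primary responsible party, list of civil defendants).
    Rules 1-3 pick the primary party; rule 4 adds the manufacturer as a
    joint-and-several civil defendant when the software is at fault."""
    if occupant_present and occupant_licensed:
        primary = "licensed occupant in driver's seat"   # rule 1
    else:
        primary = "owner"                                # rules 2 and 3
    civil = [primary, "manufacturer"] if software_at_fault else [primary]
    return primary, civil

# Drunk owner riding in the back while the car drives:
assert liable_party(True, False)[0] == "owner"
```

The point of the scheme is visible in the code: every branch terminates at someone inside the jurisdiction.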
  • Re:Extra safety (Score:3, Interesting)

    by TooMuchToDo ( 882796 ) on Tuesday December 25, 2012 @12:25AM (#42385615)

    This is my favorite Youtube video showing the driverless Google car in action: []

    Human drivers will be obsolete in 5-10 years, tops.

  • by Anonymous Coward on Tuesday December 25, 2012 @05:07AM (#42386167)

    This is a very difficult question because assignment of liability is something that Insurance Companies fight over all the time. I have concerns with each of the points you have raised. In the order that you first labeled them:

    1. A car is placed on autopilot precisely so that the driver can be inattentive. Aircraft autopilots, the closest current legal analog, can switch off and alert the pilot to a problem seconds in advance; the pilot can then respond to the malfunction quickly, but it does not require split-second timing. Malfunctions of car autopilots will have no such margin of time to correct the error before an obstacle is struck. Thus, the autopilot is functionally useless.

    2. The case of the non owner or non licensed occupant is a variation of the problems in the first law you postulated. The liability for any accident is transferred to the occupant of the vehicle and they must be as attentive as drivers in the modern day with their cruise control activated. Again, the autopilot is useless.

    3. If the car strikes someone while nobody is inside, then this opens up a whole new can of worms. Owners are typically liable for damage caused by their property only as a result of their own gross negligence. A car on autopilot is essentially out of the control of the owner. The closest modern example I can think of is if someone steals your car and then strikes someone with it: the driver is then liable, not the owner. Your third law would transfer any liability for manufacturing defects to the owner, which enormously increases your exposure in cases where your car is involved in an accident with no additional witnesses.

    4. Proving a manufacturing defect in court when you are arguing against lawyers who are on salary is a bankrupting proposition. Civil cases are never about the search for the truth. They are about the fast acquisition of damages. Your benefits would be exhausted quickly even if your insurance will cover civil litigation and your insurance company goes to bat for you. Then you will just be crushed by the legal fees thereafter.

    The driverless car is a fascinating technical problem but it will expose all involved in the project to incredible legal risk. Insurance companies will recognize this and I suspect that the driverless cars will be nearly uninsurable except by the independently wealthy.

  • by iMactheKnife ( 2556934 ) <> on Tuesday December 25, 2012 @01:43PM (#42388585)

    What cop? An automated speed trap camera gives a ticket to an autonomous car. The passenger is not in control. One of the two automated systems is in error. Is there any kind of justice involved here at all? The entire concept of justice implies some sort of free will to make a choice of good vs bad decision. There is no operating free will here. What will a rational judge do? He'll assign it to a debugging group to determine liability, if any.

    I can see it now: the road maintenance robots lower the speed limit to 25 on a stretch of road. Their comm access is not working, so the highway comm net does not update the vehicle's GPS system, which thinks this is a 55 MPH zone. Traffic all rolls by at 55. They all get tickets for speeding. The unions call for a boycott on road maintenance, which causes more 'bots to be purchased. Politicians pass a law mandating fines for road crews that do not post accurate speed limits, a standards body to determine safe limits, and a mandate to cops to enforce them. Every so often there is a snafu and a huge pile-up on the highway. People decide to learn to drive again and my old Ford becomes a concours antique.
