
Toyota To Let People Ride In Self-Driving Prius

fergus07 writes "Toyota is set to show an autonomous Prius at the Tokyo Motor Show. Dubbed the Toyota AVOS (Automatic Vehicle Operation System), the car will be available for members of the public to take 'back seat' rides at the show, demonstrating firsthand how the Prius can avoid obstacles, be summoned from a parking garage, and park itself."

  • by G3ckoG33k ( 647276 ) on Monday November 21, 2011 @05:29PM (#38129022)

    First self-driving crash - who to blame, or sue?

    The developers? The owner? Toyota?

    Class action rush hour on Route 66?

  • by M0j0_j0j0 ( 1250800 ) on Monday November 21, 2011 @05:36PM (#38129114)
    I hate to agree with you, but I think it's true: no one will tolerate a self-driving car crash, even if it's just one. Even trains collide head-on from time to time, something we think should be impossible. Being charitable, let's assume one of those cars crashes and it's another driver's fault, not a clear-cut case, but still his fault. What are the makers going to do, defend themselves with system logs?
  • by Anonymous Coward on Monday November 21, 2011 @05:37PM (#38129136)

    Under current law, the person behind the wheel in the driver's seat is considered the operator and is liable for whatever the vehicle does. The owner's liability (assuming they weren't driving) depends on their insurance, and the fact that the vehicle is autonomous is irrelevant. The developers, assuming they haven't signed an unprecedented and ill-advised employment contract, have no personal liability. Toyota could only be found liable if it were proven that a defect in the vehicle caused the crash.

    The simple fact is, before autonomous cars really become commercially viable, a lot of laws have to change, mainly around the liability of the manufacturer, since they're taking on more responsibility. Most likely, though, the operator will retain the majority of the liability, and we're unlikely to see in our lifetimes a car where you can punch in a destination and take a nap. It'll be more like an advanced cruise control: the operator still has full control, is required to keep hands on the wheel and attention on the road at all times, and is responsible for intervening in the case of an emergency.

  • by timeOday ( 582209 ) on Monday November 21, 2011 @05:43PM (#38129234)
    I don't think this is as big a deal as people always fear. The person operating a machine normally takes responsibility for what it does under their direction. Nobody says, "that backhoe just dug a cellar," they say, "I dug a cellar" (even though 99.99% of the caloric expenditure was by the backhoe). Nobody says, "Excel just computed our monthly budget," they say, "I just worked out our monthly budget" (even if Excel did 99.99% of the calculations). Only when we're thinking into a future we don't yet understand does it seem like the machines will be making all these "intelligent" decisions. Once the machine is in hand and understood, we feel like we are making the decisions (even though the machine is actually making thousands every second, as with an airplane autopilot). Our perception of intelligence on the part of the machine disappears. Once we know what to expect from them we simply laugh at those who don't and assume they are idiots (pertinent example [snopes.com]). People even feel this way when working through human subordinates. "George Washington crossed the Delaware River." It doesn't mean he rowed the boat.
  • by danparker276 ( 1604251 ) on Monday November 21, 2011 @05:44PM (#38129250)
    Yet no one seems to care. 500 US troops die a year in the Middle East and it's a huge deal, while these are 35,000 deaths that could easily be avoided, and that's only in the United States. Yes, there will still be a few deaths, but probably 99% of the 35,000 will be avoided. Everyone should be forced to own one of these, considering how many pedestrians are run over. People will have to get over their own urge to drive a car fast, though.
  • by Hentes ( 2461350 ) on Monday November 21, 2011 @05:48PM (#38129316)

    If the biggest problem with this technology is who to sue, then I'm not worried about it.

  • by DanTheStone ( 1212500 ) on Monday November 21, 2011 @05:51PM (#38129350)
    This is actually very easy to deal with. The driver is still liable. The insurers estimate, based on the cars themselves, the expected crash rate for autonomous vehicles. They don't really care about individual situations; they care about overall numbers. They can choose how much to charge for an automated driver and how much for a human driver, and pay out when it fails, roughly as sketched below. It's really not a hard system. If autonomous vehicles are safer drivers, they will take over a lot faster due to significantly reduced insurance costs relative to human drivers.
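
    A minimal sketch of that actuarial logic, in Python, with crash rates and claim sizes invented purely for illustration (none of these numbers come from real loss data):

        # Hypothetical premium calculation: expected claim cost times an overhead loading.
        def annual_premium(crashes_per_year, avg_claim_cost, loading=1.3):
            return crashes_per_year * avg_claim_cost * loading

        human_rate = 0.04        # assumed crashes per driver-year for human drivers
        autonomous_rate = 0.01   # assumed (lower) rate for autonomous vehicles
        avg_claim = 12_000       # assumed average payout per crash, in dollars

        print(f"human driver:      ${annual_premium(human_rate, avg_claim):,.0f}/yr")
        print(f"autonomous driver: ${annual_premium(autonomous_rate, avg_claim):,.0f}/yr")

    If the autonomous crash rate really is lower, the premium gap falls straight out of the expected numbers, which is exactly the adoption incentive described above.
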
  • by Neil Watson ( 60859 ) on Monday November 21, 2011 @05:51PM (#38129352) Homepage

    So many people die from cars driven by people now that it could hardly be worse.

  • Re:About time! (Score:5, Insightful)

    by mrsquid0 ( 1335303 ) on Monday November 21, 2011 @05:53PM (#38129382) Homepage

    Self-driving cars are the way of the future. Why drive when you don't have to? Once people get over the fear of trusting the software they will realize that their time is far too valuable to waste driving.

  • by CaptSlaq ( 1491233 ) on Monday November 21, 2011 @05:57PM (#38129440)
    That assumes fail-safes are in place for malfunctioning sensors, something like the voting sketch below. As cheaply as some things are made these days, I find the promise of sufficient redundancy highly suspect.
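
    One common pattern behind such fail-safes, sketched here in Python with invented sensor values and thresholds (purely illustrative, not Toyota's actual design): read the same quantity from redundant sensors, take the median, and refuse to trust any of them when they disagree.

        from statistics import median

        def fused_range(readings_m, max_disagreement_m=0.5):
            """Return a trusted range in meters, or None meaning 'fail safe: slow down / hand off'."""
            valid = [r for r in readings_m if r is not None and 0.0 < r < 200.0]
            if len(valid) < 2:                       # too few healthy sensors to cross-check
                return None
            m = median(valid)
            if max(abs(r - m) for r in valid) > max_disagreement_m:
                return None                          # sensors disagree: trust none of them
            return m

        print(fused_range([12.1, 12.3, 11.9]))   # sensors agree -> 12.1
        print(fused_range([12.1, None, 35.0]))   # one dead, two disagree -> None (fail safe)
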
  • by Urban Garlic ( 447282 ) on Monday November 21, 2011 @06:06PM (#38129538)

    This seems like a pretty narrow concept of freedom. I'm kind of uncomfortable with self-driving cars myself; I have the control-freak instinct, and I currently drive a stick-shift mostly for that reason. But it really is pretty hard to argue against either the safety or the practicality of self-driving cars. I'm assuming that a self-driving car really is more like a taxi than a bus, in that if I decide halfway to my destination that I want a different destination, I can just make it so, and that will be that; and furthermore, if I want to take the scenic route down along the creek instead of the freeway, I can get that too.

    So, I can still pick my time of departure, my route, and my destination, and change my mind in mid-drive, only my freedom to operate the vehicle has been removed. Yeah, it bugs me a bit, but I don't know if I'm ready to die for it.

    And where's the line? In my city, it's hopelessly impractical (and maybe illegal) for me to ride a horse to and from work. Is that an unacceptable infringement on freedom of movement? Should I die for that one too?

  • by frosty_tsm ( 933163 ) on Monday November 21, 2011 @06:42PM (#38129994)

    That's the rational approach, but it's not how it'll be perceived.

    Americans have had their heads in the sand about driving deaths for years.

    James Bond: You'll kill 60,000 people uselessly.
    Auric Goldfinger: Hah. American motorists kill that many every two years.

  • by Anonymous Coward on Monday November 21, 2011 @07:29PM (#38130496)

    Right. I was surprised to learn that driving a car is half as deadly as being in the armed forces at a time when the US is involved in two wars. I had no idea.

  • Future headline (Score:5, Insightful)

    by sco08y ( 615665 ) on Monday November 21, 2011 @08:08PM (#38130782)

    The simple fact is, before autonomous cars really become commercially viable, a lot of laws have to change, mainly around the liability of the manufacturer, since they're taking on more responsibility. Most likely, though, the operator will retain the majority of the liability, and we're unlikely to see in our lifetimes a car where you can punch in a destination and take a nap. It'll be more like an advanced cruise control: the operator still has full control, is required to keep hands on the wheel and attention on the road at all times, and is responsible for intervening in the case of an emergency.

    Since we're doing predictions, I'm going to predict a future headline:

    "Study shows operator intervention responsible for causing or exacerbating majority of autonomous vehicle accidents."

  • by jpmorgan ( 517966 ) on Monday November 21, 2011 @08:23PM (#38130860) Homepage

    There is a legal principle... I don't remember the Latin, but a rough translation is 'the law is not stupid.' Legal decisions are made by judges, not bureaucrats or computers blindly following the rules. That's the essence of a common law system: the legal system is based on an understanding that reality is too complex to legislate completely, and judges have the authority to interpret how law is applied to reality as necessary. A literal interpretation is best if possible, but judges have leeway. Precedent then exists to ensure that the law, as actually applied, is consistent.

    So, I suspect that if you try just sitting in the passenger seat and get into an accident, the judge will determine that:
    1. You're still the operator.
    2. You're an idiot.

    And you'll probably get charged with dangerous driving too.

  • by CrimsonAvenger ( 580665 ) on Monday November 21, 2011 @09:26PM (#38131390)

    There are only so many jobs to be had, and when two stupid people have nine children, they've just created seven people who are more likely than the first two to be unemployed. Get a damn condom.

    Given the projected ratios of earners to SSA recipients in the next 50 years, those seven extras are going to be needed to keep the SSA from collapsing.

    Do remember that the SSA wasn't designed to operate with nearly as many recipients as workers supporting them. And our below-replacement birth rate, combined with increased life expectancy (I just read that a baby born this year in the West (USA, Canada, Western Europe) has about a 50% chance of reaching 100), will make that whole Social Security thing a real problem by and by.
