Selling Full Autonomy Before It's Ready Could Backfire For Tesla (arstechnica.com) 190

An anonymous reader quotes a report from Ars Technica: Tesla has an Autopilot problem, and it goes far beyond the fallout from last month's deadly crash in Mountain View, California. Tesla charges $5,000 for Autopilot's lane-keeping and advanced cruise control features. On top of that, customers can pay $3,000 for what Tesla describes as "Full Self-Driving Capability." "All you will need to do is get in and tell your car where to go," Tesla's ordering page says. "Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed." None of these "full self-driving" capabilities are available yet. "Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction," the page says. "It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval."

But the big reason full self-driving isn't available yet has nothing to do with "regulatory approval." The problem is that Tesla hasn't created the technology yet. Indeed, the company could be years away from completing work on it, and some experts doubt it will ever be possible to achieve full self-driving capabilities with the hardware installed on today's Tesla vehicles. "It's a vastly more difficult problem than most people realize," said Sam Abuelsamid, an analyst at Navigant Research and a former auto industry engineer. Tesla has a history of pre-selling products based on optimistic delivery schedules. This approach has served the company pretty well in the past, as customers ultimately loved their cars once they showed up. But that strategy could backfire hugely when it comes to Autopilot.

  • Considering that they are topping the table for car-ownership satisfaction rates, I don't think it's such a big issue. https://www.consumerreports.or... [consumerreports.org]

    • by Kristoph ( 242780 ) on Tuesday April 17, 2018 @05:55PM (#56455273)

      I happen to have a Tesla S and I do love the car.

      They are a joy to drive. The acceleration is second to none. The handling is great, almost on par with the (now deprecated) hydraulic steering on the BMWs I used to drive. The cabin technology is great. I also really like the fact that the car gets free updates that actually improve it on a continuous basis!

      I also feel extra pleased with my purchase because I get free EV charging in my office building, premium EV parking at most malls, and I just learned that Tesla will install a supercharger (which I also get to use for free) at my local mall. Heck, I even get premium parking and free power at Ikea! (I would get these with a lower-cost EV too, but they're really just a 'feel good' bonus.)

      The Autopilot is definitely misnamed. It's nowhere near autonomous. Right now it's more like a junior co-pilot. Would I like it to be better? Yes, I sure would. But I will say that once you know its limitations it is very helpful and - contrary to popular opinion - it does an excellent job in all weather conditions except heavy snow that covers the road. I'd say 70% of my daily commute is now handled by Autopilot.

      • I guess that's cool if you go to stores and like being at stores all.. the.. time. Me? I like to stop in a park when I stop. You keep drinking the kool-aid.
      • I'd say 70% of my daily commute is now handled by autopilot.

        How do you keep yourself focused while the car is on autopilot?

        • How do you keep yourself focused while the car is on autopilot?

          Usually by reading a book or posting on slashd0.,-;@
          no carrier.

      • by rtb61 ( 674572 )

        It does not matter what car makers do or do not do. Autodrive should not be sold until a set of standards has been set for what it can detect and how it should react to what it detects, and those standards should be set by regulation and followed. No roll-your-own experimentation; real-world standards that can be followed, that a consumer can rely on without having to drill down through the tech details. So, being an Australian: can the system detect a kangaroo approaching the vehicle when you are doing 1

        • by Rei ( 128717 )

          Autodrive should not be sold until a set of standards has been set, for what it can detect and how it should react to what it can detect and those standards should be set by regulation and followed.

          Do you apply this to everything that can control the steering wheel, accelerator and/or brakes? So, say, all TACC needs to go under your rules?

          • by rtb61 ( 674572 )

            What, my rules? "ISO is an independent, non-governmental international organization with a membership of 161 national standards bodies. Through its members, it brings together experts to share knowledge and develop voluntary, consensus-based, market relevant International Standards that support innovation and provide solutions to global challenges. You'll find our Central Secretariat in Geneva, Switzerland. Learn more about our structure and how we are governed." https://www.iso.org/home.html [iso.org]. Read my lips S

            • by Rei ( 128717 )

              I'll repeat my question: So, say, all TACC needs to go under your rules?

              • by rtb61 ( 674572 )

                How the hell would I know? I am not an international standards body. My guess is there would be an overriding autonomous vehicle standard with a whole slew of sub-standards tied to it, covering whatever they consider to be appropriate, this tying into other existing vehicle standards. I am just a commenter on Slashdot, not ISO, and I have no association with any standards body. I have just worked with them, hundreds of different ones, and appreciate their worth and benefit and the safety they provide the community

  • What happens when they start selling them and the courts find that all liability is with the software/hardware manufacturer? I guess it's nothing a quick Chapter 7 bankruptcy can't fix...
    • Re:Chapter 7? (Score:5, Informative)

      by ShanghaiBill ( 739463 ) on Tuesday April 17, 2018 @06:20PM (#56455403)

      What happens when they start selling them and the courts find that all liability is with the software/hardware manufacturer?

      Then the manufacturer will pay for insurance, rather than each individual paying for their own. It will just be built into the price of the car, but will be less expensive and more efficient than the current system because of better transparency and lower transaction costs.

      Consumers will win, since they will save money and hassle. Car manufacturers will win since "no-insurance-needed" cars will sell better. Insurance companies will lose, since they will be selling to informed manufacturers (who may opt to self-insure) rather than to confused consumers.

      • by zippo01 ( 688802 )
        How do you do that when the cars can also be driven in manual mode? What about comprehensive coverage, like when my window is broken out or my car is stolen? Non-driving issues and more. How do you handle that? How do you afford to pay for the car insurance up front? I don't think it is as simple as you think. Also, insurance has caps, which can be sued over and above. I have $300,000 in insurance per accident, but I can easily be sued for much more. When you have a company with deep pockets, you just keep suing. Insurance can
      • by bws111 ( 1216812 )

        What happens when an accident happens because the owner did not do proper maintenance (bad brakes, bald tires, mechanical failure, didn't take the vehicle in for updates, etc.)? The manufacturer isn't going to cover that.

        What happens when someone damages your car? The manufacturer isn't going to cover that.

        'No insurance cars' is a pipe dream.

        • What happens when an accident happens because the owner did not do proper maintenance

          Very few accidents are a result of deferred maintenance. Nearly all are from impaired driving and human error.

          bad brakes, bald tires, mechanical failure, didn't take the vehicle in for updates, etc)?

          Tesla brakes don't wear out. The automatic regen absorbs 95% of braking energy. The brake pads last the life of the car.

          Updates are OTA.

          There aren't really many mechanical parts to fail. The only moving parts in an electric motor are the bearings.

          Tires wear out faster, because of the fast acceleration, so that is an issue. But existing liability law already covers out-of-warranty abusive use.

          What happens when someone damages your car? The manufacturer isn't going to cover that.

          Why not?
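A back-of-envelope check on the regen claim above, taking the commenter's 95% figure as a given rather than a verified spec: if friction pads only ever see 5% of the braking work, their life stretches roughly 20x, which is the arithmetic behind "the brake pads last the life of the car."

```python
# Sketch of the pad-life arithmetic. The 95% regen fraction is the parent
# commenter's claim, used here purely for illustration.
REGEN_FRACTION = 0.95              # share of braking energy recaptured
pad_share = 1 - REGEN_FRACTION     # share left for the friction pads
life_multiplier = 1 / pad_share    # relative pad life vs. no regen at all

print(f"pads handle {pad_share:.0%} of braking -> ~{life_multiplier:.0f}x pad life")
```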

          • by zippo01 ( 688802 )
            Insurance has caps per claim, which can be sued over and above. I have $300,000 in insurance per accident, but I can easily be sued for much more. When you have a company with deep pockets, you just keep suing. Insurance can't cover the losses and it falls apart. The only reason you or I aren't sued is that lawyers know we don't have money. This is not as simple as you are making it.
  • Selling Full Autonomy Before It's Ready Could Backfire For Tesla

    Selling Anything Before It's Ready Could Backfire For Anyone

    Of course, now it looks more like an Onion headline, but that is, in itself, a hint...

    • by AmiMoJo ( 196126 )

      It's actually worse than TFA makes out. The original web page was promising it would take your kids to school for you. They have since been editing it to reduce the promised functionality, but anyone who bought it with the old text (and there are plenty who kept screenshots) is in a position to sue if they don't deliver it.

      Musk is saying 2020 now, by which time people who bought it shortly after launch will have had it for five years. Many of them will have reached the end of their lease periods by then. Many of the c

  • by MpVpRb ( 1423381 ) on Tuesday April 17, 2018 @05:23PM (#56455111)

    I also thought about how hard it would be to make a real autonomous vehicle that worked under all conditions

    Getting to 90% has been done. Good weather, good visibility, few unexpected hazards

    Getting to 95% will be harder, and it gets exponentially harder as you asymptotically approach 100%

    The billion dollar question is...How close is close enough?

    No matter how good it gets, someone will always sue, claiming it isn't perfect

    The law needs to be adjusted to accept the reality that nothing is perfect
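The asymptote point can be made concrete with a quick loop. A minimal sketch, treating each mile as an independent trial and borrowing the pessimistic ~125,000 miles-per-crash human figure quoted elsewhere in this thread (both simplifying assumptions, not measured data):

```python
# Each extra "nine" of per-mile reliability buys 10x more miles between
# failures, which is why the last few percent dominate the effort.
HUMAN_MILES_PER_CRASH = 125_000   # pessimistic human figure cited in this thread

for coverage in (0.90, 0.95, 0.99, 0.999, 0.9999):
    miles_per_failure = 1 / (1 - coverage)   # one "trial" per mile driven
    gap = HUMAN_MILES_PER_CRASH / miles_per_failure
    print(f"{coverage:.2%} reliable -> one failure per {miles_per_failure:,.0f} "
          f"miles ({gap:,.1f}x short of the human benchmark)")
```

Even 99.99% per-mile reliability still trails the quoted human crash rate by an order of magnitude, which is the sense in which the last fraction of a percent carries most of the cost.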

    • It only has to be as good as, or better than, humans... ...which aren't anywhere near 100%.

      • by AmiMoJo ( 196126 )

        No, Tesla sold it as "get out of your car and it parks itself". People won't be happy if they get an alert on their phone from the car pleading for help because it's stuck in a car park and doesn't know what to do and there is a long queue of human drivers behind it getting angry, can you please get an Uber and come save it?

        Informally, Musk promised you would be able to summon the car from the other side of the country. It's got to be damn near 100%.

      • It only has to be as good as, or better than, humans...

        For what? To improve highway safety? Perhaps, though that really depends on how automated vehicles handle being in traffic with human drivers.

        But they also have to be economically viable, and that requires not just that they be better than the average person, but that they be perceived to be better than the individual buying them. That's a much higher threshold, given that most people think they are better than the average driver. But people won't want to buy a vehicle that they think won't be as safe

    • by Ichijo ( 607641 )

      No matter how good it gets, someone will always sue, claiming it isn't perfect

      Tesla is betting that the $5,000 per vehicle that they are charging for the self-driving capability will bring in more revenue than the cost of lawsuits when the technology fails. Sometimes they win the bet and sometimes they lose the bet. This is a good thing because it gives them a financial incentive to improve their technology.

      Absolving them from liability because "nothing is perfect" only invites apathy and technological stagnation.
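A minimal sketch of that bet as an expected-value calculation. Every number except the $5,000 price (cited in the parent comment) is a made-up assumption for illustration:

```python
# Toy model of "revenue from selling FSD vs. cost of lawsuits when it fails."
# All rates and amounts below are invented assumptions, not Tesla figures.
vehicles_sold  = 100_000     # hypothetical fleet size
price_per_car  = 5_000       # per-vehicle charge cited in the parent comment
lawsuit_rate   = 0.0001      # assumed lawsuit-worthy failures per car sold
avg_settlement = 2_000_000   # rough midpoint of settlements cited in a reply below

revenue = vehicles_sold * price_per_car
expected_lawsuit_cost = vehicles_sold * lawsuit_rate * avg_settlement

print(f"revenue:               ${revenue:>13,}")
print(f"expected lawsuit cost: ${expected_lawsuit_cost:>13,.0f}")
print("bet pays off" if revenue > expected_lawsuit_cost else "bet loses")
```

With these made-up rates the bet pays off handsomely; crank the failure rate up two orders of magnitude and it flips, which is exactly the financial incentive to improve the technology that the parent describes.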

    • The billion dollar question is...How close is close enough?

      You only have to be SLIGHTLY better than the average human driver, and you start saving lives by getting marginal drivers off the road.

      Judging from how I've seen people drive around the world, 80% is PLENTY HIGH.

      I am serious; if you took some of the worst drivers today and gave them self driving cars with existing tech, you would be saving lives and reducing accidents.

      • I am serious; if you took some of the worst drivers today and gave them self driving cars with existing tech, you would be saving lives and reducing accidents.

        No, you wouldn't. The worst drivers frequently have their licenses pulled. The worst drivers are already taken off the road.

        The remaining drivers will average perhaps three to four accidents over their lifetime, with the odds of a fatality so small it's hardly a rounding error (source [forbes.com]). The average American drives 13,476 miles per year (source [dot.gov]). Figure on 40 years' worth of driving (giving *YOUR* argument the benefit of bias here), and we're looking at 500k miles with a non-fatal accident every 125k miles at worst
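The arithmetic behind those figures, using the comment's own cited numbers (the commenter rounds 539k miles down to 500k):

```python
# Reproducing the parent comment's back-of-envelope with its cited figures.
MILES_PER_YEAR     = 13_476   # average American, per the DOT source above
YEARS_DRIVING      = 40
LIFETIME_ACCIDENTS = 4        # upper end of the "three to four" estimate

lifetime_miles     = MILES_PER_YEAR * YEARS_DRIVING        # ~539,000 miles
miles_per_accident = lifetime_miles / LIFETIME_ACCIDENTS   # ~135,000 miles

print(f"lifetime miles:              {lifetime_miles:,}")
print(f"miles per accident, worst:   {miles_per_accident:,.0f}")
```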

      • by jeremyp ( 130771 )

        80% of what? 80% of trips? 80% of miles? 80% of different types of conditions?

        People drive every day and most have had only a tiny number of accidents. I think we do a lot better than 80%

      • You only have to be SLIGHTLY better than the average human driver, and you start saving lives by getting marginal drivers off the road.

        The stated condition has nothing to do with your presumed result.

        I am serious; if you took some of the worst drivers today and gave them self driving cars with existing tech, you would be saving lives and reducing accidents.

        Ah, and here we have the caveat needed for your result: someone else's money.

    • by Kjella ( 173770 )

      No matter how good it gets, someone will always sue, claiming it isn't perfect. The law needs to be adjusted to accept the reality that nothing is perfect

      Adjusted how? Self-driving cars have already killed their first pedestrian, and I don't see any manslaughter charges filed. Liability for damages could be, but you don't get infinite damages for a wrongful death even if it's due to faulty products or recklessness. Here, for example, is $750,000 [hminjurylaw.com] for a life. Here's $2.2 million [pendaslaw.com]. Here's $2 million [mnlawoffice.com]. Here's $950,000 [mercurynews.com]. Probably the most expensive one I saw that's actually settled is $9.5 million [latimes.com], not juries making crazy judgments that'll go on appeal. This review (pdf) [campbell.edu] across

    • by mileshigh ( 963980 ) on Tuesday April 17, 2018 @08:14PM (#56455853)

      The law needs to be adjusted to accept the reality that nothing is perfect

      The problem (traffic, roads, laws, standards) will get redefined to fit the new AI solution, same as when we transitioned from horses to cars. When you get enough autocars out there, the roads will begin to get engineered to mitigate the weaknesses of AIs, bit by bit. This is exactly what happened with cars: we created and adapted roads, laws, enforcement, etc. to match cars' needs and continue to do so. The Model T was high off the ground to deal with the rutted, muddy dirt roads (or no roads at all) it was likely to encounter. Today, we have aerodynamic skirts a few inches off the pavement for efficiency. Pavement--smooth pavement--is simply assumed.

      That's what always happens with any disruptive technology: we end up adapting everything, including ourselves, to meet it part way.

      An example of things that will probably be changed sooner rather than later: road construction zones will be required to implement certain protocols (signage, markers, notifying some central database, whatever) to make them easier for AIs to traverse. Failure to do so will entail liability for accidents.

      My guess is that true full autonomy will first roll out in a big way on certain long-haul trucking routes. Many freeways are a fairly clean, well-defined situation, and the prize for trucking companies is too big to ignore. Those parts of the chosen freeways that are problematic for the AIs will be upgraded, either through lobbying by large trucking firms or because those firms kick in some of their own money to make the changes happen sooner.
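The "notifying some central database" idea might look something like the sketch below. The message shape and every field name are invented for illustration; no such standard exists in the article:

```python
# Entirely hypothetical work-zone advisory a construction crew might publish
# to a central feed for autonomous vehicles to consume. Schema is invented.
import json

advisory = {
    "type": "work_zone",
    "road": "I-80 W",
    "begins_mile": 142.3,
    "ends_mile": 144.1,
    "lanes_closed": [2],                     # counted from the left shoulder
    "speed_limit_mph": 45,
    "valid_from": "2018-04-18T06:00:00Z",
    "valid_until": "2018-04-25T18:00:00Z",
}

print(json.dumps(advisory, indent=2))
```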

    • by eepok ( 545733 )
      Thank you for asking "How close is close enough?". It's the most important question in all of AV and here's the answer: "It has to be nearly perfect. Being 10x better than the current driving population is not good enough."

      We're near 40,000 road deaths per year in America. Now imagine that in 15 years, Tesla, Waymo, and Uber together took over the entire auto industry with Level 5 Autonomous Vehicles and replaced every vehicle on the road with their identically-performing AVs. In that first year, road d
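The parent's threshold argument in numbers (the 40,000 figure is from the comment; the 10x factor is its own hypothetical):

```python
# Even a 10x improvement leaves a striking yearly toll once machines,
# rather than humans, are the ones doing the killing.
US_ROAD_DEATHS_PER_YEAR = 40_000   # figure cited in the parent comment
IMPROVEMENT_FACTOR = 10            # "10x better than the current driving population"

print(f"road deaths per year with 10x-better AVs: "
      f"{US_ROAD_DEATHS_PER_YEAR // IMPROVEMENT_FACTOR:,}")
```

4,000 machine-caused deaths a year is the kind of number behind the "it has to be nearly perfect" answer.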
  • by Anonymous Coward on Tuesday April 17, 2018 @05:23PM (#56455117)

    The people who think it takes hard AI to achieve full autonomy in a self driving car vastly overestimate the cognitive abilities of human drivers. The computer does not need a complete and entirely correct model of the environment to be a better driver than a person. People are very easily overwhelmed by complex traffic situations and make tons of mistakes. Roads are designed to enable safe traffic regardless of these cognitive deficiencies. Computers can take advantage of that too.

    • by Mr D from 63 ( 3395377 ) on Tuesday April 17, 2018 @05:31PM (#56455153)

      The people who think it takes hard AI to achieve full autonomy in a self driving car vastly overestimate the cognitive abilities of human drivers.

      And others vastly underestimate the challenges of designing a system that can perform better than humans without human oversight. Particularly in an environment that was designed specifically for human drivers interacting with other human drivers. We have a very long way to go.

      • And others vastly underestimate the challenges of designing a system that can perform better than humans without human oversight

        And still OTHERS appear to be utterly ignorant as to the state of the art in self-driving car research and delivery.

        Kind of strange for a place like Slashdot to have so many people so very, very ignorant of technology.

        • by Mr D from 63 ( 3395377 ) on Tuesday April 17, 2018 @07:59PM (#56455783)

          And others vastly underestimate the challenges of designing a system that can perform better than humans without human oversight

          And still OTHERS appear to be utterly ignorant as to the state of the art in self-driving car research and delivery.

          Kind of strange for a place like Slashdot to have so many people so very, very ignorant of technology.

          Please explain the "state of the art" for all those ignorant people. You could be one of them as far as I know.

        • And others vastly underestimate the challenges of designing a system that can perform better than humans without human oversight

          And still OTHERS appear to be utterly ignorant as to the state of the art in self-driving car research and delivery.

          Kind of strange for a place like Slashdot to have so many people so very, very ignorant of technology.

          As I pointed out in my reply above, current SDC capability is orders of magnitude (two orders, to be exact) worse than the average human driver.

          We aren't ignorant of tech, we're just better at stats than you are.

    • by hazardPPP ( 4914555 ) on Tuesday April 17, 2018 @05:51PM (#56455259)

      The people who think it takes hard AI to achieve full autonomy in a self driving car vastly overestimate the cognitive abilities of human drivers. The computer does not need a complete and entirely correct model of the environment to be a better driver than a person. People are very easily overwhelmed by complex traffic situations and make tons of mistakes. Roads are designed to enable safe traffic regardless of these cognitive deficiencies. Computers can take advantage of that too.

      And people who make claims such as yours tend to forget that roads were designed for humans, not computers, and that some things that humans do very easily are very difficult for computers to do (and of course, vice-versa).

      If we had roads which were designed for computers, I am sure computers would very quickly outperform human drivers. The problem is - we don't.

      I believe that self-driving efforts are focused on the wrong thing: trying to reproduce a human driver, and claiming it's better if it, on average, messes up less than a human. Instead, the focus should be on trying to create an infrastructure that supports self-driving vehicles. Does that mean they will be able to drive on every imaginable road? Probably not...but likely on 95% of roads (e.g., all roads and streets in cities), once the roll-out is complete. In this case, it probably will be safer...but only if it satisfies a bunch of conditions first.

      The other problem that people who claim "humans are not that smart, computers are better than them on average" miss is the variability among humans. No human driver is identical; almost all make mistakes, but the types of mistakes that are made (and the situations which they are made in) can differ widely. However, software flaws are replicated identically across many units...potentially millions of them. Joe in Chicago being a bad driver does not affect Bob in Cleveland. An undetected bug in Tesla's Autopilot will affect all Tesla owners in the world potentially, and probably under the same (or very similar) conditions. You might be in a situation where you have a 100% probability of a screw-up under certain conditions. That is simply not the case with any human driver. Now imagine if say, 20% of the cars on the road are self-driving...what you are potentially setting things up for is a black swan type of event: software-driven cars may be much safer 99% of the time, but could be prone to major screw-ups 1% of the time that will dwarf the combined effects of bad human driving.

      Variability among human drivers also allows for evolutionary selection: reckless drivers will typically die, or have their licenses taken away from them. This does not remove all of the bad drivers - but it does remove a great deal of them over the long run. How do we do that with self-driving cars? The destruction of one self-driving car with a flaw will not end it, because all other cars of the same model and series likely have the same flaw. Removing just that particular car won't do it; you'd have to do a recall...now, recalls can already get quite bad. Imagine all of the recalls we're gonna have with self-driving vehicles (where the authorities are bound to be more paranoid)...and it won't all be software problems you can patch; there will be hardware problems too.

      I'm not saying these are all insurmountable problems. They can be addressed. I'm just saying there's a lot more to autonomous vehicles than the techno-optimistic "self-driving cars just need to cause fewer accidents on average than humans" stance. A lot more. It cannot be reduced to a single metric, or even a bunch of numeric metrics.
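The correlated-failure point lends itself to a small simulation. A minimal sketch with invented rates: human crashes are independent per-driver coin flips, while the software fleet shares one latent bug that, in rare years, bites a quarter of the cars at once:

```python
# Monte Carlo sketch of independent human errors vs. a correlated fleet bug.
# Every rate below is an illustrative assumption, not real crash data.
import random

random.seed(42)
FLEET          = 10_000
HUMAN_CRASH_P  = 0.01    # independent annual crash probability per driver
AV_CRASH_P     = 0.001   # 10x-better per-car rate while the software is sound
BUG_YEAR_P     = 0.01    # chance per year that a latent fleet-wide bug manifests
BUG_CRASH_FRAC = 0.25    # fraction of the fleet the bug crashes in a bug year

def crashes_human():
    return sum(random.random() < HUMAN_CRASH_P for _ in range(FLEET))

def crashes_av():
    if random.random() < BUG_YEAR_P:          # the black-swan year
        return int(FLEET * BUG_CRASH_FRAC)
    return sum(random.random() < AV_CRASH_P for _ in range(FLEET))

YEARS = 500
human = [crashes_human() for _ in range(YEARS)]
av    = [crashes_av() for _ in range(YEARS)]

print(f"human: mean {sum(human)/YEARS:.0f}, worst year {max(human)} crashes")
print(f"AV:    mean {sum(av)/YEARS:.0f}, worst year {max(av)} crashes")
```

The AV fleet looks far safer on average, but its worst year dwarfs anything the human drivers produce, which is the black-swan shape the comment describes.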

      • by havana9 ( 101033 )
        You could design "roads" for automatic drivers; they're called rails. You have automatic underground lines, and even regular trains have almost automatic control; maximum speed and braking are automatically controlled by signals. Of course human intervention is required for oddball situations, like a failing engine or unexpected cows on the track, but I suppose the money invested in self-driving lorries could be more successfully used for freight train infrastructure. By the way, the electric train
        • I totally agree. This is exactly where you see in which situations computers are superior to humans (e.g. achieving a reliable, consistent 90-second headway between fully loaded subway trains at rush hour) and what you need to achieve that (it's not just equipment on the train, but also on the rails, the signals, and in the stations). The electric train is a solved problem, but a battery-powered one could be useful on less frequently used rail lines where it doesn't make sense to electrify (and here it would displace d
    • by 110010001000 ( 697113 ) on Tuesday April 17, 2018 @06:05PM (#56455315) Homepage Journal
      It does take hard AI. The dumbest person is infinitely smarter than the best computer.
      • by AmiMoJo ( 196126 )

        This. Tesla sold the capability of the car to go find a parking space and then return when you summon it later. There are all sorts of edge cases that a human would be able to deal with easily, like poor road markings or someone else parked badly or road works, but which AI will struggle with.

        Considering they can't even get the system to drive straight in the middle of a lane yet (it's prone to ping-ponging between the lines), the current estimate of 2020 for this feature (that they have been selling since 2

    • While I agree that AI outdriving humans is considerably closer than we give it credit for, it runs up against simple human fallacies: normal panic, and the human tendency to sensationalize new causes of death while completely tuning out the deaths we are used to. In a world where the laws were written by people who understood statistics, and the media were more concerned with facts than views (or a world where accurate facts were the way to get views), self-driving vehicles would be at most 5-10 years from catching up to humans, and thus
  • by bobbied ( 2522392 ) on Tuesday April 17, 2018 @05:30PM (#56455145)

    No internal combustion engine and we are discussing a backfiring Tesla? How's that possible?

    (To you literalists... I'm making a joke.)

    • Oh, I dunno, Tesla's logo is that big letter 'T', which reminds me way too much of a Ford Model T, and with those, if you had the spark advance set too far forward, they'd backfire like you wouldn't believe, so maybe with a Tesla it's just 'sympathetic backfiring'. xD
  • by Jodka ( 520060 ) on Tuesday April 17, 2018 @05:47PM (#56455241)

    Why would anyone purchase non-existent driving software with their Tesla when, presumably, that feature could be purchased separately later, at a time when it really exists? You are only giving Tesla a free $3,000.00 loan for an indefinite period by ordering it now with the car.

    • Because people are really stupid. Really stupid. People loan the government money tax-free and celebrate when they get a "refund" on their taxes.
      • Because people are really stupid. Really stupid. People loan the government money tax-free and celebrate when they get a "refund" on their taxes.

        At least there's no risk of the government going bankrupt and you being stuck as an unsecured creditor.

    • Why would anyone purchase non-existent driving software with their Tesla when, presumably, that feature could be purchased separately later

      When my wife bought her Tesla, she was told it would cost much more than $3,000 to buy full autonomy later.

      • They told us it would be $4,000 if it was purchased later. Only a $1,000 difference.

        • "Only a $1000 difference". Christ.
          • by Corbets ( 169101 )

            "Only a $1000 difference". Christ.

            Oh, come on now. That may be a lot of money for you, but for anyone who’s buying a Model S or X, it’s a rather insignificant sum.

        • by dgatwood ( 11270 )

          They told us it would be $4,000 if it was purchased later. Only a $1,000 difference.

          The difference is that folks who pay the $3,000 have a locked-in price (unless Tesla goes bankrupt), whereas the people waiting to spend the $4,000 might find that in a few years, that price has gone up considerably. There's nothing contractually obligating Tesla to maintain the $4,000 price forever, as far as I can tell.

          • by jeremyp ( 130771 )

            On the other hand, they have got $3,000 now that they can spend on things that exist.

          • Conversely, those who paid $3000 now may find it difficult to get that money back if Tesla has financial problems and can't fulfill their promise.
        • by heson ( 915298 )
          Be fooled out of $3,000 (now) or $1,000 (in the far future). Decisions, decisions.
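A quick time-value-of-money check on the $3,000-now vs. $4,000-later choice in this thread. The discount rate and delivery year are assumptions, not anything Tesla has stated:

```python
# Present-value comparison of paying now vs. paying when the feature exists.
def present_value(amount, rate, years):
    return amount / (1 + rate) ** years

RATE = 0.05            # assumed annual discount rate
YEARS_UNTIL_REAL = 4   # assumed wait before the feature actually ships

pay_now   = 3_000
pay_later = present_value(4_000, RATE, YEARS_UNTIL_REAL)

print(f"pay now:                        ${pay_now:,}")
print(f"pay later, in today's dollars:  ${pay_later:,.2f}")
```

Under these assumptions the later price is about $3,290 in today's dollars: a thin premium for keeping your cash until the feature exists, and for avoiding the unsecured-creditor risk raised above.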
  • by cunina ( 986893 ) on Tuesday April 17, 2018 @05:48PM (#56455243)
    The Tesla Theranos Edition.
  • >> hugely (something)

    So is Trump an editor now?

    I've already heard some people use "did a Tesla" to mean "stupidly destroy oneself by stepping into an obviously dangerous situation," much like Tesla's car rammed a well-marked concrete barrier at full speed.
  • by Rick Schumann ( 4662797 ) on Tuesday April 17, 2018 @06:09PM (#56455331) Journal
    In fact, I think Tesla is probably the least of the offenders in this case; all (so-called) 'self-driving car' developers are rushing their 'product' to market because they've spent so much more money 'developing' it than they ever thought they'd need to, only to find that it's way, way more complex than they ever imagined, and now they need to start showing a profit or heads will roll. Therefore they expect us, the general public, to be their 'alpha testers' (not even BETA testers; the damned things aren't even that good).
    • by kriston ( 7886 )

      I drive a vehicle with the "Honda Sensing" option. Honda goes out of their way to make sure you don't rely on it. Lane keeping doesn't function under 40 MPH. Lane departure works above 30 MPH. Road departure works most of the time. Pre-emptive braking makes odd decisions but overall will keep you from rear-ending someone. Adaptive cruise control cuts out when you go slower than 20 MPH (my main complaint) with a loud, screaming tone. I hope someone finds a way to hack stop-and-go cruise control into H
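The speed gating described above can be summarized as a toy lookup. This is based solely on the parent commenter's account of their own car, not an official Honda specification:

```python
# Feature availability by speed, per the parent comment's description of
# "Honda Sensing." Thresholds are the commenter's recollection, not spec.
def available_features(speed_mph):
    features = ["pre-emptive braking"]        # described as always active
    if speed_mph >= 20:
        features.append("adaptive cruise control")
    if speed_mph >= 30:
        features.append("lane departure warning")
    if speed_mph >= 40:
        features.append("lane keeping")
    return features

for speed in (15, 25, 35, 45):
    print(f"{speed} mph: {', '.join(available_features(speed))}")
```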

  • by fozzy1015 ( 264592 ) on Tuesday April 17, 2018 @06:15PM (#56455375)
    Both for their cars and their factory. They're getting sued for their FSD fraud, and they've wasted time and money over-automating their Fremont factory for the Model 3 line. Now Musk is going to run a 24/7 shift for the Model 3 line. Whatever positive margins they were approaching on the $60K Model 3 have now become that much more elusive. And the $35K Model 3? Phffft.
  • Elon Musk has been warning everyone that AI would be dangerous. Perhaps it will be for him but not in the way he envisioned.

  • by internet-redstar ( 552612 ) on Tuesday April 17, 2018 @08:34PM (#56455927) Homepage
    It's a deceptive article because:
    - it focuses on LIDAR, while nobody has proven that LIDAR is required for 'full self-driving', since nobody has accomplished that yet;
    - everybody who is tired of waiting for the feature can ask Tesla for their money back;
    - not a lot of people who have enough money to buy these cars are stupid enough to 'get confused';
    - it's a duplicate of a similar story which ran a year ago, and Tesla Autopilot has gotten better and will keep getting better;
    - Tesla has already promised that if a hardware upgrade is necessary (which is likely), they will upgrade free of charge.

    Aside from that, we love the AP capabilities of our Teslas.

  • You could expand this to most of the current AI goldrush. It wouldn't be the first time the capabilities of AI have been oversold.

"Your stupidity, Allen, is simply not up to par." -- Dave Mack (mack@inco.UUCP) "Yours is." -- Allen Gwinn (allen@sulaco.sigma.com), in alt.flame

Working...