AI Google Robotics Transportation

People Trust Tech Companies Over Automakers For Self-Driving Cars 152

Lucas123 writes "Consumers appear more willing to use a self-driving car from a leading technology company, such as Google, over an auto manufacturer like Ford or Toyota, according to a new study from KPMG. Based on polls of focus groups, technology companies scored highest among consumers, with a median score of 8 on a scale of 1 to 10, with 10 as the highest level of trust. Premium auto brands received a score of 7.75, while mass-market brands received a score of 5. Google is the brand most associated with self-driving cars, according to the study, while Nissan led the mass auto producers in recognition for autonomous technology; that was based on its pledge in August to launch an affordable self-driving car by 2020. 'We believe that self-driving cars will be profoundly disruptive to the traditional automotive ecosystem,' KPMG stated." I suspect that when autonomous cars start arriving for ordinary buyers, there will be a lot of co-branding, as there is now for various car subsystems and even levels of trim.
This discussion has been archived. No new comments can be posted.

  • Duh? (Score:5, Insightful)

    by Anonymous Coward on Saturday October 12, 2013 @05:15PM (#45110769)

    I'd trust the quality of an AI from a machine learning company (Google) more than one from a mechanical engineering company (Ford).

    It's not like Google is making the "car" part of it. They are getting that from a car company. I'd trust a car company to make the car portion far more than I'd trust Google, but that is not the issue at hand.

    • It's not like Google is making the "car" part of it. They are getting that from a car company. I'd trust a car company to make the car portion far more than I'd trust Google, but that is not the issue at hand.

      Though to play Devil's Advocate... isn't Tesla out "car-companying" the car companies?

      No, I'm not going to buy a Model S or a Model X... but they seem to be doing something right.

  • WRONG! (Score:3, Interesting)

    by Anonymous Coward on Saturday October 12, 2013 @05:19PM (#45110793)

    People trust the *idea* of self driving cars from tech companies over those from automakers. But, when reality bites and the consumer is presented with the Google Nexusmobile vs. the new GM AutoDominator, you'll see a very different sentiment.

    People say all sorts of things... until it comes time to pay for it or put their own lives at risk.

    • by Mashiki ( 184564 )

      People trust the *idea* of self driving cars from tech companies over those from automakers

      Some people are old enough to remember the massive screw-over that NA automakers pulled during the '70s with the "they'll buy what we tell them to buy" attitude. The same goes for the Japanese/Korean automakers in the '90s, and the European makers in the '70s, '80s, and '90s. The reason this result makes sense is that plenty of people do remember that, and those who may not...have parents who do.

      • by Mashiki ( 184564 )

        I should probably add, now that I think on it: my parents' Volkswagen from the '70s had a shorter warranty than hard drives do. One year, or 12,000 km, and that was right from the showroom floor.

    • by Belial6 ( 794905 )
      Having just returned from Disneyland, it struck me that they would be a very good foot in the door for a smart company wanting to build public trust in autonomous cars. If Disney could convince Anaheim to use eminent domain and allow Disney to build a raised roadway just far enough to get away from the buildup around the park, Disney could partner with Google to produce self-driving cars that would be in real use. Just as the Disney monorail was the first daily operating monorail in North America, it woul
  • by Justpin ( 2974855 ) on Saturday October 12, 2013 @05:25PM (#45110809)
    TBH it's not a question of trust per se, since it will be mass-produced by the lowest bidder, who will most likely cut corners. The bigger question is who is liable when it goes wrong. Right now the nut behind the wheel is liable, but if we put an AI in charge, what happens when it goes wrong? The opt-out / easy method is to still make the 'operator' liable. Will the 'operator' have to be awake at all times and focused on the road for when something goes wrong? Because if so, then although it may well self-drive, the fact that it needs to be constantly monitored kinda negates a large part of its autonomy. I mean, computers never go wrong, right?
    • I don't think that the liability will be a barrier for the implementation of autonomous cars. Using the same logic, seatbelts would never have been made legal. Both autonomous cars and seatbelts reduce the total number of deaths on the road (assuming the AI will ever get there) while sacrificing a smaller number who are technically "killed" by the technology itself.
      • I don't think that the liability will be a barrier for the implementation of autonomous cars.

        Seat belts are not a good comparison as seatbelts can't cause multi-vehicle accidents.

        Liability absolutely will be a factor in our overly litigious society. The first time there is an accident with a fatality, the lawsuit against the manufacturer will be enormous. I don't think Google is interested in actually building the cars; they're more interested in developing the tech and then licensing it to the automakers, essentially what they do with Android. That gets them out of the loop for liability as they can bl

        • by kkwst2 ( 992504 ) on Saturday October 12, 2013 @06:19PM (#45111017)

          In what way are we a long, long way away? If you're talking about an affordable driver-less car, I'd agree. If you're talking about laws being passed that allow their mass adoption, probably. But the technology is there. They can basically do everything you've suggested (home to work, detours, deer, kids), in many cases much better than people can.

          • I still see a number of issues that need to be resolved. A human driver is capable of reacting to a much wider range of problems and required responses, although I grant that human drivers have slower response times and sometimes react badly to surprises. What will a driverless car do if the road is blocked? Will it sit there confused, will it attempt a 5-point turn in the middle of the road, etc.? Will it drive around the construction guy holding up a stop sign? Does it know that it's supposed to yield to peopl

            • do you have a map link for that error?

            • by Ambassador Kosh ( 18352 ) on Saturday October 12, 2013 @11:02PM (#45112209)

              I wish more people would stop at crosswalks for pedestrians. A few days ago I had to jump back from a crosswalk because a guy in his truck just went roaring through the crosswalk. He did not even slow down and he was easily going over the speed limit.

              I would love it if there was a better method to deal with that. It doesn't matter that the law says he was clearly in the wrong for that. If he had hit me the law would do me no good at all. I want self driving cars because too many people are aholes and cops can't deal with all of them by a long shot.

              I have had to dodge out of the way of far too many people who, while driving through an intersection to turn, picked up their cell phone to look at something as they drove through the crosswalk.

              Humans make too many mistakes to allow them to drive when we have better technology available. We didn't use to have that choice, and now we do. If you think you are truly a much better driver than average and don't make any of these mistakes, then you should be able to take tests to prove it, and then you could drive a car under your primary control but with an AI as backup, so that in the event of a failure it would override your control. That way, if you fall asleep at the wheel, don't pay attention, etc., you still can't run over a person walking or on a bike.

      • I don't think that the liability will be a barrier for the implementation of autonomous cars. Using the same logic, seatbelts would never have been made legal. Both autonomous cars and seatbelts reduce the total number of deaths on the road (assuming the AI will ever get there) while sacrificing a smaller number who are technically "killed" by the technology itself.

        That was then. Now we get sued for not knowing hot coffee is hot. How safe are these things? I would have to assume that the self-driving electronics would have to be manufactured similarly to human-rated rocket parts, or at least to avionics specs. Pretty pricey stuff, that. But even they fail. I, for one, would also like to have an understanding of what the failure mode is on these devices. Seat belts and air bags are all designed to protect in accidents. An autonomous vehicle that loses its autonomy via comp

        • McDonald's should have been nailed to the wall even worse for what they had done. The coffee was not just hot; it was hot enough to cause third-degree burns within a few seconds. They had dealt with a lot of complaints about it for quite a while before that.

          You should probably read up a bit more on the hot coffee incident before just taking McDonald's side. They had some very good PR companies do a lot of work to blame the victim on that one.

          Everyone is aware that hot coffee is hot but if you spilled coffee from

          • Not to mention that the lid came off the coffee cup because the coffee was hot enough to melt the cup just above the waterline. That was the result of either too hot a cup of coffee or too poorly constructed a cup to serve it in. Either way, there was good reason for the lawsuit to be decided against McDonald's.

            As for the liability, the states that allow self-driving cars can simply pass a law limiting their liability and make the owners of the cars obtain an insurance policy large enough to cov

  • Yeah but... (Score:5, Insightful)

    by Virtucon ( 127420 ) on Saturday October 12, 2013 @05:32PM (#45110835)

    I ripped my OnStar box out of my car because of their tracking policy, so what makes me want to trust either the car makers or the technology companies? If Google does it, you can be sure I'll be spammed with tons of ads, and my every move will be tracked, mined, and sold to any company or government they choose. I can see automakers eventually doing the same. Before self-driving even arrives, we'll all have boxes tracking our mileage in the name of eliminating gasoline taxes for roads, which adds another dimension to all this data gathering on our movements. Until we get the privacy laws straight, we shouldn't be considering self-driving cars.

    • Until we get the privacy laws straight...

      I'm just not seeing that happening anytime soon, so what exactly are you saying? Progress must halt indefinitely? Or do you really think we are on the brink of the general public suddenly becoming outraged at the privacy infractions of the entities that "serve" them?

      • Look, nobody is saying that progress has to stop, but not all progress is good. In this case it's very dangerous for us to assume that Google has everybody's well-being in mind; remember the wifi mapping (snooping) that went on with their Street View cars? [slate.com] That class action suit is still moving forward, and it shows how insidious these privacy breaches can be in the name of innovation. Society in general is losing privacy in small and sometimes very large ways,
        all in the name of progress or government sno

    • I ripped my OnStar box out of my car...

      The only thing you ripped out of your car was the portion of OnStar that is there for the consumer. The government portion of OnStar is embedded in the ECU and cannot be removed without installing an entirely different Engine Control Unit. Just don't ever buy GM products if you want to avoid any of this government tracking nonsense.

      • I ripped my OnStar box out of my car...

        The only thing you ripped out of your car was the portion of OnStar that is there for the consumer. The government portion of OnStar is embedded in the ECU and cannot be removed without installing an entirely different Engine Control Unit. Just don't ever buy GM products if you want to avoid any of this government tracking nonsense.

        Well, removing the satellite transponder is more effective than me starting to wear an AFDB... [zapatopi.net]

  • by thegarbz ( 1787294 ) on Saturday October 12, 2013 @05:43PM (#45110863)

    This is no surprise, really. Who would you trust to program a computer in charge of your life? A company that revolutionised the way we communicate and interact with technology? A company that offers incredible services which make our lives better, thanks to gobbling up talented software engineers?

    Or.

    A company whose greatest innovation in the past 5 years is asking Congress for handouts, and designing a touchscreen interface for a car radio where the only new feature is that it is now far more difficult to use.

    • by Joining Yet Again ( 2992179 ) on Saturday October 12, 2013 @06:51PM (#45111189)

      A company that revolutionised the way we communicate and interact with technology?

      "Revolutionised"? Microsoft did that once in the early '80s, with IBM; and a second time in the mid-'90s, when it supplied a desktop OS with a TCP/IP stack. Apple's 1984 and iDevice UIs were similarly brilliant.

      Google's just a bunch of incremental improvements to existing tech, made to increase the quality of product for an ad brokerage platform.

      A company whose greatest innovation in the past 5 years is asking Congress for handouts,

      Which company is this?

    • This is no surprise really. Who would you trust to program a computer in charge of your life?

      A company that I'm a rabid fanboy of?

      Or.

      An old skool company that's not nearly so l33t?

      There, fixed that for you.

      • Company? Which company did I mention? Apple, Google, Microsoft? Maybe Ford, GM, or Mitsubishi?

        I hate about two-thirds of them, yet pick any one of them, slot it into my post, and it'll still fit just fine.

    • Trust Rain Man (Score:5, Interesting)

      by Okian Warrior ( 537106 ) on Saturday October 12, 2013 @07:29PM (#45111377) Homepage Journal

      This is no surprise really. Who would you trust to program a computer in charge of your life?

      You trust a true nerd: Someone who is obsessive about correctness, some distance down the Asperger's spectrum, and who's convinced that the consequences of having a bug are their fault. Hygiene and dress-code are secondary.

      I used to code aircraft avionics software (microcontroller stuff for altimeters, airspeed, cabin pressurization, &c). Some of my avionics-related courses asked "are you willing to be the first passenger in an aircraft running this software? raise your hand", and typically mine was the only hand showing.

      There's a mindset for making safety-certified software, and not everyone has it. Most people rationalize doing a poor job by denying responsibility: the boss told them to do it, they have to feed their family, everyone else does it, and so on and so on. It's the mindset that allows the NSA to get away with rights violations: no one takes responsibility at any level.

      A true nerd is a little like Rain Man, and will feel responsible for accidents that happen because of his mistakes. In my mind it feels like walking a tightrope over a canyon with no net - I'm always scared of screwing up and I have this mental image of screaming people plunging to their doom. I'm not making this up, the image sometimes pops into my mind while I'm on a project.

      I don't trust my coding skills, of course: there has to be a QA department with testers going over the code, proper paper trails and procedures, independent customer testing, and management that cares about quality. With all this, it still takes courage for me to work on an aircraft project.

      I've met people who do and others who do not have this mindset. One FAA engineer (DER - Designated Engineering Representative [wikipedia.org]) asked whether using a 1-byte code checksum (at startup, to verify code integrity) was sufficient and maybe 2 bytes would be safer, and *nothing else* about the project. A second FAA engineer tested the system against literally all the specifications, verifying that the product did what it was supposed to do. As uncomfortable as the second DER was making management, I'd much rather work with him: he understands what's at stake.
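      Loosely, the kind of startup check being debated looks like the sketch below (a minimal illustration only; the function names and the fake "firmware" image are invented here, not taken from any real avionics code). An additive 1-byte checksum lets roughly 1 in 256 random corruptions slip through undetected, while 2 bytes cuts that to roughly 1 in 65,536, which is the trade-off the first DER was asking about.

      # Hypothetical sketch of a startup code-integrity check; names and layout are illustrative only.
      def checksum(image: bytes, width_bits: int) -> int:
          """Simple additive checksum truncated to the given width."""
          return sum(image) & ((1 << width_bits) - 1)

      def verify_at_startup(image: bytes, stored: int, width_bits: int) -> bool:
          """Compare the running code image against the value recorded at build time."""
          return checksum(image, width_bits) == stored

      firmware = bytes(range(256)) * 4             # stand-in for a code image
      recorded = checksum(firmware, 16)            # value "burned in" at build time
      corrupted = bytearray(firmware)
      corrupted[10] ^= 0xFF                        # simulate a flash corruption
      print(verify_at_startup(bytes(corrupted), recorded, 16))   # False: corruption detected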

      I don't think it's a case of trusting Google over Ford, or even an application company versus a car company. It's the mindset of the people making the product, and the level to which they feel responsible for the final product. It's only a little bit the mindset of management.

      tl;dr: It's not the type of company, it's the type of individuals who make the product.

      • by lorinc ( 2470890 )

        I'm not really sure I want to reduce all my trust to the emotional mindset of the programmer. Emotions are often shown to be very unstable, and in the case of critical systems the consequences would be disastrous. I'd prefer to rely on a well-defined and robust methodology, ideally involving formally proven programs.

    • by ebno-10db ( 1459097 ) on Saturday October 12, 2013 @07:58PM (#45111487)

      Who would you trust to program a computer in charge of your life? A company who revolutionised the way we communicate and interact with technology? A company which offers incredible services which make our lives better thanks to gobbling up talented software engineers.

      Or.

      A company whose greatest innovation in the past 5 years ...

      The car companies (most of which, Ford and Toyota for example, didn't ask for handouts). Google, for all their cleverness, has never produced anything that's safety-critical. I seriously doubt their culture is suited to it. It's very different from "let's play with this cool new idea". That's why progress in cars and airliners is slower than with non-safety-critical software. If the car companies need help with the software, they're better off hiring people, or companies, from aerospace. Ever look at writing code to DO-178B Level A? That's what you need for fly-by-wire systems, where a bug can kill you. It's also very tedious and boring work.

      • IMO the difference between safety-critical and business-critical is not as great as you think. If you can make something with 99.999% uptime without an unhandled fault, it can be used for either application. Most of the current systems for cars don't seem to be designed not to fail, but to fail back to a mechanical system. That sort of safety-critical design is probably less helpful for an autonomous vehicle than the output of a company that makes systems redundant enough to keep operating regardless. Obviously go

      • Google, for all their cleverness, has never produced anything that's safety critical. I seriously doubt their culture is suited to it. It's very different from "let's play with this cool new idea". That's why progress in cars and airliners is slower than with non-safety critical software.

        We've seen what needs to happen with Google happen in reverse with console games. It used to be that if you released a console game, it had to be bulletproof. Your code was going to be burned into ROM or a CD for all ete

      • You mean like the Google self-driving car, which has done some 500,000 km without an accident?

        You don't need anything special for safety-critical software. You just need to pick an architecture and code in a bug-free way. Google have the expertise to design their own hardware and have shown some ludicrously good uptime in their data centres, so they are quite capable of writing good software too.

        • You mean like the Google self-driving car, which has done some 500,000 km without an accident?

          Google press releases tell you very little about the actual test conditions. Were these cherry-picked routes that had been carefully mapped? Heavy traffic, inclement weather (not sure they know about that in SV), etc.? How many times did the driver decide "oops, better switch to manual control because I know these things don't work well under these conditions"?

          You just need to pick an architecture and code in a bug free way.

          Why haven't other people thought of that? The secret to quality software is to leave out the bugs.

          Google have the expertise to design their own hardware

          There is a small difference between servers/data cen

  • by Animats ( 122034 ) on Saturday October 12, 2013 @05:50PM (#45110885) Homepage

    I wouldn't trust a web-oriented company with self-driving car development. They're into "agile", "features now, quality later if ever", and "release and patch" development. That's not how avionics are developed, and production self-driving cars need avionics-quality software.

    Self-driving needs the engineering discipline that comes from having to pay for a recall when it doesn't work, and paying for damages when it hurts someone.

    • by DarkOx ( 621550 ) on Saturday October 12, 2013 @06:58PM (#45111221) Journal

      That is because you have sense. Most people, however, do not, and automotive engineering (a few spectacular failures aside) is so good that people are spoiled. When they have to reboot their Android phone, or Windows crashes (admittedly rarer than it once was), or their Chromecast thingy glitches and they have to restart their video, they shrug it off and don't think about it. "That is just how computers work, after all."

      When the slightest little thing goes wrong with their car, they freak out, take it to the dealer right away, and it's a major memorable event in their lives. The engine stumbles a bit in the pouring rain and it's "hey, I only have 70k miles on this thing, $CARCOMPANYs are shit"; it couldn't possibly have anything to do with the fact that they have never replaced those 10-year-old plug wires.

      The fact is modern automobiles are incredibly reliable given the conditions they have to operate in, but people's expectations are very, very high as a result. People's expectations around tech are much lower, yet the resulting perception is that $TECHCOMPANY is better than $CARCOMPANY, because for so many people their car has become something they don't even think about except when something is amiss, while they think about $TECHTOY all the time and remember the positive experiences more.

      • The fact is modern automobiles are incredibly reliable...

        You mean modern non-American automobiles are incredibly reliable. I suppose you can rely on American automobiles to start having severe problems as soon as the warranty has expired. I have some "high mileage" vehicles, one with over 200k and another with over 130k, and I figure they are not even halfway through their useful life. Obviously, they are not American; otherwise, they would both be non-functional at this point.

        To be fair, there are a few lines of American built trucks that have that kind of relia

  • by hsa ( 598343 ) on Saturday October 12, 2013 @05:54PM (#45110921)

    No sane auto manufacturer is going to take the risk of legal liability. There will be accidents. And there will be lawsuits.

    I am betting all the research that goes on at Ford or Toyota is just for the patents - they don't ever want to go into production. Too risky.

    • by Animats ( 122034 )

      No sane auto manufacturer is going to take the risk of legal liability. There will be accidents. And there will be lawsuits.

      There's a financial engineering solution to that.

      I am betting all the research that goes on at Ford or Toyota is just for the patents - they don't ever want to go into production. Too risky.

      That was true for a long time. GM played around with automatic driving as early as the 1950s. It's not true any more. GM now intends to have full-auto driving in Cadillacs by 2016.

      • by tftp ( 111690 )

        GM played around with automatic driving as early as the 1950s. It's not true any more. GM now intends to have full-auto driving in Cadillacs by 2016.

        They have to. The population is aging quickly, and older drivers are not as good as they once were. I will be glad to buy a self-driving car when I cannot safely drive anymore, because that will give me the same mobility that I have today, with even less hassle.

    • The solution to this problem is to get everyone who buys a self-driving car to sign an EULA saying that there is a manual override function and they must pay full attention at all times in case of an accident. If the car senses an impending accident, it immediately returns control to the driver. All responsibilities resolved.

      People who are riding in a two-ton hunk of metal propelled by explosive fuel should pay attention to the road and the situations around them, even if the car drives itself.

      Or you cou

    • by kkwst2 ( 992504 )

      You're assuming that their liability risk will go up. I suggest exactly the opposite. Sure, there will be accidents, but they will be far fewer than with human drivers. The rate of accidents where the computer-controlled car is at fault will likely be 100 times lower. Even if bias against computer-controlled cars makes lawsuits in those situations much more likely, and payouts much higher, I wouldn't be surprised if an analysis showed that overall liability decreases substantially.
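      A back-of-the-envelope version of that argument, with every number assumed purely for illustration: even if each at-fault accident becomes far more likely to draw a suit and far more expensive to settle, a 100x drop in at-fault accidents can still leave total expected liability lower.

      # Illustrative expected-liability comparison; all figures below are assumptions, not data.
      human_accidents = 1.0          # normalized at-fault accidents per year
      auto_accidents  = 0.01         # "100 times lower", per the argument above

      human_suit_prob, human_payout = 0.10, 1.0    # normalized suit rate and payout
      auto_suit_prob,  auto_payout  = 0.50, 10.0   # suits 5x likelier, payouts 10x larger

      human_liability = human_accidents * human_suit_prob * human_payout
      auto_liability  = auto_accidents  * auto_suit_prob  * auto_payout

      print(human_liability, auto_liability)       # 0.1 vs 0.05: exposure still halved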

  • by Bill Dimm ( 463823 ) on Saturday October 12, 2013 @05:54PM (#45110923) Homepage

    Google car: Will be named the "Beta." It will work great for the first three years, then Google will shut it down. If you have any problems, you will find that there is no customer support number.

    Microsoft car: Will be named "Ding." You'll be cruising down the highway when the control panel suddenly says "Rebooting to install updates in 9...8...7..." Owners of the first few versions will have close encounters with telephone poles. Nobody will sell you car insurance.

    Apple car: Will be named "iDrive." The car will cost $300k and will look modern and sexy. Build quality will be excellent. No matter what destination you enter, you'll end up in Camden because it uses Apple maps.

    Oracle car: Will be named "Oracle Car." The car will cost $400k and you'll need an expensive consultant to make it work.

    • Apple car: Will be named "iDrive." The car will cost $300k and will look modern and sexy. Build quality will be excellent. No matter what destination you enter, you'll end up in Camden because it uses Apple maps.

      The battery will be sealed in a black box and welded to the frame. You will either have to go to a specialist Apple shop to get it replaced or learn how to weld.

    • SAP car: Will refuse to go anywhere until you Send Another Payment.

      IBM car: You will have to argue with Watson about where you want to go.

  • You're goddam right I'd sooner trust a tech company to build a self-driving car. Because, as we all know, computers never crash.

  • by David_Hart ( 1184661 ) on Saturday October 12, 2013 @06:41PM (#45111131)

    I wouldn't put much stock in this. All it really reveals is what we already know: Google has had a lot of publicity around their self-driving cars, and thus is more popular and would show up more in web conversations (which is where they got their data - MOBI).

    Personally, I would trust car manufacturers much more than Google to deliver a self-driving car. Google is developing the technology, but it's up to the car company to tweak and integrate it safely. This is no different than the other tech components created by various companies and integrated into our vehicles (e.g., radio, GPS, follow cruise control, traction control, heated seats, etc.). When we go to buy a car it will simply be listed in the specs. For example: Heated seats, Alpine Infotainment system, Quadra-Trac II traction control, Google Autonomous Drive, etc....

  • by PopeRatzo ( 965947 ) on Saturday October 12, 2013 @06:58PM (#45111219) Journal

    Marketing has done so much damage to people's ability to reason.

    How are automakers not "tech companies"? Do they think it's magic making the car go forward? Do they think the 50 or so microprocessors that are in every new car are powered by faeries?

    Seriously, it's like saying they trust McDonalds over farmers to produce food. Oh wait...

  • by heretic108 ( 454817 ) on Saturday October 12, 2013 @07:12PM (#45111275)
    Self-driving cars mean that people will be able to drink and "drive" to their hearts' content, legally and safely. This will help to rejuvenate the ailing club/pub scene and maybe restore the live entertainment industry to grace. It would make sense for liquor companies, pubs and clubs to invest substantially in autonomous vehicle tech. Anyone up for a new "Roaring 20s"?
    • Wouldn't it make just as much sense to have designated drivers for hire, with a chase car to drive the designated driver back to the bar, free with x amount of total drinks for the group? That way you at least get some jobs out of the deal.

  • by asmkm22 ( 1902712 ) on Saturday October 12, 2013 @07:14PM (#45111289)

    Maybe I'm just a control freak, but I don't think I could trust a self-driving car under any circumstances, regardless of manufacturer. Maybe that will change over time, but I'm not counting on it.

    • Maybe I'm just a control freak, but I don't think I could trust a self-driving car under any circumstances, regardless of manufacturer. Maybe that will change over time, but I'm not counting on it.

      No, you're just suffering from the delusion that you, a big bag of water capable of distraction, fatigue, and reduced alertness, complete with variable response time and a propensity for aggression against others who may not quite be driving to your liking, are in some way a better driver than a fit-for-purpose machine.

      I'm actually hoping the future heads down a penalty route, where drivers who drive themselves are forced into a slow lane of a freeway while computer controlled vehicles are doing 180km/h and never causin

    • Then you'd better stop driving right now, because if your car is less than 20 years old, I can assure you there is quite a bit of electronics between your gas pedal and the engine, and even more between the brake pedal and the wheels.

  • by plopez ( 54068 ) on Saturday October 12, 2013 @08:27PM (#45111591) Journal

    By law, they have to acknowledge defects, issue "recalls", and fix them. Software companies do not. Software companies can sweep things under the rug. So news broadcasts talk about car companies' defects, but you never really hear much about software defects. This gives software companies an aura of competence they usually do not deserve. It is a matter of perception.

    Personally, I would trust car companies first. Not only are they liable, but by being forced to face up to their defects, their product is far safer and cheaper, in terms of inflation-adjusted TCO, than ever. And getting much greener to boot.

    BTW, this is a perfect example of what good government regulation can do.

    • by Belial6 ( 794905 )
      The thing is, the reason software companies sweep defects under the rug is that they can. If they produced safety equipment, they would just have to do the recalls also. Car companies are just as likely to sweep any defect they can under the rug. Look at the parts of cars that are not safety related, and you will frequently find that they are crap. You will also find that car companies will happily make excuses as to why the problems are not really problems.
      • by plopez ( 54068 )

        It is always scary when I go into a hospital or doctors' offices and see medical equipment running Windows. Do you really think that software companies will not continue to avoid liability by creating convoluted EULAs? Or an "accept this EULA or you don't get a warranty" type of situation?

        • by Belial6 ( 794905 )
          It is strictly a matter of what regulation and the courts allow them to get away with. Do you think that car companies assume any liability that they could avoid?
  • Based on polls of focus groups, technology companies scored highest among consumers, with a median score of 8 on a scale of 1 to 10, with 10 as the highest level of trust.

    The geek builds his statistical arguments on sand.

    The poll of focus groups was conducted June 10 to 27 and included three diverse consumer groups that included 32 people from Los Angeles, Chicago and Iselin, N.J. One-third of those surveyed were premium vehicle owners who were more interested in autonomous vehicles and self-driving technology.

    KPMG conceded that the small number of people participating in the focus groups, while valuable for the qualitative and directional insights, was ''not statistically valid.''

    Consumers would prefer to buy a self-driving car from Google over Ford [computerworld.com]
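    To put a rough number on that: with only 32 respondents, even a modest spread in answers swamps the reported differences. Assuming (purely for illustration) a standard deviation of about 2 points on the 1-to-10 trust scale, the approximate 95% margin of error on a mean score is about 0.7 points, far larger than the 0.25-point gap between technology companies (8) and premium auto brands (7.75). And the reported figures are medians from focus groups, where this kind of arithmetic doesn't strictly apply at all.

    # Rough margin-of-error calculation; the standard deviation is assumed, only n comes from the article.
    import math

    n = 32                 # focus-group respondents, per the KPMG study
    assumed_sd = 2.0       # assumed spread on the 1-to-10 trust scale (illustrative)

    margin = 1.96 * assumed_sd / math.sqrt(n)   # approximate 95% margin of error on a mean
    print(round(margin, 2))                     # ~0.69 points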

    Iselin rang no bells whatever and I had to look it up:

    Iselin is a census-designated place and unincorporated community within Woodbridge Township, in Middlesex County, New Jersey, United States. As of the 2010 United States Census, the CDP population was 18,695.

    The racial makeup of the CDP was 41.47% (7,753) White, 6.72% (1,257) Black or African American, 0.33% (62) Native American, 46.12% (8,623) Asian, 0.00% (0) Pacific Islander, 2.26% (423) from other races, and 3.09% (577) from two or more races. Hispanic or Latino of any race were 7.12% (1,332) of the population.

    An area known as Metropark, consisting primarily of office parks and large office buildings, lies in the southwestern corner of Iselin and spills over into neighboring Edison. The New Jersey Transit and Amtrak Metropark Station is named for this area.

    In addition to a Hilton Hotel and the train station, Metropark also features the headquarters of Ansell Limited, Engelhard Corporation (acquired by BASF in 2006) and Eaton Corporation's Filtration Division. Other corporate residents in the area include Siemens AG, Tata Consultancy Services, Wells Fargo, JPMorgan Chase & Co., Accenture, Level 3, BT (British Telecom), UBS AG and TIAA-CREF.

    Iselin, New Jersey [wikipedia.org]

    Iselin's Asian population is Indian.

    Iselin lies just west of Staten Island and is for all practical purposes just another corporate suburb of midtown Manhattan.

  • Both have safety records that are frankly just horrible. Both cut corners because consumers either don't know or don't care how bad the product is, as long as it's sufficient for their needs and they don't have to worry about it.

    As long as you have to police the apps that get onto phones, and there's Patch Tuesday for Microsoft and similar mechanisms for other "tech", I wouldn't trust any of those companies driving my car. The way they work is to focus on features and functionality and not on security and resilienc

  • by PPH ( 736903 ) on Sunday October 13, 2013 @10:37AM (#45114245)

    A Software Engineer, a Hardware Engineer and a Departmental Manager were on their way to a meeting in Switzerland. They were driving down a steep mountain road when suddenly the brakes on their car failed. The car careened almost out of control down the road, bouncing off the crash barriers, until it miraculously ground to a halt scraping along the mountainside. The car's occupants, shaken but unhurt, now had a problem: they were stuck halfway down a mountain in a car with no brakes. What were they to do?

    "I know", said the Departmental Manager, "Let's have a meeting, propose a Vision, formulate a Mission Statement, define some Goals, and by a process of Continuous Improvement find a solution to the Critical Problems, and we can be on our way."

    "No, no", said the Hardware Engineer, "That will take far too long, and besides, that method has never worked before. I've got my Swiss Army knife with me, and in no time at all I can strip down the car's braking system, isolate the fault, fix it, and we can be on our way."

    "Well", said the Software Engineer, "Before we do anything, I think we should push the car back up the road and see if it happens again."
