
Robots May Inspire Suits Against Programmers

cpu6502 writes "Robert Silverberg wrote a recent editorial about the dangers of robots and the legal consequences for their programmers and engineers: 'Consider malicious kids hacking into a house that uses a robot cleaning system and reprogramming the robot to smash dishes and break furniture. If the hackers are caught and sued, but turn out not to have any assets, isn't it likely that the lawyers will go after the programmer who designed it or the manufacturer who built it? In our society, the liability concept is upwardly mobile, searching always for the deepest pocket.'"
This discussion has been archived. No new comments can be posted.

  • Maybe... (Score:5, Insightful)

    by DWMorse ( 1816016 ) on Sunday January 16, 2011 @10:39AM (#34896660) Homepage

    Maybe... But last I saw, Ford Motor Company wasn't liable for drunk drivers that use their vehicles to drink and drive, resulting in death or destruction of property. This makes me think that engineering a product doesn't necessarily make you liable for someone that breaks it apart.

    Now, if your product was security software, and you advertised it as supposedly preventing this...

    • Re:Maybe... (Score:5, Insightful)

      by Anonymous Coward on Sunday January 16, 2011 @10:42AM (#34896684)

      A more apt car analogy would be suing Ford because someone broke into your car. Or suing Microsoft because somebody hacked your server. There's zero precedent for these types of lawsuits, and adding "with a robot" doesn't change that. This article is fucking retarded, even for Slashdot.

      • If this is a likely scenario, then where are all the people suing Microsoft because Windows let hackers install malicious code that deleted data? This happens frequently, and yet I'm not aware of MS getting sued.
        • I recall the MS EULA specifically disclaims any liability. It's near the bit that forbids the use of general-purpose Windows licences in the operation of nuclear facilities or other places where there is potential for epic fail. Maybe robots will have an EULA too, with a similar clause.
          • Re: (Score:3, Interesting)

            by 91degrees ( 207121 )
            Why do people always assume that just because it's in the EULA, it's legally binding? It's a factor, certainly, but courts generally take a dim view of companies trying to weasel out of their legal obligations with this sort of thing. There's still an expectation that the software is fit for purpose.
            • by tftp ( 111690 )

              There's still an expectation that the software is fit for purpose.

              But it's not you who defines the purpose, it's the manufacturer.

              Imagine that someone took MS Flight Simulator software, connected it to a real airplane somehow, switched the autopilot on, and the airplane fell to the ground. Could he sue Microsoft?

              If the manufacturer tells you that the software is not fit for certain uses, you'd better believe it. Software for autopilots may come with formal proofs of correctness.

              If I were the robot

      • You mean like how "on the internet" doesn't change anything regarding lawsuits or patents? =p

      • I wouldn't be so sure. "With the internet" sure changed a LOT, things that are (il)legal under normal circumstances suddenly flip legality when "with the internet" is added to it.

      • Except there are a lot of lawsuit-happy morons out there who don't accept responsibility for their own actions and want to sue everyone who's even peripherally involved, in the vaguest sense, so long as they have money.

        You've heard about the moron that ignored the warning signs, climbed over the fence (by the warning sign) and tried to pet a dolphin that bit him? (I think the article said it was really minor, maybe didn't even break the skin.) He still got 10% of the insane amount he was asking for.

    • Re:Maybe... (Score:4, Funny)

      by WrongSizeGlass ( 838941 ) on Sunday January 16, 2011 @10:47AM (#34896718)
      I'm not worried at all because I have a Robot Lawyer ®.
    • Does this mean I can sue Microsoft when Windows crashes and I lose a bunch of data that's not backed up?

      I mean, they programmed it to do one thing and it broke and caused me fiscal harm. That's the same thing, right?

      Oh wait, that's right - I clicked 'Agree' when I installed Windows and waived away all of my rights to sue for bad programming. Problem solved (for them).

    • There are no security holes in a car involved in drunk driving. However, if the programmer did do something that made it easier for people to break into something, then maybe the company should be held liable. Just as Ford would be held liable if they put shitty brakes in their cars to cut costs.
      • by Surt ( 22457 )

        Since Ford had the option to put in a breathalyzer-lock starter system, but instead chose a shitty non-breathalyzer-lock starter system to save costs, should they be held liable?

        • If the brakes met safety standards and weren't defective then there is no problem with them as long as they operate as expected.

          Using cheaper brakes isn't really the same as writing poor code and not testing it because you want the product out now and think it's acceptable to patch it later.
      • Did something that made it easier to reprogram it? Like, say, making it programmable?

        That's pretty much the point of having a programmable robot: to be able to reprogram it. If you insist on car analogies, how about saying that the maker of the navigation system should be held liable if someone breaks into your car and installs some bogus maps?

        • My microwave is programmable but that doesn't mean I can install code on it.

          Also, the premise doesn't state that the robot is meant to be programmable, but that the hacker broke in and modified it through unintended methods, which means something wasn't secure.

          If the owner of the home picked a poor password it's his fault. If the robot's code had a gapping hole in it then it's the company's fault.
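
          To make the distinction concrete, here is a minimal hypothetical sketch (the credential and function names are invented, not from any real robot firmware) of the kind of hole in the robot's code that would be the company's fault rather than the owner's:

          ```python
          import hashlib

          # Hypothetical firmware authentication routine. The hardcoded
          # maintenance credential is the invented "hole": it bypasses the
          # owner's password entirely, no matter how strong that password is.
          DEBUG_BACKDOOR = "svc_maint_2011"  # forgotten developer backdoor

          def authenticate(owner_password_hash: str, supplied: str) -> bool:
              if supplied == DEBUG_BACKDOOR:          # the vendor's hole
                  return True
              digest = hashlib.sha256(supplied.encode()).hexdigest()
              return digest == owner_password_hash    # the owner's password

          owner_hash = hashlib.sha256(b"correct horse battery").hexdigest()
          print(authenticate(owner_hash, "correct horse battery"))  # True: the owner
          print(authenticate(owner_hash, "svc_maint_2011"))         # True: anyone who finds the backdoor
          ```

          A strong owner password is irrelevant here; the flaw ships in the product, which is the point about where the fault lies.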
          • No, but do you think it would be hard for someone who knows a bit about microwaves to turn it into a potentially dangerous device so it cooks you instead of your soup the next time you turn it on?

            And we're not even talking passwords here. How about rigging your gas oven to let it leak so the next time you strike a match you blow up?

            Once someone is in your home, it becomes trivial for him to turn your tools against you. That doesn't mean that the makers of said tools are responsible for it. What's a "gapping

      • by Xugumad ( 39311 )

        > However if the programmer did do something that made it easier for people to break into something then maybe the company should be held liable.

        Okay, but define "easier" for me. I mean, do we have it run Linux, and need security patches on a week-to-week basis? Maybe OpenBSD, which really raises the bar for cracking the software. Or go all the way and have the whole software stack proven mathematically.

        What I'm getting at is that actually, the level of security expected is very vague, and hard to determine

        • If something works as intended and it's open, that is different from selling someone a product which is clearly broken, worrying about patching it later while in the meantime people lose data through poor coding.

          Maybe we should spend more time ensuring things are safe rather than just worrying about getting them out now. I think it's equally unfair to say that the company isn't responsible for anything because of an EULA as it is to say everything is the programmer's fault.

          In a lot of cases with poor code it's
    • by McGiraf ( 196030 )

      If I can get sued for my code by anyone who buys it, I'd better get paid every time someone buys it.

    • by b4upoo ( 166390 )

      An even better example is the making of handguns. The manufacturers have rarely been held liable for the way a gun was misused unless there was some really unusual wrongdoing in the distribution of the weapon.
      In many ways our laws can be manipulated by language. For example, a gun is not a weapon. It is like a whiskey bottle: if you try to bash someone with a whiskey bottle, it becomes a weapon. If, instead of hunting, target shooting, or simply col

    • by mangu ( 126918 )

      last I saw, Ford Motor Company wasn't liable for drunk drivers that use their vehicles to drink and drive, resulting in death or destruction of property

      But Ford did get sued, successfully, when the damage was caused by cost cutting and bad engineering [].

    • Repeat after me.

      You can sue anyone you want to and they will probably cough up settlement money just to make you go away.

    • No, but you certainly can sue when, say, the hydraulics in the car aren't strong enough to apply the brakes when the temperature drops below -5 C.

      Is the manufacturer liable for producing a product which has an inherent flaw? Yes.
      If a system is to be connected to the Internet, should security be a necessary design component? Yes. HIPAA, SOX, and FERPA all mandate that reasonable security measures be taken to prevent tampering with or access to health, large financial, and public school information, for example.

    • There have been quite a few suits against firearm manufacturers, though. What's the difference?

    • by danwiz ( 538108 )

      Maybe... But last I saw, Ford Motor Company wasn't liable for drunk drivers that use their vehicles to drink and drive

      Maybe ... But there have been strong attempts to make gun manufacturers liable for the misuse of their products.

      Also, as robots become more commonplace in our homes, the "Think of the children!" rally cry could be misused here too.

  • by Haedrian ( 1676506 ) on Sunday January 16, 2011 @10:39AM (#34896664)

    Or even news?

    What happens if kids break into your house and break your dishes? You sue their parents? You sue the school for not teaching them well? You sue the government for not putting enough money in education?

    There is no logic to who gets sued. Suing is an interesting part of physics - whenever there is a "Lots of money" gradient, and a "Has worse Lawyers" gradient, the suing target moves.

    Now I'll just be off suing Microsoft for my latest virus. Brb.

    • by tukang ( 1209392 )
      It depends. If I bought a lock that was advertised as safe and the kids picked it with a paper clip, then I may very well sue the company responsible for that advertisement. I think software is no different. You have to look at what was promised and what was delivered. The sophistication of the hack - actually, of the hole - also matters in determining whether there was negligence. If the kids used a backdoor program that the devs forgot to take out, or if the devs forgot to do something as simple as defending against
    • Subj.

      Also, I'm lodging a complaint against you for disturbing my mental balance by insinuating that suing random entities is somehow NOT good. Prepare to pay me ONE BILLION DOLORS!

    • Actually, I would go to their parents, tell them about it and ask them nicely to not let them do that again. Then I would buy new dishes.

      Would people really go to court over this? Seriously?

      • by Surt ( 22457 )

        Well, given the kids were said to have broken in, if the parents didn't volunteer to replace the dishes, and said dishes were sufficiently expensive to replace ... I could well imagine using small claims court to repair that defect in the parents.

        Some people like to eat using fancy dishes. Like $500+ per plate. Not me, but I'm just saying, this scenario could have been about a serious amount of money. Less so if you eat off of $0.25 IKEA plates.

  • by Dolphinzilla ( 199489 ) on Sunday January 16, 2011 @10:45AM (#34896702) Journal

    This, in my opinion, is a major reason our society is so screwed up: why should we even consider it reasonable that lawyers can go after software engineers and programmers to "make someone pay" because the real criminals have no assets? Product liability insurance is a major reason some things cost so much, and until we break the cycle and get the lawyers under control (most of them run our governments), these frivolous lawsuits will continue; in the end, the only people who really win are the lawyers. This is the same argument as going after a Glock handgun designer because one of their weapons was used to shoot someone. It's absurdity to the max.

    • Why should a software company hold no responsibility for anything their software does or lacks? I don't think you can blame a programmer for a problem that originates outside of their code, but I do think companies should be held responsible for pumping out shit software.
    • by Sique ( 173459 )

      I completely disagree with you on that.
      A product designed by an engineer is sold to a layman, who is by definition not able to assess the inherent dangers of its use. If he were, he wouldn't need the services of an engineer to design the product in the first place. A layman is not an engineer. An engineer thus has either to transfer all the knowledge necessary to operate the product safely to the customer, or to make its use not dangerous, even for uses the product was not originally intended for. And of cours

      • The need to point out dangers that are not obvious to a customer is, in principle, a sane approach. But it entirely depends on the level of common sense you can legally assume the consumer to possess. In Europe, you usually assume the consumer to be mostly sane, and not retarded. In that case, the system works fine. In the US, you sometimes seem to assume that the customer is the most retarded dickhead nature was able to create. In that case, you are stuck with insanely stupid warnings like "hot fluids mus

    • Advertising using blatant falsehoods is likely more of a damper than liability losses.

      The level of lies about the functionality and usability of most software (and most other things) boggles the mind. The lost time and effort for folks trying to dig into the truth is immense, and if anything it should be easier to take these companies to task for wild claims that are only farcically supported by reality at best.

  • Most software licenses have waivers of liability, and have a limit on the monetary damages. The limit is usually the purchase cost of the software. So, you can get a refund, and that's it. The only places I see where that isn't waived are safety-critical applications, like medical devices, nuclear devices, vehicles, and factory floors. These are typically hard real-time systems. Besides, you can always blame the owner for not patching the system! The "unlock your car or home from your iPhone" apps really worry me.

    • I didn't even bother reading this article, the whole idea is ridiculous. If this was really a problem, then Microsoft would have been sued into oblivion before this century started.

      • by jbengt ( 874751 )

        If this was really a problem, then Microsoft would have been sued into oblivion before this century started.

        Because Microsoft sold so many defective robots in the '90s?

    • by jbengt ( 874751 )

      Most software licenses have waivers of liability, and have a limit on the monetary damages.

      Liability clauses can work both ways.
      If a robot company produces and sells robots that cause harm, then they can be sued under product liability laws, which apply strict liability standards (in the US, anyway), which means that absent post-sale changes to the product, the producer/seller is liable, period. (If the harm is caused by a hacker, that's a different story.)
      However, if a software company provides a service to a robot company, then the software may fall under a different liability doctrine in wh

      • Hm. Do AV companies pull this same stunt? I mean, if there's any particular software that should produce liabilities (as in "in exchange for money, you protect my system" kinda deals, thus excluding free licenses or open source stuff), you'd think it'd be the ones hootin' and hollerin' that they're needed to protect you from internet boogeymen.

  • Because the legal system encourages profiteering over reparations, especially for the law firms. It is natural that these firms would use any excuse to have people sue other people for all kinds of bullshit.
  • by RyanFenton ( 230700 ) on Sunday January 16, 2011 @10:56AM (#34896780)

    1. Manufacturers will very likely isolate their product from function, only selling unprogrammed tools with APIs, to companies who resell the devices with an OS with strict functionality limitations, and DRM-like lockouts to isolate themselves from liability.

    2. Companies will be careful in the beginning to set precedents that allow them to bypass such liability. Likely they'll create a set of manufactured "harm" scenarios, with honest but complicit victims with a vested interest in blocking most future lawsuits based on indirect liability.

    Only once liability precedent has been set will the APIs open up on consumer tools from the major manufacturers. The court system may be insane in many ways, but it functions to the needs of large companies: mostly as a negotiation device, and a filter for amount of money owned ("You must be this rich to use the court system").

    Ryan Fenton

  • But programming has been around long enough that 1) I am sure there is already an instance of this, and 2) there have been plenty of instances of bad things occurring, so it should have happened already if it was going to.
  • This article could potentially give Dr. Forrester some bad ideas...Joel already has enough to deal with!

    -Crow T. Robot
  • Yes, and let's sue car manufacturers for making cars that can kill people.

    Or gun makers since guns kill people.

    Or the president, since he's, well, in charge.

    I'm suing slashdot for these crappy articles.

    Seriously, wtf is wrong here? I know it's sunday, but is this really news?

    • by Surt ( 22457 )

      Guns are a little different from your other cases because, having killed a person, you've only proved the gun's fitness for purpose. You've made any liability suit for poor design harder, not easier, in that case.

  • Was Microsoft ever sued for the damage caused by the thousands of viruses/botnets/trojans/intrusions enabled by the "security" of their software? They never even got hit for delaying patches for known or actively exploited vulnerabilities.
  • isn't it likely that the lawyers will go after the programmer who designed it or the manufacturer who built it? In our society, the liability concept is upwardly mobile, searching always for the deepest pocket.'"

    If there's someone that will pay them for doing so, then sure, they may try. But why single out robots when there's already a device in most people's homes that is already being hacked for malevolent purposes? When is the last time anyone has brought a suit against Dell (and it went anywhere) beca

  • by rwa2 ( 4391 ) *

    Simply have human operators responsible for "monitoring" the robots. They take all the liability if something goes wrong.

    After all, that's why (largely autonomous) light rail / subway trains pay college students / poor people / etc. to sit in the cab and hold the "door open" button for the train.

    Probably also why we'll never have fully automated cars and passenger aircraft as well. Easiest to just blame the driver / pilot / etc. for failing to handle the situation appropriately. Or at least they're their

  • You build a product that might possibly injure somebody, you buy insurance. But if the product does injure somebody, they can't sue your insurance company. Heavens to Betsy, some juror might find out that an insurance company is really the defendant! Can't have that, juries are dumb! We'll make them sue the programmers or engineers. We'll have some stooge geek sitting in the dock, and we'll pretend through the entire trial that the engineer is really going to be personally liable for paying the verdict

  • by MikeRT ( 947531 ) on Sunday January 16, 2011 @11:47AM (#34897088)

    Instead of locking the punks up, make them pay the victim 7 times the value of the damaged property. Deny them welfare until they've paid it back. If they commit another felony while they're still paying it off, double the sentence for that felony.

    On the surface, it may sound harsh, but if they do $1000 of damage to their neighbor and the court makes them pay back $7,000 as restitution and punishment instead of booking them in the pokey for two years, which is less disruptive? Having to pay back $7,000 with no interest at 1-2x minimum wage or doing prison time and then trying to find a job?

    • by artor3 ( 1344997 )

      So wealthy crooks can laugh off their sentences? It's hard enough to get a conviction against the rich with their teams of expensive lawyers, and you'd want to make it so that should they actually lose, it can all go away with some tiny check?

      Also, what do you mean, "deny them welfare"? Are you one of those ancient conservatives who still rails against "Welfare Queens"? That system was eliminated in '96. Or do you mean welfare in a more general sense, like food stamps and disability insurance? In that c

      • So wealthy crooks can laugh off their sentences?

        Bernie Madoff scammed $50B. Under a punitive restitution system, he and his conspirators would be looking at $350B in damages.

        Obviously, they could never pay that much off. That's part of the point. He and his family will be left utterly ruined for the rest of their lives for such a great financial harm.

    • First of all, no lawyer is going to accept 33% of minimum wage as the contingency fee. So any such punishment would be fought tooth and nail by the ambulance chasers to prevent a precedent. So the counsel to the plaintiff would not seek it.

      The victim does not care about the justness of the punishment meted out to the perps. They want compensation; only when compensation is impossible will they settle for revenge. So the plaintiff does not seek it either.

      The defendant would rather walk away free and let some

  • Programmers program. Engineers design. And the manufacturer of the robot would be no more likely to be sued than Ford would if the kids had smashed in the side of the house with a stolen Taurus.

    If the kids had found a Sawzall in the basement and used it to trash the house do you think the homeowner could hold Milwaukee Electric Tool liable?

    • If the kids had found a Sawzall in the basement and used it to trash the house do you think the homeowner could hold Milwaukee Electric Tool liable?

      You wouldn't think so, but then you see things like the Louisville Slugger case, where the manufacturers of a baseball bat were found liable for the death of someone hit by a batted ball, and you realize anything can happen in the court system.

      • If the kids had found a Sawzall in the basement and used it to trash the house do you think the homeowner could hold Milwaukee Electric Tool liable?

        You wouldn't think so, but then you see things like the Louisville Slugger case, where the manufacturers of a baseball bat were found liable for the death of someone hit by a batted ball, and you realize anything can happen in the court system.

        Without getting into whether or not the verdict was correct, the argument was that aluminum bats are much less safe than wood and H&B should have known that when they started making and selling them. I think similar arguments will be made (and no doubt have been) for robotic devices as they start causing problems. Defective or negligent design is the responsibility of the manufacturer; of course the definition of defective or negligent is open to interpretation.

  • People have already been killed by robots [] 30 years ago, so it isn't exactly new that robots can do harm. Also, why shouldn't the companies be liable? If you build something that is dangerous enough to do serious harm and sell it to laypersons, you'd better make sure that it has enough built-in safety mechanisms and doesn't just go crazy because some script kiddie came along and wanted to have some fun.

  • 1 A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2 A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.

    3 A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    4 A robot must not make a mess and if it does it must clear up after itself.

  • It is highly unlikely that the programmers or manufacturer of the original device would be liable. There are two main reasons. First, the wrongdoing of the hackers is almost certainly a superseding cause of the damage, which negates liability for negligence on the part of the programmers or manufacturer. Second, the product was not defective when it was sold and it was modified from its original condition, both of which negate products liability.

    The law is stupid sometimes, but it is not that stupid.

  • Maybe take one more step: allow robots to own assets, and have them pay taxes? That way you do not have to go after the programmers.

    Every droid his/her own bank account.... That would be interesting. (hers will probably be bigger than his)

  • Industries that have failed, or may fail, facing this same problem include aviation (which gained some protection from Congress via the 1994 GARA act), education (teachers have to dumb down their plans for everyone and cut field trips due to liability issues, etc.), and medicine (the cost of medical care is high because of liability costs for valid care on which somebody may have gotten a different opinion).

    The American Tort Reform Association has a good short writeup on the Impacts on the Economy due []

  • That's as it should be. If you're doing something dangerous, you need to take responsibility for it.

    When I ran a DARPA Grand Challenge team, we took out a really good commercial liability policy. We had hardware stall timers, an electromagnet in the accelerator system that had to be energized to get out of idle, a separate battery and relay system which slammed on the brakes if the stall timer tripped, a backup anti-collision radar system, and a separate emergency stop radio link which had to send a si
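
    The stall-timer idea described above can be sketched in software. This is a hedged illustration, not the team's actual code: the hardware pieces (brake relay, electromagnet, radio link) are reduced to a single callback, and the class and parameter names are invented.

    ```python
    import time

    class StallWatchdog:
        """Dead-man timer: if the control loop stops feeding the watchdog
        within `timeout_s`, an independent monitor trips the emergency
        stop exactly once (the callback stands in for a brake relay)."""

        def __init__(self, timeout_s: float, on_trip):
            self.timeout_s = timeout_s
            self.on_trip = on_trip
            self.tripped = False
            self._last_feed = time.monotonic()

        def feed(self):
            # Called by the (healthy) control loop on every iteration.
            self._last_feed = time.monotonic()

        def check(self):
            # Called periodically by a separate monitor thread or timer.
            if not self.tripped and time.monotonic() - self._last_feed > self.timeout_s:
                self.tripped = True
                self.on_trip()
            return self.tripped

    events = []
    wd = StallWatchdog(timeout_s=0.05, on_trip=lambda: events.append("BRAKES_APPLIED"))
    wd.feed()
    time.sleep(0.1)   # the control loop "hangs" for longer than the timeout
    wd.check()
    print(events)     # ['BRAKES_APPLIED']
    ```

    The key design point, as in the hardware version, is that the monitor is independent of the thing it monitors: a hung control loop can't un-trip the stop, and the callback fires only once.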

  • The company that I work for, which I shall not name, is a Fortune 500 company. There is a lot of money there. Because of this, everything they do, and I do mean everything, is vetted by one of the legal teams to minimize liability.

    When I was going through one of the training classes, it was pretty much explained that way. If we have something delivered, and the company that we hire to do the delivery causes some damage to the client's site, we're the better target for a lawsuit because we have more money th

  • Until software companies and programmers get sued, software will not significantly improve. That is why I say there is no such thing as software engineering. Engineers get sued; programmers don't. They won't be true engineers until they have to carry malpractice insurance.

  • by automandc ( 196618 ) on Sunday January 16, 2011 @02:40PM (#34898222)

    As an attorney, I have the same reaction reading this question that many of the /. crowd would have if I started trying to opine on the technical failings that would allow our mythical vandals to reprogram the hypothetical robot.

    Not to get too technical, but just because you sue the company doesn't mean you win. The liability insurance that even the smallest companies carry would cover the legal costs of having such a suit dismissed. (For the technically inclined, look up comparative negligence and the proverbial "intervening bad actor").

    The homeowner (the one suing) would probably be found more responsible for not following basic security, etc.

    As others have pointed out, software companies have long been given practically a free ride in harm caused by poorly written software. First, they have been allowed to disclaim the standard warranties of fitness and function. This is akin to buying a car that the manufacturer won't promise to actually work or be safe. If Ford told you that they wouldn't guarantee that pressing the brake pedal actually engaged the brakes, would you drive that car? Yet every piece of commercial software we use specifically says that there is no promise that it will work at all, or do what the purchaser wants.

    Here is a counter-hypothetical (more realistic, as it has actually happened). A relative dies in a plane crash. The FAA investigation conclusively shows that the accident was caused by a bug in one of the key computer systems. Should you sue: the airline? The manufacturer (Boeing/Airbus)? The subcontractor that wrote the software?

    The answer is, you sue the airline, and the system is set up so that anything you win from them, they can then sue to recover from the party up the chain. Thus, everyone's liability is ultimately apportioned according to their degree of fault (yes, a gross simplification). This is why people writing software for critical systems (ones where a failure can cause property damage or injury) need a good lawyer to write their contracts/licenses. The law has allowed programmers to avoid their responsibilities for a long time, so if a software company doesn't take advantage of that, it is their own fault.

    Consider, there is no educational or professional certification required to write and sell software that controls an infant incubator used in an NICU, but you need a government license to drive to the store. Programmers and engineers have been getting a sweet deal in liability for years, so it's awesome to hear them still complaining.

  • Yeah, I've had some problems with my robot. He betrays me for money, he drinks, smokes and gambles. But still, he's my best friend. I don't think I'll be suing MomCorp.
  • by Xugumad ( 39311 ) on Sunday January 16, 2011 @03:26PM (#34898520)

    Wait, is he claiming robots will suddenly make software have more real world consequences? If so, I'd like to introduce him to Therac-25... []

    Short, not too squeamish version: Software bug in rare cases allowed radiation overdoses. People died.
