
Robots in Medicine

eberry writes "The Cincinnati Children's Hospital Medical Center will use a robot to mix intravenous medications and prepare syringes. The robot, about the size of three refrigerators strapped together, can fill 300 syringes an hour, each with a custom dose and a bar-code label routing it to a particular patient. The robot should reduce the potential for errors and improve patient safety. It still needs further approval by the Ohio State Board of Pharmacy, but that should come within a month. It should be noted that five Cincinnati hospitals already use computerized pill-dispensing systems." On the other hand, reader Bobbert sends in a cautionary note: "'A group of German patients has filed a lawsuit against financially beleaguered Integrated Surgical Systems Inc., alleging that the Davis company's Robodoc surgical robot is defective and dangerous, according to a company filing with the Securities and Exchange Commission.' So now with robotic surgery, both the doctor and the robot can be liable for damages. Next thing you know, telecoms will be liable for medical malpractice if the network connections fail during remote robotic surgery."
  • by fembots ( 753724 ) on Tuesday January 04, 2005 @03:49PM (#11256709) Homepage
    Will these robots "sense" possible errors in the prescription, though? For instance, if the doctor entered an incorrect dose, an experienced nurse might be able to pick it up, but a robot will just do as it's told.

    It reminds me of a tail-strike incident [taic.org.nz] where the pilot entered the incorrect weight and the system didn't pick it up. The incident report stated that the weight/speed combination should not have been allowed by the system at all, but nobody had written that checking code in the first place.
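
    A rough sketch of the kind of checking code that was missing (Python; the drug names and dose limits are made-up placeholders, not anything from the article):

    # Illustrative sanity check: refuse to fill a syringe when the ordered dose
    # falls outside a configured safe range. Drugs and limits are hypothetical.
    SAFE_DOSE_MG = {
        "morphine": (0.5, 15.0),      # (min, max) mg per dose
        "heparin": (500.0, 10000.0),
    }

    def check_dose(drug, dose_mg):
        """Raise ValueError if the ordered dose is outside the safe range."""
        if drug not in SAFE_DOSE_MG:
            raise ValueError("no safe-dose range configured for %r" % drug)
        low, high = SAFE_DOSE_MG[drug]
        if not low <= dose_mg <= high:
            raise ValueError("%s: %g mg is outside %g-%g mg; hold for pharmacist review"
                             % (drug, dose_mg, low, high))

    if __name__ == "__main__":
        check_dose("morphine", 4.0)        # in range: passes silently
        try:
            check_dose("morphine", 400.0)  # out of range: flagged for a human
        except ValueError as err:
            print(err)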
  • by drsmack1 ( 698392 ) * on Tuesday January 04, 2005 @03:52PM (#11256746)
    I hate frivolous lawsuits, but at least with a human filling the drug orders there is some common sense that can act as a fail-safe. With a machine, all it takes is one bug to have 300 vials of poison dealt to unsuspecting patients. Won't there still need to be human oversight?
  • by Icarus1919 ( 802533 ) on Tuesday January 04, 2005 @03:53PM (#11256774)
    Until the system is fixed so doctors and nurses don't have a constant case of jet lag from being up for different shifts every day, introducing new ways to prevent careless errors is the best way to save lives.
  • Poor analogy... (Score:2, Insightful)

    by rednip ( 186217 ) on Tuesday January 04, 2005 @03:54PM (#11256784) Journal
    This analogy is unfair...
    "So now with robotic surgery, both the doctor and the robot can be liable for damages. Next thing you know, telecoms will be liable for medical malpractice if the network connections fail during remote robotic surgery."
    When you build a product, there is (at least) an implied warranty that it is fit for a specific use. A surgical robot *should* be able to conduct an operation. We aren't talking apples and oranges here. I think the author is trying to slip in a backhanded comment about tort reform. Now tell me again why we need tort reform...

    Oh, yeah: malpractice is up 25% in 10 years (but medical costs have risen much more...).

  • by tygerstripes ( 832644 ) on Tuesday January 04, 2005 @03:57PM (#11256821)
    To the best of my knowledge, it's human oversight that causes most drug administration accidents...
  • Telecom liability (Score:3, Insightful)

    by GoofyBoy ( 44399 ) on Tuesday January 04, 2005 @04:00PM (#11256858) Journal
    "Next thing you know, telecoms will be liable for medical malpractice if the network connections fail during remote robotic surgery."

    Telecoms usually have a clause disclaiming liability for any business loss due to network disruptions. I think that would apply here.
  • by paranode ( 671698 ) on Tuesday January 04, 2005 @04:01PM (#11256878)
    People just don't like to trust machines. Some of this is for good reason, but all faults in machines lead back to human error. If humans incorrectly filled, say, 200 prescriptions a year and ended up killing 10 people, it would be bad; maybe some people would get sued and some folks would lose their licenses. If a machine made one mistake in the course of years that resulted in a death, we'd have everyone up in arms talking about how this could have been prevented, saying we're letting people die at the hands of evil machines, and then we'd have a battery of laws passed against machines. Unfortunately this sense of losing control takes hold of people and fear kicks in, even if the machine is 100 times more accurate than a human at the same task.
  • Having been through chemo, I know that the first thing the nurse did each time was show me each of the syringes that were to be injected into my IV. Each was labelled with the medicine name and dosage.

    It's called the "Five Rights"; it's how you are supposed to verify the patient: Right Drug?
    Right Dose?
    Right Route?
    Right Patient?
    Right Time?

    I don't care if it's Tylenol; the nurse should ask you this each time he/she gives you anything.
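
    A dispensing robot and its bar-code labels could mechanize the same checklist. Here's a minimal sketch of that bedside cross-check (Python; the field names and the 30-minute timing window are assumptions for illustration, not from the article):

    # Sketch of a "Five Rights" cross-check: compare the pharmacy order with
    # what is actually scanned at the bedside. Field names are illustrative.
    from datetime import datetime, timedelta

    def five_rights_problems(order, scanned_label, scanned_patient_id, now,
                             window=timedelta(minutes=30)):
        """Return a list of mismatches; an empty list means all five checks passed."""
        problems = []
        if scanned_label["drug"] != order["drug"]:
            problems.append("wrong drug")
        if scanned_label["dose_mg"] != order["dose_mg"]:
            problems.append("wrong dose")
        if scanned_label["route"] != order["route"]:
            problems.append("wrong route")
        if scanned_patient_id != order["patient_id"]:
            problems.append("wrong patient")
        if abs(now - order["due"]) > window:
            problems.append("wrong time")
        return problems

    order = {"patient_id": "P123", "drug": "ondansetron", "dose_mg": 8.0,
             "route": "IV", "due": datetime(2005, 1, 4, 16, 0)}
    label = {"drug": "ondansetron", "dose_mg": 8.0, "route": "IV"}
    print(five_rights_problems(order, label, "P123", datetime(2005, 1, 4, 16, 10)))  # prints []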

  • by xenocide2 ( 231786 ) on Tuesday January 04, 2005 @04:16PM (#11257040) Homepage
    In theory, pharmacists are indispensable human elements who go over these prescriptions, ensuring that bad combinations of drugs aren't administered (some combinations of drugs have a habit of reacting in the bloodstream and forming a precipitate that clogs veins and arteries, among other things). In theory, doctors are supposed to be superhuman as well, and not prescribe these combinations.

    In reality, there are no superhumans. It's not something the medical profession enjoys admitting. New studies of drug interactions come out regularly, and few can really keep up with the pace. If you were to test a pharmacist and a robot in a month-long study, I'd expect that either the robot wins, or the pharmacist winds up being extra diligent on behalf of the study and ties it for perfection.

    You act like it's impossible to program in failsafes, like nobody knows exactly how much is too much, let alone poor helpless software engineers. Certainly, lives are put at risk in both avionics and medical computing. In this case, however, one of the core duties is to check for exactly these things, which places extra emphasis on an already important task.
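
    The precipitate problem above is exactly the sort of thing a table-driven failsafe can catch. A minimal sketch (Python; the drug names are placeholders, not real interaction data):

    # Sketch of an interaction failsafe: flag any pair of ordered drugs that
    # appears in a known bad-combination table. Pairs are placeholders.
    from itertools import combinations

    BAD_PAIRS = {
        frozenset(["drug_a", "drug_b"]),   # hypothetical precipitating pair
        frozenset(["drug_c", "drug_d"]),
    }

    def interaction_warnings(ordered_drugs):
        """Return every pair of ordered drugs found in the bad-combination table."""
        flagged = []
        for a, b in combinations(sorted(set(ordered_drugs)), 2):
            if frozenset([a, b]) in BAD_PAIRS:
                flagged.append((a, b))
        return flagged

    print(interaction_warnings(["drug_a", "drug_b", "drug_e"]))  # [('drug_a', 'drug_b')]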
  • by Mr. Cancelled ( 572486 ) on Tuesday January 04, 2005 @04:30PM (#11257205)
    Anyone remember the Law & Order episode wherein a teenage hacker hacks into a medical facility and changes its medical software so that people are given overdoses of insulin?

    In the episode, the hacker was a teenager who was under the impression that the medical facility had blinded his father, and made the changes as a form of revenge.

    In the real life version, I'm going to guess that we'll have people threatening to do something similar unless they're paid off.

    Not that I'm against such changes. I just lost my grandmother to a similar situation (as near as we can tell at this point, someone gave her the wrong medicine), so any technology that can eliminate such errors, or help reduce them, is welcomed by me and my family. I just think the Law & Order episode illustrates that no automated system is 100% foolproof. We still have to protect these systems from script kiddies and the like, but this is a huge step towards eliminating human errors, at least.
  • by Anonymous Coward on Tuesday January 04, 2005 @04:58PM (#11257560)
    The area I reside in has a LOT of railroad tracks crossing the road.

    Most of these crossings are "blind" because of trees, curves, etc...

    I've never once in my life checked to be _sure_ that there was no train coming when the warning lights weren't flashing.

    And I've never thought about this fact until now.

    How many times have I trusted my very life to what must amount to nothing more than a simple relay circuit?!

    I say bring on the bot. :)
