Deep Learning Algorithm Diagnoses Skin Cancer As Well As Seasoned Dermatologists (44 comments)

An anonymous reader quotes a report from ExtremeTech: Remember how that Google neural net learned to tell the difference between dogs and cats? It's helping catch skin cancer now, thanks to some scientists at Stanford who trained it up and then loosed it on a huge set of high-quality diagnostic images. During recent tests, the algorithm performed just as well as almost two dozen veteran dermatologists in deciding whether a lesion needed further medical attention. The algorithm is called a deep convolutional neural net. It started out in development at Google Brain, using its prodigious computing capacity to power the algorithm's decision-making capabilities. When the Stanford collaboration began, the neural net had already been trained on 1.28 million images spanning about a thousand different categories. But the researchers needed it to know a malignant carcinoma from a benign seborrheic keratosis. Dermatologists often use an instrument called a dermoscope to closely examine a patient's skin. This provides a roughly consistent level of magnification and a pretty uniform perspective in images taken by medical professionals. Many of the images the researchers gathered from the Internet weren't taken in such a controlled setting, so they varied in terms of angle, zoom, and lighting. But in the end, the researchers amassed about 130,000 images of skin lesions representing over 2,000 different diseases. They used that dataset to create a library of images, which they fed to the algorithm as raw pixels, each image labeled with additional data about the disease depicted. Then they asked the algorithm to suss out the patterns: to find the rules that define the appearance of a disease as it spreads through tissue.
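The recipe the summary describes is standard transfer learning: a net pretrained on ~1.28 million general images is reused, and only a new classifier is fitted on the medical labels. Purely as a generic sketch (this is not the Stanford code, and the features and labels here are synthetic stand-ins), the "train a new head on frozen features" step looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are fixed 64-dim features from a frozen pretrained CNN,
# one row per lesion image. Labels: 1 = malignant, 0 = benign (synthetic).
X = rng.normal(size=(200, 64))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# Train only a new linear "head" by gradient descent on the logistic loss;
# the (imaginary) feature extractor is never updated.
w = np.zeros(64)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted P(malignant)
    w -= 0.1 * X.T @ (p - y) / len(y)      # gradient step on the head only

accuracy = np.mean((1.0 / (1.0 + np.exp(-X @ w)) > 0.5) == y)
```

In the real system the frozen layers would be a deep convolutional net and the head would typically be fine-tuned jointly with some of the upper layers, but the division of labor is the same.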
The researchers tested the algorithm's performance against the diagnoses of 21 dermatologists from the Stanford medical school, on three critical diagnostic tasks: keratinocyte carcinoma classification, melanoma classification, and melanoma classification when viewed using dermoscopy. In their final tests, the team used only high-quality, biopsy-confirmed images of malignant melanomas and malignant carcinomas. When presented with the same image of a lesion and asked whether they would "proceed with biopsy or treatment, or reassure the patient," the algorithm scored 91% as well as the doctors, in terms of sensitivity (catching all the cancerous lesions) and sensitivity (not getting false positives).
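For reference, the two metrics being compared (the second "sensitivity" in the sentence above is presumably meant to be "specificity") come straight from the confusion-matrix counts. A toy illustration with made-up calls:

```python
# sensitivity = TP / (TP + FN): fraction of cancers that were caught.
# specificity = TN / (TN + FP): fraction of benign lesions correctly cleared.
def sensitivity_specificity(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# 1 = malignant, 0 = benign (hypothetical ground truth and classifier calls)
truth = [1, 1, 1, 1, 0, 0, 0, 0]
calls = [1, 1, 1, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(truth, calls)
# one cancer missed and one false alarm -> sensitivity 0.75, specificity 0.75
```

The two trade off against each other: a classifier that biopsies everything has perfect sensitivity and zero specificity, which is why both numbers matter in the comparison.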
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • With thyme and cumin I suppose.
  • by Anonymous Coward on Wednesday January 25, 2017 @06:54PM (#53739275)

    Wow, both cancer AND seasoned dermatologists!

    I'm sure it can diagnose the heavily seasoned dermatologists, but how does it handle the merely lightly seasoned?

  • by Dread_ed ( 260158 ) on Wednesday January 25, 2017 @06:56PM (#53739285) Homepage

    ...but they aren't programmed by evolution to disregard 95%+ of it. Pretty much the exact opposite, actually. We tune, prune, select, and evolve these algorithms to do this one thing really well. Frankly, it's a wonder humans can do as well as we do. A testament to our pattern matching skills, adaptability, and lack of immutable hard wiring in the ol' thinky thinky bits.

    Something I saw that might be able to help humans get a step up on the algorithms, or actually amalgamate humans and computers, is this:
    Video Magnification []

    Maybe cancerous skin lesions absorb slightly different light wavelengths? If so, magnification of the minuscule differences could pinpoint it. Fun to ponder.

    • Glad I clicked on your link :D
      Here's one of the related links: Detecting Pulse From Head Motions in Video [].

    • I think the point of this article is that the algorithm is at parity, or better, in sensitivity AND specificity.

      What I wonder is how much time and money it is going to take to get this out and helping people. The computer time is virtually free compared to human evaluators - we could have tanning bed like devices where you strip, get in, and get photographed all over with controlled lighting and perhaps some multi-spectral augmentation on the imaging, should be able to beat a trip to the dermatologist in t

  • by Anonymous Coward

    The app has been available for years.
    Maybe this will help with the FDA.

  • Be sure not to miss the AI-bandwagon!
  • by Gim Tom ( 716904 ) on Wednesday January 25, 2017 @08:29PM (#53739683)
    I have had both basal and squamous skin cancers since the 1990s and keep a close watch on my own skin. If I see anything suspicious, I have a notebook where I record what I saw, when, and where. In some cases I will take a close-up picture of it. Both basal and squamous cancers tend to appear and go away when they are very small, and by doing this I have a record of "something" reappearing in the same location. Following the old adage that once is happenstance, twice is coincidence, but three times is most likely enemy action, I will call for an appointment with my dermatologist and show them my records or pictures. For the last ten years I have found every skin cancer well before the dermatologist would have seen it during an annual exam.
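    The notebook rule described above — flag a spot once it has reappeared three times — is simple enough to sketch in a few lines. This is just a hypothetical illustration of the record-keeping, with invented site names:

    ```python
    from collections import Counter

    # Each entry is one sighting of a suspicious spot, logged by location.
    sightings = ["left forearm", "right temple", "left forearm",
                 "right temple", "left forearm"]

    # "Three times is enemy action": flag any site seen three or more times.
    counts = Counter(sightings)
    to_flag = [site for site, n in counts.items() if n >= 3]
    ```

    Here "left forearm" has three sightings and would trigger a call to the dermatologist, while "right temple" (two sightings) stays on the watch list.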

    It did not use to be that way: for many years I had the same dermatologist or group, and they got to know my skin about as well as I learned to. However, after that, with almost yearly shifts in medical networks due to changes in insurance providers where I worked (always either the lowest bidder or highest campaign contributor), it got to where I didn't see the same one twice until I got on Medicare. The patient-doctor relationship SHOULD be long term and more than just a diagnostic code and EMRs. I think it is going to get a LOT worse before it gets better, so learn to know your own body and be assertive about your care.
    • So, skin flicks for a cause :-) That's not a bad idea, taking pictures to keep track of whether things are growing or changing shape. Maybe you should submit it as a story on do-it-yourself health monitoring that could save your life.
  • So now we can have phone-booth-style human body diagnostics on every street corner... and I'll still need to go to Mexico to be able to afford it myself.

  • Could diagnose bacterial infections as well as human experts.

    Trouble was, you needed to input the data correctly, which means recognizing the symptoms.

    Also, automated essay marking does a better job than human markers, when compared to marks by experts.

    Trouble here is, we are comparing Artificial Intelligence with human stupidity.

    And "Deep Learning" is not a technology. It is a marketing term. See [] for a better assessment of what is real.

    • Why can't computers already think? Why has 60 years of research failed to produce a single intelligent robot? What has been learnt, what are the technically difficult problems, and when are they likely to be solved?

      Why isn't society already self-regulating? Why has 60 years of research failed to produce a single new form of government? What has been learnt, what are the technically difficult problems, and when are they likely to be solved?

    • by Anonymous Coward

      Posting AC because I'm an ex-employee of a company that already has a product on the market that does this (though specifically targeting melanoma), with sensitivity and specificity results that rival the top dermatologists in the US. It's available in the US and Germany last I checked. There was news about the fracas they had with the FDA over the regulatory issues they had, if you pay attention to that kind of thing.


      I'm goddamn horrified of skin cancer now. Not really beca

  • Deep Learning Algorithm Diagnoses Skin Cancer As Well As Seasoned Dermatologists

    "I'm sorry Sam, it looks like you have a potentially fatal condition."

    "What, skin cancer?"

    "No, it's worse than that. You are infected with a seasoned dermatologist".

  • Ok, put one of these probes in your mouth, ear, and butt.

    Wait, Wait, wait *this* one goes in your butt.

  • by Goldsmith ( 561202 ) on Thursday January 26, 2017 @12:38AM (#53740415)
    Reading an article like this, I can just hear my regulatory affairs officer having a heart attack in my head.

    I realize that very few people here have ever had dealings with the FDA. The FDA regulates the interstate marketing of medical tools (including software) and drugs; everything they do comes from that core mission and authority. Press releases and statements are pretty central to that mission. You should try to limit press on your product to what the FDA agrees you've proven. Depending on your views of the government, medical ethics, and your risk tolerance, "should try to" in that statement might be "must" or "should pretend to."

    Software focusing on "health" isn't really regulated, but "diagnostic" means this is medical. If you think they made a diagnostic tool for skin cancer, then they may have a problem when it comes time to talk with the FDA. They haven't shown that they have a diagnostic yet; that's the point of the last quote in the article, but that quote is in FDA-speak while the rest of the article is less formal-sounding. (They've done what's called a retrospective study, which is at most half of what is necessary.)

    Generally, the authors on the paper are smarter than this. Here's an example [] of an article a Stanford dermatologist usually contributes to. Note that the dermatologist quoted in that article is also quoted in TFA. Note the difference in tone of the publication: the whole thing is in FDA-speak. Yes, it's super boring. It's also not going to give anyone at the FDA a reason to hold up an application for marketing prior to approval.
  • That is truly amazing.
    It is difficult to diagnose a dermatologist.
  • the algorithm scored 91% as well as the doctors, in terms of sensitivity (catching all the cancerous lesions) and sensitivity (not getting false positives).

    Obviously, their natural language processing algorithm still needs some tuning.

  • Until we know more about how these algorithms make predictions, it'll be tricky integrating them into medicine: "I think you have a melanocytic lesion because I graduated medical school and have trained in dermatology for six years" still carries more weight than "Our highly accurate algorithm said you scored in a particular way on the 21 dimensions that we can't quite correlate to anything tangible, but it suggests you need this invasive surgery".
  • They'll become obsolete as diagnosticians.

  • "... the algorithm scored 91% as well as the doctors, in terms of sensitivity (catching all the cancerous lesions)..."

    Great, it only killed 9% of the people! Let's start using this magnificent technology! Think of the money that will be saved!

    A fundamental problem with computers doing medicine is that it reduces the number of people in medicine and therefore the advance of diagnostics and technique. Computers don't push anything forward. Use too many computers and you will freeze medicine in place at some

    • by CByrd17 ( 987455 )

      Yeah, "as well as" is not the same as "91% as well as."

      How about, "9% worse than seasoned dermatologists?"

      Don't get me wrong, it's a good step forward, but I want to see 9% BETTER than seasoned dermatologists!