
The artificial brain as doctor

Neural networks perform pattern recognition much as a dermatologist learns patterns from visual data. Computer accuracy rivals that of humans for classifying skin cancer, and deep neural networks and artificial intelligence may have a growing role in practice.

Computers are learning to distinguish whether skin lesions shown in images are benign or malignant and whether they require further treatment. A study suggests they are on par with, if not more accurate than, many dermatologists.

Stanford University researchers compared a computer-driven deep neural network to dermatologists’ ability to visually classify possible melanoma, basal cell or squamous cell carcinoma lesions.

“We recruited 21 board-certified dermatologists and showed them 300 images where they had to classify the lesions as benign or malignant, and whether they would biopsy or reassure the patient,” says the study’s lead author Roberto A. Novoa, M.D., clinical assistant professor of dermatology and pathology, Stanford University, Stanford. “From there, the algorithm performed about as well as the dermatologists, if not better. There were dermatologists who did perform better than the algorithm but, in general, this was a proof-of-concept study, so … we were demonstrating the efficacy of these algorithms for making the diagnosis.”

Computers that think like humans

Deep neural networks are a type of computer algorithm within artificial intelligence, a field that uses computers to mimic brain function, including reasoning.

While it might sound like a new concept, it’s not. Researchers reported on the potential for computers to diagnose facial tumors in 1986.

“The idea for neural networks has been around since the 1960s,” Dr. Novoa says. “But it was only the last five years that the computing power and technological capabilities caught up to the math.”

Today’s neural networks take in vast amounts of information. They then learn the rules that lie behind the data to derive patterns and, eventually, correct answers, according to Dr. Novoa.

“Essentially, deep neural networks are mathematical equations that are stacked in layers. They start at the most basic level by learning the edges of all the objects in an image. Then, they move on to telling you this is a triangle or a square. The next layer might indicate whether an image is a cat or a dog. Finally, there’s a layer that says this is a Belgian Malinois or German shepherd,” Dr. Novoa says. “… if it gets the answer wrong, it goes back through the equation and changes the values and weights of that equation, until it gets the most answers correct for the most number of images. By doing so, it learns, over time, what’s important and what’s not.”
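
That description maps onto a few lines of code. The sketch below, written in Python with PyTorch, is purely illustrative and is not the Stanford model: the layer sizes, learning rate and stand-in data are assumptions, but it shows the same pattern of stacked layers whose weights are nudged whenever an answer comes back wrong.

```python
# Minimal, illustrative sketch (not the Stanford model): a small stack of layers
# trained by repeatedly adjusting its weights when the prediction is wrong.
import torch
import torch.nn as nn

# Layers stacked from simple to more abstract features; sizes are arbitrary.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64 * 3, 256), nn.ReLU(),   # early layer: low-level, edge-like features
    nn.Linear(256, 64), nn.ReLU(),            # middle layer: shapes and parts
    nn.Linear(64, 2),                         # final layer: benign vs. malignant score
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def train_step(images, labels):
    """One pass: predict, measure the error, then push corrections back
    through every layer so the weights give more right answers next time."""
    optimizer.zero_grad()
    logits = model(images)          # forward pass through the stacked layers
    loss = loss_fn(logits, labels)  # how wrong were we?
    loss.backward()                 # backpropagation: trace the error back through the layers
    optimizer.step()                # nudge the weights to reduce that error
    return loss.item()

# Example usage with random stand-in data (eight 3-channel 64x64 images).
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 2, (8,))
print(train_step(images, labels))
```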

From dogs to skin cancer

The genesis of the research, published in Nature last year, began three years ago, when Dr. Novoa says he saw advances in the field of deep learning that let computer programs differentiate on their own between Belgian shepherds and German shepherds.

“I thought if we can do this for dogs, we can do this for skin cancer,” he says.

Dr. Novoa and colleagues at Stanford gathered a dataset of nearly 130,000 images from the internet, including open-source images and images from Stanford databases. They went through the data and created a visual taxonomy for the more than 2,000 disease categories, whittling those down to 10 general categories. The researchers then used the dataset to train the deep neural network’s classifier.
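
The taxonomy step can be pictured as rolling fine-grained diagnoses up into broader training classes. The Python sketch below is a hypothetical illustration only; the disease names, mapping and file names are invented for the example and are not the actual Stanford taxonomy.

```python
# Illustrative sketch: map fine-grained diagnoses to coarser training classes.
# The category names and mapping are invented, not the study's taxonomy.
COARSE_PARENT = {
    "superficial spreading melanoma": "melanoma",
    "lentigo maligna": "melanoma",
    "nodular basal cell carcinoma": "keratinocyte carcinoma",
    "squamous cell carcinoma in situ": "keratinocyte carcinoma",
    "dermatofibroma": "benign lesion",
    "seborrheic keratosis": "benign lesion",
}

def coarse_label(fine_label: str) -> str:
    """Map a fine-grained diagnosis to its general category, defaulting to 'other'."""
    return COARSE_PARENT.get(fine_label, "other")

# Example usage with made-up image records.
dataset = [
    ("img_0001.jpg", "seborrheic keratosis"),
    ("img_0002.jpg", "lentigo maligna"),
]
training_examples = [(path, coarse_label(dx)) for path, dx in dataset]
print(training_examples)
```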

The researchers measured the algorithm’s performance by creating a sensitivity-specificity curve.

Sensitivity represented the algorithm’s ability to correctly identify malignant lesions, and specificity represented its ability to correctly identify benign lesions. The researchers assessed the algorithm on three diagnostic tasks: keratinocyte carcinoma classification, melanoma classification and melanoma classification using dermoscopy images. The algorithm matched the dermatologists’ performance in all three tasks, with the area under the sensitivity-specificity curve amounting to at least 91 percent of the total area of the graph, according to a Stanford news release.
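
In code, a sensitivity-specificity curve can be traced by sweeping a decision threshold over the model’s malignancy scores. The sketch below uses made-up scores and labels rather than study data, and a simple NumPy loop rather than the researchers’ actual evaluation pipeline.

```python
# Illustrative sketch: sweep a threshold over malignancy scores and record
# sensitivity and specificity at each cutoff. Data below are invented.
import numpy as np

def sensitivity_specificity_curve(scores, labels):
    """labels: 1 = biopsy-proven malignant, 0 = benign. Returns curve points."""
    points = []
    for threshold in np.unique(scores):
        predicted_malignant = scores >= threshold
        tp = np.sum(predicted_malignant & (labels == 1))
        fn = np.sum(~predicted_malignant & (labels == 1))
        tn = np.sum(~predicted_malignant & (labels == 0))
        fp = np.sum(predicted_malignant & (labels == 0))
        sensitivity = tp / (tp + fn)   # malignant lesions correctly flagged
        specificity = tn / (tn + fp)   # benign lesions correctly cleared
        points.append((sensitivity, specificity))
    return points

scores = np.array([0.9, 0.8, 0.35, 0.6, 0.2, 0.1])   # model's malignancy scores
labels = np.array([1,   1,   1,    0,   0,   0])     # biopsy-proven ground truth
for sens, spec in sensitivity_specificity_curve(scores, labels):
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```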

The deep neural network had been trained to classify skin cancer, but the study tested it on images of skin cancer that the network had not yet seen.

“If you test it on images it has already seen, it already knows the answer,” he says.

The images in the study already had biopsy-proven results. This allowed the researchers to know the true answers when testing the deep neural network and dermatologists.
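
The held-out evaluation idea is simple to express in code: biopsy-proven images are split off before training, and the network is scored only on that unseen portion. The sketch below is a schematic illustration with invented file names and labels, not the study’s actual data handling.

```python
# Illustrative sketch of held-out evaluation; file names and labels are invented.
import random

random.seed(0)
biopsy_proven = [(f"img_{i:04d}.jpg", random.choice(["benign", "malignant"]))
                 for i in range(1000)]

random.shuffle(biopsy_proven)
held_out_test = biopsy_proven[:200]   # never shown to the network during training
training_pool = biopsy_proven[200:]   # used to fit the model's weights

# No overlap: the "answers" for the test images stay unseen until evaluation.
assert not set(held_out_test) & set(training_pool)
print(len(training_pool), "training images,", len(held_out_test), "held-out test images")
```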

The next step is to test the algorithm in the real world, Dr. Novoa says.

“We’re currently putting together a clinical trial, to see how well it will perform in the real world,” he says. “We want to see how well it performs with real patients, with a smartphone camera and a variety of lighting.”

There have been other studies using artificial intelligence and deep neural networks to diagnose melanoma, but this one is different in a few ways, according to Dr. Novoa.

“Most of the other studies were performed using dermoscopic images. Dermoscopy kind of limits the number of variables. It’s always taken from the same distance and with pretty similar lighting,” he says.

“This was looking at clinical images, but they were from a variety of different distances and angles. This also was looking at melanoma, squamous cell carcinoma and basal cell carcinomas. So, there was a wider group of images. We used a pretty large dataset. It’s among the larger datasets that we’ve seen to date.”

Implications

The basic idea of neural networks is that they perform pattern recognition much as a dermatologist learns patterns from visual data, Dr. Novoa says.

Deep neural networks and artificial intelligence may have a growing role in dermatology to help dermatologists provide better care, but not to replace the need for human expertise. While the computer algorithms might be used to triage or screen patients, ultimately a healthcare professional will take responsibility (and liability) for the care, he says.

Algorithms, though, still need improvement.

“If there are biases in the dataset, these can be introduced into the algorithm results. So, for a long time there is going to be a need for supervision of these results in order to have things work optimally,” he says. “For example, if there’s a ruler in the image, the algorithm is more likely to call it cancer. Why? Because on average, images in our dataset that have rulers are more likely to be malignant. That’s just a small example of the kinds of biases that can be introduced into the data.”
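
The ruler example is a classic spurious correlation, and it is easy to see with toy numbers. The counts below are invented purely to illustrate how such a bias shows up in a dataset; they are not from the Stanford data.

```python
# Illustrative toy data: if malignant images disproportionately contain rulers,
# a naive model can learn "ruler present" as a shortcut for "cancer".
from collections import Counter

# (has_ruler, biopsy_result) pairs for an imaginary, biased dataset
examples = ([(True, "malignant")] * 70 + [(True, "benign")] * 10
            + [(False, "malignant")] * 30 + [(False, "benign")] * 90)

counts = Counter(examples)
p_malignant_given_ruler = counts[(True, "malignant")] / (
    counts[(True, "malignant")] + counts[(True, "benign")])
p_malignant_no_ruler = counts[(False, "malignant")] / (
    counts[(False, "malignant")] + counts[(False, "benign")])

print(f"P(malignant | ruler)    = {p_malignant_given_ruler:.2f}")   # 0.88 in this toy data
print(f"P(malignant | no ruler) = {p_malignant_no_ruler:.2f}")      # 0.25 in this toy data
```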

The near future of the technology has far-reaching implications, including the potential development of a smartphone-compatible algorithm with which consumers could capture an image of a lesion and the phone would screen it for skin cancer probability.

“Technology has been changing medicine and all of human endeavors for hundreds of years, but it hasn’t eliminated the need for doctors,” Dr. Novoa says. “I don’t think we’re going anywhere. I think dermatologists will be able to adapt and take advantage of these technologies to take care of patients.”

REFERENCES

Esteva A, Kuprel B, Novoa RA, et al. “Dermatologist-level classification of skin cancer with deep neural networks,” Nature. Feb. 2, 2017. DOI: 10.1038/nature21056.

Finlay AY, Hammond P. “Expert systems in dermatology: the computer potential. The example of facial tumour diagnosis,” Dermatologica. 1986.

 
