
Communications of the ACM

ACM Careers

AI Algorithm Accurately Detects Prostate Cancer



Prostate diagnostic core needle biopsy with cancer heatmap, with red showing high probability of cancer.

Credit: Ibex Medical Analytics

Researchers from UPMC and the University of Pittsburgh have demonstrated an artificial intelligence algorithm with the highest accuracy to date of detecting and characterizing prostate cancer. They describe their work in "An Artificial Intelligence Algorithm for Prostate Cancer Diagnosis in Whole Slide Images of Core Needle Biopsies: A Blinded Clinical Validation and Deployment Study," published in The Lancet Digital Health.

"Humans are good at recognizing anomalies, but they have their own biases or past experience," says senior author Rajiv Dhir, M.D., chief pathologist and vice chair of pathology at UPMC Shadyside and professor of biomedical informatics at Pitt. "Machines are detached from the whole story. There's definitely an element of standardizing care." 

To train the AI to recognize prostate cancer, Dhir and his colleagues provided images from more than a million parts of stained tissue slides taken from patient biopsies. Each image was labeled by expert pathologists to teach the AI how to discriminate between healthy and abnormal tissue. The algorithm was then tested on a separate set of 1,600 slides taken from 100 consecutive patients seen at UPMC for suspected prostate cancer. 

During testing, the AI demonstrated 98% sensitivity and 97% specificity at detecting prostate cancer — significantly higher than previously reported for algorithms working from tissue slides. 
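For readers unfamiliar with the two metrics, sensitivity is the fraction of truly cancerous slides the algorithm flags, while specificity is the fraction of benign slides it correctly clears. A minimal sketch of the arithmetic (the slide counts below are illustrative, not the study's actual breakdown):

```python
def sensitivity(tp, fn):
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical example: 490 of 500 cancerous slides detected,
# 1,067 of 1,100 benign slides correctly cleared.
print(sensitivity(490, 10))                 # 0.98
print(round(specificity(1067, 33), 2))      # 0.97
```

Note that both numbers matter clinically: high sensitivity limits missed cancers, while high specificity limits false alarms that would trigger unnecessary follow-up biopsies.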

The algorithm is also the first to extend beyond cancer detection, reporting high performance for tumor grading, sizing, and invasion of the surrounding nerves. These are all clinically important features required as part of the pathology report. 

The AI also flagged six slides that were not noted by the expert pathologists. 

This doesn't necessarily mean that the machine is superior to humans, Dhir says. For example, in the course of evaluating these cases, a pathologist could have simply seen enough evidence of malignancy elsewhere in the patient's samples to recommend treatment. For less experienced pathologists, though, the algorithm could act as a failsafe to catch cases that might otherwise be missed.

"Algorithms like this are especially useful in lesions that are atypical," Dhir says. "A non-specialized person may not be able to make the correct assessment. That's a major advantage of this kind of system." 

While these results are promising, Dhir cautions that new algorithms will have to be trained to detect different types of cancer, since pathology markers aren't universal across all tissue types. But he sees no reason the approach couldn't be adapted to other cancers, such as breast cancer. 

Additional authors on the study include Liron Pantanowitz, M.B.B.Ch., of the University of Michigan; Gabriela Quiroga-Garza, M.D., of UPMC; Lilach Bien, Ronen Heled, Daphna Laifenfeld, Ph.D., Chaim Linhart, Judith Sandbank, M.D., and Manuela Vecsler, of Ibex Medical Analytics; Anat Albrecht-Shach, M.D., of Shamir Medical Center; Varda Shalev, M.D., M.P.A., of Maccabbi Healthcare Services; and Pamela Michelow, M.S., and Scott Hazelhurst, Ph.D., of the University of the Witwatersrand. 

Funding for this study was provided by Ibex, which also created this commercially available algorithm. Pantanowitz, Shalev, and Albrecht-Shach report fees paid by Ibex, and Pantanowitz and Shalev serve on the medical advisory board. Bien and Linhart are authors on pending patents US 62/743,559 and US 62/981,925. Ibex had no influence over the design of the study or the interpretation of the results.


 
