AI Article Synopsis

  • Histopathologic features, particularly the Gleason grading system, play a crucial role in diagnosing, treating, and predicting outcomes in prostate cancer, but the disease shows significant variability in biological behavior.
  • Automated Gleason grading systems have struggled to gain acceptance in the medical community, primarily because of poor interpretability.
  • A new approach uses visually meaningful luminal and architectural features to differentiate low- and high-grade prostate cancer, achieving accuracies of 93.0% in training and 97.6% in testing on the image datasets studied.

Article Abstract

Histopathologic features, particularly the Gleason grading system, have contributed significantly to the diagnosis, treatment, and prognosis of prostate cancer for decades. However, prostate cancer demonstrates enormous heterogeneity in biological behavior; establishing improved prognostic and predictive markers is therefore particularly important for personalizing the therapy of men with clinically localized, newly diagnosed malignancy. Many automated systems have been developed for Gleason grading, but acceptance in the medical community has been lacking due to poor interpretability. To overcome this problem, we developed a set of visually meaningful features to differentiate between low- and high-grade prostate cancer. The feature set consists of luminal and architectural features. For luminal features, we compute: 1) the shortest path from each nucleus to its closest luminal space; 2) the ratio of epithelial nuclei to the total number of nuclei, where a nucleus is considered epithelial if the shortest path between it and the luminal space does not contain any other nucleus; and 3) the average shortest distance of all nuclei to their closest luminal spaces. For architectural features, we compute directional changes in stroma and nuclei using directional filter banks. These features are used to create two subspaces: one for prostate images histopathologically assessed as low grade and the other for high grade. The grade associated with the subspace that yields the minimum reconstruction error is taken as the prediction for the test image. For training, we used 43 region-of-interest (ROI) images extracted from 25 prostate whole-slide images in The Cancer Genome Atlas (TCGA) database. For testing, we used an independent dataset of 88 ROIs extracted from 30 prostate whole-slide images. The method achieved 93.0% training and 97.6% testing accuracy for the spectrum of cases considered.
The application of visually meaningful features provided promising levels of accuracy and consistency for grading prostate cancer.
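The luminal features described in the abstract can be illustrated with a small sketch. This is not the authors' implementation: it assumes nuclei and luminal spaces are reduced to 2-D centroids, approximates the "shortest path" by the straight segment from a nucleus to its nearest lumen, and uses an assumed blocking radius to decide whether another nucleus lies on that path.

```python
import numpy as np

def luminal_features(nuclei, lumens, block_radius=1.0, eps=1e-9):
    """Sketch of the two scalar luminal features.

    nuclei: (N, 2) array of nucleus centroids
    lumens: (M, 2) array of luminal-space centroids
    Returns (epithelial_ratio, mean_shortest_distance).
    The straight-segment path and block_radius are simplifying assumptions.
    """
    # Pairwise distances from every nucleus to every lumen
    dists = np.linalg.norm(nuclei[:, None, :] - lumens[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)       # index of closest lumen per nucleus
    shortest = dists.min(axis=1)         # distance to that lumen

    epithelial = 0
    for i, (n, l_idx) in enumerate(zip(nuclei, nearest)):
        seg = lumens[l_idx] - n
        blocked = False
        for j, other in enumerate(nuclei):
            if j == i:
                continue
            # Closest point on the segment n -> lumen to the other nucleus
            t = np.clip(np.dot(other - n, seg) / (np.dot(seg, seg) + eps), 0, 1)
            if np.linalg.norm(other - (n + t * seg)) < block_radius:
                blocked = True
                break
        if not blocked:
            # No other nucleus on the path: count as epithelial
            epithelial += 1

    return epithelial / len(nuclei), shortest.mean()
```

For two nuclei at (0, 0) and (2, 0) with one lumen at (4, 0), the inner nucleus blocks the outer one, giving an epithelial ratio of 0.5 and a mean shortest distance of 3.0.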
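The classification step — one subspace per grade, with the test image assigned to the grade whose subspace reconstructs it with minimum error — can be sketched with per-class PCA subspaces. The subspace dimension `k` and the use of SVD are assumptions; the paper's exact subspace construction may differ.

```python
import numpy as np

def fit_subspace(X, k):
    """Fit a k-dimensional linear subspace to feature vectors X (n_samples, n_features)."""
    mu = X.mean(axis=0)
    # Right singular vectors of the centered data are the principal directions
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def reconstruction_error(x, mu, basis):
    """Residual norm after projecting x onto the subspace (mu, basis)."""
    coeffs = basis @ (x - mu)
    x_hat = mu + basis.T @ coeffs
    return np.linalg.norm(x - x_hat)

def classify(x, subspaces):
    """Return the grade whose subspace yields minimum reconstruction error."""
    return min(subspaces, key=lambda g: reconstruction_error(x, *subspaces[g]))
```

Usage: fit one subspace per grade from the training feature vectors, e.g. `subspaces = {"low": fit_subspace(X_low, k), "high": fit_subspace(X_high, k)}`, then call `classify(x_test, subspaces)`.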


Source
http://dx.doi.org/10.1109/JBHI.2016.2565515

Publication Analysis

Top Keywords

prostate cancer (20), visually meaningful (16), features (8), prostate (8), grading prostate (8), gleason grading (8), meaningful features (8), architectural features (8), features compute (8), shortest path (8)

