Aims: Machine learning (ML) binary classification in diagnostic histopathology is an area of intense investigation. Several assumptions, including those about training image quality/format and the number of training images required, recur across many studies despite a paucity of supporting evidence. We empirically compared training image file type, training set size, and two common convolutional neural networks (CNNs) trained via transfer learning (ResNet50 and SqueezeNet).
Methods And Results: Thirty haematoxylin and eosin (H&E)-stained slides with carcinoma or normal tissue from three tissue types (breast, colon, and prostate) were photographed, generating 3000 partially overlapping images (1000 per tissue type). These lossless Portable Network Graphics (PNG) images were converted to lossy Joint Photographic Experts Group (JPEG/JPG) images. Tissue type-specific binary classification ML models were developed with all PNG or JPG images, and repeated with subsets of 500, 200, 100, 50, 30, and 10 images. Eleven replicate models were generated for each combination of tissue type, training-set size, file type, and CNN, yielding 924 models in total (11 × 7 training-set sizes × 2 file types × 2 CNNs × 3 tissue types). Internal accuracies and generalisation accuracies were compared. There was no meaningful difference in accuracy between PNG-trained and JPG-trained models. Models trained with more images did not invariably perform better. ResNet50 typically outperformed SqueezeNet. Models were generalisable within a tissue type but not across tissue types.
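The methods describe two reproducible steps: converting lossless PNG tiles to lossy JPG, and fine-tuning ImageNet-pretrained CNNs for binary (carcinoma vs. normal) classification. The following is a minimal sketch, not the authors' code: the paper does not state its software environment, so a Python stack with Pillow and torchvision is assumed, and the function names, directory layout, and JPEG quality setting are illustrative.

```python
# Hypothetical sketch of the two described steps; not the study's actual code.
from pathlib import Path

from PIL import Image
import torch.nn as nn
from torchvision import models


def convert_png_to_jpg(png_dir: str, jpg_dir: str, quality: int = 75) -> None:
    """Save each lossless PNG tile as a lossy JPG (quality is an assumption)."""
    out = Path(jpg_dir)
    out.mkdir(parents=True, exist_ok=True)
    for png_path in Path(png_dir).glob("*.png"):
        img = Image.open(png_path).convert("RGB")  # drop alpha; JPEG has none
        img.save(out / (png_path.stem + ".jpg"), quality=quality)


def make_binary_resnet50() -> nn.Module:
    """Transfer learning: reuse ImageNet weights, replace the final layer."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: carcinoma/normal
    return model


def make_binary_squeezenet() -> nn.Module:
    """Same idea for SqueezeNet, whose head is a 1x1 conv, not a Linear layer."""
    model = models.squeezenet1_0(
        weights=models.SqueezeNet1_0_Weights.IMAGENET1K_V1
    )
    model.classifier[1] = nn.Conv2d(512, 2, kernel_size=1)
    model.num_classes = 2
    return model


# Study-design arithmetic from the abstract: 11 replicate models per
# combination of 7 training-set sizes (1000, 500, 200, 100, 50, 30, 10),
# 2 file types, 2 CNNs, and 3 tissue types.
assert 11 * 7 * 2 * 2 * 3 == 924
```

Replacing only the classifier head while keeping the pretrained feature extractor is the standard transfer-learning recipe for small histopathology training sets like those compared here.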
Conclusions: Lossy JPG images were not inferior to lossless PNG images in our models. Large numbers of unique H&E-stained slides were not required for training optimal ML models. This reinforces the need for an evidence-based approach to best practices for histopathological ML.
Full text (PMC): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6591093
DOI: http://dx.doi.org/10.1111/his.13844