Cellular microscopy imaging is a common form of data acquisition for biological experimentation. Observation of gray-level morphological features allows for the inference of useful biological information such as cellular health and growth status. Cellular colonies can contain multiple cell types, making colony-level classification very difficult. Additionally, cell types growing in a hierarchical, downstream fashion can often look visually similar despite being biologically distinct. In this paper, it is determined empirically that traditional deep Convolutional Neural Networks (CNNs) and classical object-recognition techniques are not sufficient to distinguish these subtle visual differences, resulting in misclassifications. Instead, Triplet-net CNN learning is employed in a hierarchical classification scheme to improve the model's ability to discern distinct, fine-grained features of two commonly confused morphological image-patch classes, namely Dense and Spread colonies. The Triplet-net method improves classification accuracy over a four-class deep neural network by ~3%, a difference determined to be statistically significant, and also outperforms existing state-of-the-art image-patch classification approaches and standard template matching. These findings enable accurate classification of multi-class cell colonies with contiguous boundaries, and increase the reliability and efficiency of automated, high-throughput experimental quantification using non-invasive microscopy.
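The paper's exact network is not reproduced here, but the triplet learning it builds on rests on a standard triplet margin loss: an anchor patch is pulled toward an embedding of the same class (positive) and pushed away from a confusable class (negative) by at least a margin. A minimal sketch, using NumPy for illustration (the embeddings and names below are hypothetical, not from the paper):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet margin loss: encourage the anchor to be closer to the
    positive (same-class) embedding than to the negative (other-class)
    embedding by at least `margin`."""
    d_pos = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_neg = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return max(d_pos - d_neg + margin, 0.0)

# Toy 2-D embeddings: a "Dense" anchor should land nearer another
# Dense patch than a visually similar "Spread" patch.
anchor   = np.array([0.0, 0.0])   # Dense patch
positive = np.array([0.1, 0.0])   # another Dense patch
negative = np.array([0.9, 0.0])   # confusable Spread patch
print(triplet_loss(anchor, positive, negative))  # 0.1 - 0.9 + 1.0 = 0.2
```

Training on such triplets shapes an embedding space where the two commonly confused classes separate, after which a simple classifier over the embeddings can resolve the fine-grained distinction.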

Source: DOI 10.1109/TCBB.2023.3247957 (http://dx.doi.org/10.1109/TCBB.2023.3247957)
