The dysfunction of α and β cells in pancreatic islets can lead to diabetes, yet many questions remain about the subcellular organization of islet cells during disease progression. Existing three-dimensional cellular mapping approaches face challenges such as time-intensive sample sectioning and subjective cellular identification. To address these challenges, we developed a subcellular feature-based classification approach that allows us to identify α and β cells and quantify their subcellular structural characteristics using soft X-ray tomography (SXT). We observed significant differences in whole-cell morphological and organelle statistics between the two cell types. Additionally, we characterized subtle biophysical differences between individual insulin and glucagon vesicles by analyzing vesicle size and molecular density distributions, an analysis that was not previously possible with other methods. These sub-vesicular parameters enabled us to predict cell types systematically using supervised machine learning. We also visualized distinct vesicle and cell subtypes using Uniform Manifold Approximation and Projection (UMAP) embeddings, providing a new way to explore structural heterogeneity in islet cells. This methodology offers an innovative approach for tracking biologically meaningful heterogeneity in cells and can be applied to any cellular system.
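As a rough illustration of the kind of analysis the abstract describes, the sketch below trains a supervised classifier on per-cell vesicle statistics and embeds the same features with UMAP. It is not the authors' code: the feature names, simulated values, and model choice (a random forest) are assumptions made for illustration only.

```python
# Minimal sketch (not the published pipeline): supervised cell-type prediction
# from sub-vesicular statistics, plus a UMAP embedding of the same features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
import umap  # pip install umap-learn

rng = np.random.default_rng(0)

# Hypothetical per-cell feature table: mean vesicle diameter, mean molecular
# density, and vesicle count. Real values would come from SXT reconstructions.
n_cells = 120
features = np.column_stack([
    rng.normal(300, 40, n_cells),     # mean vesicle diameter (nm), placeholder
    rng.normal(0.35, 0.05, n_cells),  # mean molecular density, placeholder
    rng.integers(200, 800, n_cells),  # vesicles per cell, placeholder
])
labels = rng.integers(0, 2, n_cells)  # 0 = alpha, 1 = beta (simulated labels)

# Supervised prediction of cell type from the vesicle-derived features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Cross-validated accuracy:", cross_val_score(clf, features, labels, cv=5).mean())

# Low-dimensional embedding to visualize structural heterogeneity across cells.
embedding = umap.UMAP(n_neighbors=15, min_dist=0.1, random_state=0).fit_transform(features)
print("UMAP embedding shape:", embedding.shape)
```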
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11119489 | PMC |
| http://dx.doi.org/10.3390/cells13100869 | DOI Listing |
Cells, May 2024. Department of Chemistry, Bridge Institute, Michelson Center for Convergent Bioscience, University of Southern California, Los Angeles, CA 90089, USA.
Nat Commun, September 2023. Department of Structural Biology, St. Jude Children's Research Hospital, Memphis, TN, USA.
Sci Rep, March 2023. Department of Electrical and Computer Engineering, University of California, Santa Barbara, USA.
This paper presents a method for time-lapse 3D cell analysis. Specifically, we consider the problems of accurately localizing and quantitatively analyzing sub-cellular features and of tracking individual cells across time-lapse 3D confocal image stacks. The heterogeneity of cells and the volume of multi-dimensional images present a major challenge for fully automated analysis of cell morphogenesis and development.
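The paper's own pipeline is not reproduced here, but the sketch below illustrates one step that any time-lapse tracking workflow needs: linking segmented cell centroids between consecutive 3D frames by nearest neighbor. The function name, distance threshold, and toy coordinates are hypothetical.

```python
# Hypothetical frame-to-frame linking of cell centroids; real pipelines add
# gating, track splitting/merging, and gap closing on top of a step like this.
import numpy as np
from scipy.spatial import cKDTree

def link_frames(centroids_t, centroids_t1, max_dist=10.0):
    """Match each centroid at time t to its nearest neighbor at time t+1.

    Returns (index_t, index_t1) pairs whose distance is below max_dist,
    in the same units as the centroid coordinates (e.g. microns).
    """
    tree = cKDTree(centroids_t1)
    dist, idx = tree.query(centroids_t, distance_upper_bound=max_dist)
    return [(i, int(j)) for i, (d, j) in enumerate(zip(dist, idx)) if np.isfinite(d)]

# Toy example: three cells drifting slightly between two frames.
frame_t = np.array([[10.0, 5.0, 2.0], [30.0, 8.0, 2.5], [55.0, 20.0, 3.0]])
frame_t1 = frame_t + np.array([1.0, -0.5, 0.2])
print(link_frames(frame_t, frame_t1))
```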
BMC Bioinformatics, October 2019. School of Communications and Electronics, Jiangxi Science & Technology Normal University, Nanchang, 330003, China.
Background: Protein subcellular localization plays a crucial role in understanding cell function. Proteins need to be in the right place at the right time and to combine with the corresponding molecules to fulfill their functions. Furthermore, prediction of protein subcellular location not only plays a guiding role in drug design and development, by pointing to potential molecular targets, but also plays an essential role in genome annotation.
BMC Bioinformatics, January 2019. Erasmus Optical Imaging Centre, Erasmus MC, Wytemaweg 80, 3015 CN, Rotterdam, The Netherlands.
Background: Single-molecule localization microscopy is a super-resolution microscopy technique that allows nanoscale determination of the localization and organization of proteins in biological samples. For biological interpretation of the data, it is essential to extract quantitative information from the super-resolution data sets. Because of the complexity and size of these data sets, flexible and user-friendly software is required.