An expectation-maximization approach to quantifying protein stoichiometry with single-molecule imaging.

Bioinform Adv

Department of Chemical and Physical Sciences, University of Toronto Mississauga, Mississauga, ON L5L 1C6, Canada.

Published: November 2021

Motivation: Single-molecule localization microscopy (SMLM) is a super-resolution technique capable of rendering nanometer-scale images of cellular structures. Recently, much effort has gone into developing algorithms for extracting quantitative features from SMLM datasets, such as the abundance and stoichiometry of macromolecular complexes. These algorithms often require knowledge of the complicated photophysical properties of photoswitchable fluorophores.

Results: Here, we develop a calibration-free approach to quantitative SMLM built upon the observation that most photoswitchable fluorophores emit a geometrically distributed number of blinks before photobleaching. From a statistical model of a mixture of monomers, dimers and trimers, the method employs an adapted expectation-maximization algorithm to learn the protomer fractions while simultaneously determining the single-fluorophore blinking distribution. To illustrate the utility of our approach, we benchmark it on both simulated datasets and experimental datasets assembled from SMLM images of fluorescently labeled DNA nanostructures.

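To make the approach concrete, the sketch below illustrates the kind of EM scheme the abstract describes; it is not the authors' released code. It assumes each fluorophore emits a Geometric(q) number of blinks on {1, 2, ...}, so an n-mer's total blink count is a sum of n such draws (a shifted negative binomial), and EM alternates between assigning clusters to monomer/dimer/trimer components and re-estimating the protomer fractions together with the shared parameter q. All function and variable names are hypothetical.

```python
# Minimal EM sketch for a monomer/dimer/trimer mixture with a shared
# geometric blinking parameter q. Illustrative only, not the paper's code.
import numpy as np
from scipy.stats import nbinom

def em_blink_mixture(blinks, n_iter=200, tol=1e-8):
    """EM for a mixture of n-mers (n = 1, 2, 3) whose total blink count is
    the sum of n Geometric(q) draws on {1, 2, ...}, i.e. a shifted
    negative binomial. Returns (protomer fractions, q)."""
    k = np.asarray(blinks, dtype=float)
    ns = np.array([1, 2, 3])
    w = np.full(3, 1.0 / 3.0)          # initial protomer fractions
    q = 0.5                            # initial blinking parameter
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of each cluster for each n-mer component
        pmf = np.zeros((k.size, 3))
        for j, n in enumerate(ns):
            valid = k >= n             # an n-mer emits at least n blinks
            pmf[valid, j] = nbinom.pmf(k[valid] - n, n, q)
        joint = pmf * w
        norm = joint.sum(axis=1, keepdims=True)
        norm[norm == 0] = 1e-300
        resp = joint / norm
        # M-step: update mixture weights and the shared geometric parameter
        w = resp.mean(axis=0)
        q = (resp * ns).sum() / (resp * k[:, None]).sum()
        # Stop when the log-likelihood has converged
        ll = np.log(norm).sum()
        if ll - ll_old < tol:
            break
        ll_old = ll
    return w, q

if __name__ == "__main__":
    # Toy demo: 60% monomers, 30% dimers, 10% trimers with q = 0.4
    rng = np.random.default_rng(0)
    true_w, true_q = np.array([0.6, 0.3, 0.1]), 0.4
    sizes = rng.choice([1, 2, 3], size=5000, p=true_w)
    counts = np.array([rng.geometric(true_q, size=n).sum() for n in sizes])
    print(em_blink_mixture(counts))
```

Treating the per-fluorophore blink count as geometric means no separate photophysical calibration enters the model: the same parameter q is learned jointly with the mixture weights, which is the calibration-free idea the abstract highlights.
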
Availability And Implementation: An implementation of our algorithm written in Python is available at: https://www.utm.utoronto.ca/milsteinlab/resources/Software/MMCode/.

Supplementary Information: Supplementary data are available at Bioinformatics Advances online.

Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9710618
DOI: http://dx.doi.org/10.1093/bioadv/vbab032
