Visible and subvisible particles are a quality attribute of sterile pharmaceutical samples. A common approach to characterizing and quantifying particulates in pharmaceutical samples is to image many individual particles with high-throughput instrumentation and analyze the resulting population data. The analysis covers conventional metrics such as the particle size distribution, but it can be made more sophisticated by interpreting additional visual and morphological features. To avoid the hurdles of building new image analysis models capable of extracting such features from scratch, we propose using well-established pretrained deep learning image analysis models such as EfficientNet. We demonstrate that such models are useful as a prescreening tool for high-level characterization of biopharmaceutical particle image data. We show that although these models were originally trained for entirely different tasks (such as the classification of everyday objects in the ImageNet database), the visual feature vectors they extract can be useful for studying different types of subvisible particles. This applicability is illustrated through multiple case studies: (i) particle risk assessment in prefilled syringe formulations containing different particle types such as silicone oil, (ii) method comparability, using accelerated forced degradation as an example, and (iii) excipient influence on particle morphology, using Polysorbate 80 (PS80) as an example. To demonstrate the method-agnostic applicability of pretrained models, we also apply the approach to two high-throughput microscopy methods: micro-flow imaging and backgrounded membrane imaging. We show that particle populations with distinct morphological and visual features can be identified across samples by leveraging out-of-the-box pretrained models to analyze the images from each sample.
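As a concrete illustration of the feature-extraction step described above, the sketch below shows how an ImageNet-pretrained EfficientNet can be used out of the box to turn individual particle images into visual feature vectors. This is a minimal sketch assuming PyTorch/torchvision and the EfficientNet-B0 variant; the abstract does not prescribe a specific framework or model size, and the file names and downstream clustering step are illustrative only.

```python
# Minimal sketch (assumption: torchvision's EfficientNet-B0 as the pretrained
# backbone; the paper does not specify the framework or EfficientNet variant).
import torch
import torchvision.models as models
from PIL import Image

# Load an ImageNet-pretrained EfficientNet and drop its classification head
# so the network outputs a pooled visual feature vector instead of class scores.
weights = models.EfficientNet_B0_Weights.IMAGENET1K_V1
backbone = models.efficientnet_b0(weights=weights)
backbone.classifier = torch.nn.Identity()  # keep the 1280-dim pooled features
backbone.eval()

preprocess = weights.transforms()  # resize/normalize as the model expects

def extract_features(image_paths):
    """Return one feature vector per particle image (rows = particles)."""
    feats = []
    with torch.no_grad():
        for path in image_paths:
            img = Image.open(path).convert("RGB")
            x = preprocess(img).unsqueeze(0)       # shape: (1, 3, H, W)
            feats.append(backbone(x).squeeze(0))   # shape: (1280,)
    return torch.stack(feats)

# Hypothetical usage: the resulting feature vectors can be clustered or
# projected (e.g., PCA/UMAP) to compare particle populations across samples.
# features = extract_features(["particle_0001.png", "particle_0002.png"])
```

Because the backbone is used as-is, no task-specific training is required; only the classification head is bypassed so that the pooled activations serve as generic visual descriptors of each particle image.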
DOI: 10.1002/bit.28488