Human decision making often relies on visual information gathered from different views or perspectives. In machine-learning-based image classification, however, we typically infer an object's class from a single image showing the object. Especially for challenging classification problems, the visual information conveyed by a single image may be insufficient for an accurate decision. We propose a classification scheme that fuses visual information captured in images depicting the same object from multiple perspectives. Convolutional neural networks are used to extract and encode visual features from the multiple views, and we propose strategies for fusing this information. More specifically, we investigate the following three strategies: (1) fusing convolutional feature maps at differing network depths; (2) fusion of bottleneck latent representations prior to classification; and (3) score fusion. We systematically evaluate these strategies on three datasets from different domains. Our findings emphasize the benefit of integrating information fusion into the network rather than performing it by post-processing of classification scores. Furthermore, we demonstrate through a case study that already trained networks can be easily extended with the best fusion strategy, outperforming other approaches by a large margin.
Full-text sources: PMC (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7802953) | PLOS (http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0245230)
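The fusion strategies named in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the feature dimensions, the random stand-in encoder, and the use of simple averaging as the fusion operator are all assumptions made for the sake of the example. Strategy (1), fusing intermediate convolutional feature maps, follows the same pattern as (2) but applies it before the deeper layers rather than at the bottleneck.

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)


# Toy setup: random weights stand in for a trained CNN; the shapes
# below are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)
n_views, feat_dim, n_classes = 3, 8, 4
W_cls = rng.normal(size=(feat_dim, n_classes))  # shared classifier head


def encode(view_image):
    """Stand-in for a CNN bottleneck embedding of one view."""
    return view_image.mean(axis=0)  # collapse rows to a feat_dim vector


views = [rng.normal(size=(5, feat_dim)) for _ in range(n_views)]
feats = np.stack([encode(v) for v in views])  # (n_views, feat_dim)

# Strategy (2): bottleneck fusion - combine latent representations
# of all views into one vector, then classify once.
fused_feat = feats.mean(axis=0)
probs_feature_fusion = softmax(fused_feat @ W_cls)  # (n_classes,)

# Strategy (3): score fusion - classify each view independently,
# then average the resulting class probabilities.
per_view_probs = softmax(feats @ W_cls)  # (n_views, n_classes)
probs_score_fusion = per_view_probs.mean(axis=0)  # (n_classes,)
```

Both outputs are valid probability distributions over the classes; the paper's finding is that fusing inside the network (as in strategy 2, or earlier as in strategy 1) outperforms post-hoc score fusion (strategy 3).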
Brain Struct Funct
January 2025
Department of Biomedical Engineering, College of Chemistry and Life Sciences, Beijing University of Technology, Beijing, 100124, China.
The brain undergoes atrophy and cognitive decline with advancing age. Brain age prediction is an emerging methodology for examining brain aging. This study aims to develop a deep learning model with high predictive accuracy and interpretability for brain age prediction tasks.
Toxicol Pathol
January 2025
Charles River Laboratories, Edinburgh, UK.
Thyroid tissue is sensitive to the effects of endocrine disrupting substances, and this represents a significant health concern. Histopathological analysis of tissue sections of the rat thyroid gland remains the gold standard for the evaluation of agrochemical effects on the thyroid. However, there is a high degree of variability in the appearance of the rat thyroid gland, and toxicologic pathologists often struggle to decide on and consistently apply a threshold for recording low-grade thyroid follicular hypertrophy.
BMC Bioinformatics
January 2025
Department of Biophysics, Faculty of Biological Sciences, Tarbiat Modares University, Tehran, 14115-111, Iran.
J Imaging Inform Med
January 2025
Faculty of Medicine and Pharmacy of Rabat, Mohammed V University of Rabat, Rabat, 10000, Morocco.
Gastrointestinal (GI) disease examination presents significant challenges to doctors due to the intricate structure of the human digestive system. Colonoscopy and wireless capsule endoscopy are the most commonly used tools for GI examination. However, the large amount of data generated by these technologies requires the expertise and intervention of doctors for disease identification, making manual analysis a very time-consuming task.
J Pharm Sci
January 2025
Department of Chemical and Biological Engineering, University of Colorado Boulder, Boulder, Colorado 80303.
Polysorbate 20 (PS20) is commonly used as an excipient in therapeutic protein formulations. However, over the course of a therapeutic protein product's shelf life, minute amounts of co-purified host-cell lipases may cause slow hydrolysis of PS20, releasing fatty acids (FAs). These FAs may precipitate to form subvisible particles that can be detected and imaged by various techniques.