Changes in body mass are key indicators of health in humans and animals and are routinely monitored in animal husbandry and preclinical studies. In rodent studies, the current method of manually weighing the animal on a balance causes at least two issues. First, directly handling the animal induces stress, possibly confounding studies. Second, these data are static, limiting continuous assessment and obscuring rapid changes. A non-invasive, continuous method of monitoring animal mass would have utility in multiple biomedical research areas. We combine computer vision with statistical modeling to demonstrate the feasibility of determining mouse body mass by using video data. Our methods determine mass with a 4.8% error across genetically diverse mouse strains with varied coat colors and masses. This error is low enough to replace manual weighing in most mouse studies. We conclude that visually determining rodent mass enables non-invasive, continuous monitoring, improving preclinical studies and animal welfare.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11573914
DOI: http://dx.doi.org/10.1016/j.patter.2024.101039
Sci Rep
December 2024
Computer Science Department, Saarland University, Saarbrücken, Germany.
Estimating the numbers and whereabouts of internally displaced people (IDP) is paramount to providing targeted humanitarian assistance. In conflict settings like the ongoing Russia-Ukraine war, on-the-ground data collection is nevertheless often inadequate to provide accurate and timely information. Satellite imagery may sidestep some of these challenges and enhance our understanding of the IDP dynamics.
Sci Rep
December 2024
Department of Computer Science, Birzeit University, P.O. Box 14, Birzeit, West Bank, Palestine.
Accurate classification of logos is a challenging task in image recognition due to variations in logo size, orientation, and background complexity. Deep learning models, such as VGG16, have demonstrated promising results in handling such tasks. However, their performance is highly dependent on optimal hyperparameter settings, whose fine-tuning is both labor-intensive and time-consuming.
Sci Rep
December 2024
Department of Electrical Engineering, College of Engineering, Taif University, P.O. BOX 11099, 21944, Taif, Saudi Arabia.
Weather recognition is crucial due to its significant impact on various aspects of daily life, such as weather prediction, environmental monitoring, tourism, and energy production. Several studies have already investigated image-based weather recognition. However, previous work has covered only a few types of weather phenomena and recognized them from images with insufficient accuracy.
Sci Rep
December 2024
School of Artificial Intelligence and Computer Science, Jiangnan University, Wuxi, 214122, China.
The unknown boundary between the superior computational capability of deep neural networks (DNNs) and human cognitive ability has become a crucial and foundational theoretical problem in the evolution of AI. Undoubtedly, DNN-empowered AI is increasingly surpassing human intelligence in handling general intelligent tasks. However, DNNs' lack of interpretability and their recurrent erratic behavior remain incontrovertible facts.
Nat Commun
December 2024
Department of Computer Science, The University of Hong Kong, Pokfulam Rd, Hong Kong SAR, China.
Proper exposure settings are crucial for modern machine vision cameras to accurately convert light into clear images. However, traditional auto-exposure solutions are vulnerable to illumination changes, interrupting the continuous acquisition of unsaturated images and significantly degrading the overall performance of the underlying intelligent systems. Here we present the neuromorphic exposure control (NEC) system.