We propose a novel method for applying active learning strategies to interactive 3D image segmentation. Active learning has recently been introduced to the field of image segmentation, but discussions have so far focused on 2D images only. Here, we frame interactive 3D image segmentation as a classification problem and incorporate active learning to relieve the user of choosing where to provide interactive input. Specifically, we evaluate a given segmentation by constructing an "uncertainty field" over the image domain based on boundary, regional, smoothness, and entropy terms. We then calculate and highlight the plane of maximal uncertainty in a batch query step. The user can then guide the labeling of the data on the query plane, actively providing additional training data where the classifier has the least confidence. We validate our method against random plane selection, showing an average DSC improvement of 10% over the first five plane suggestions (batch queries). Furthermore, our user study shows that our method saves the user 64% of their time, on average.
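To make the query step concrete, here is a minimal sketch of the uncertainty-field construction and batch plane query described above. It assumes the classifier outputs a per-voxel foreground probability map; the function names and the concrete definitions of the boundary, regional, smoothness, and entropy terms are hypothetical stand-ins rather than the paper's exact formulation, and the search is restricted to axis-aligned planes for simplicity, whereas the method may optimize over arbitrarily oriented planes.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def entropy_term(p, eps=1e-12):
    """Per-voxel binary entropy of the foreground probability map p in [0, 1].
    Entropy peaks at p = 0.5, where the classifier is least confident."""
    return -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))


def uncertainty_field(prob, image, weights=(1.0, 1.0, 1.0, 1.0)):
    """Combine boundary, regional, smoothness, and entropy terms into one
    scalar uncertainty value per voxel. The term definitions and weights
    here are plausible stand-ins, not the paper's exact energies."""
    w_b, w_r, w_s, w_e = weights

    # Boundary term: high where the image gradient is weak, i.e. there is
    # no strong edge for the segmentation boundary to lock onto.
    grad = np.stack(np.gradient(image.astype(float)))
    boundary = 1.0 / (1.0 + np.linalg.norm(grad, axis=0))

    # Regional term: high where the probability is far from either
    # confident label (0 or 1).
    regional = 1.0 - 2.0 * np.abs(prob - 0.5)

    # Smoothness term: high where a voxel disagrees with its local
    # neighborhood mean, i.e. the label field is locally inconsistent.
    smoothness = np.abs(prob - uniform_filter(prob, size=3))

    return (w_b * boundary + w_r * regional
            + w_s * smoothness + w_e * entropy_term(prob))


def most_uncertain_axis_aligned_plane(U):
    """Batch query: return (axis, slice_index) of the axis-aligned slice
    with the highest mean uncertainty in the field U."""
    best_axis, best_idx, best_score = 0, 0, -np.inf
    for axis in range(3):
        other = tuple(a for a in range(3) if a != axis)
        scores = U.mean(axis=other)  # mean uncertainty per slice along axis
        idx = int(np.argmax(scores))
        if scores[idx] > best_score:
            best_axis, best_idx, best_score = axis, idx, float(scores[idx])
    return best_axis, best_idx
```

In use, the selected slice would be shown to the user for interactive labeling, the classifier retrained on the new labels, and the field recomputed for the next batch query.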
DOI: http://dx.doi.org/10.1007/978-3-642-23626-6_74