Sparse coding has been applied to visual tracking and related vision problems with demonstrated success in recent years. Existing tracking methods based on local sparse coding sample patches from a target candidate and sparsely encode these using a dictionary consisting of patches sampled from target template images. The discriminative strength of such methods is limited because spatial structure constraints among the template patches are not exploited. To address this problem, we propose a structure-aware local sparse coding algorithm, which encodes a target candidate using templates under both global and local sparsity constraints. For robust tracking, we show that the local regions of a candidate should be encoded only with the corresponding local regions of those target templates that are most similar from the global view. Thus, a more precise and discriminative sparse representation is obtained to account for appearance changes. To alleviate tracking drift, we design an effective template update scheme. Extensive experiments on challenging image sequences demonstrate the effectiveness of the proposed algorithm against numerous state-of-the-art methods.
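The local sparse coding step the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact optimization: the dictionary `D` of vectorized template patches and the ISTA solver used here for the l1-regularized encoding are assumptions chosen for clarity, and the global/local structure constraints of the proposed method are omitted.

```python
import numpy as np

def ista(D, y, lam=0.1, n_iter=200):
    """Sparsely encode a candidate patch y over a template-patch dictionary D
    by solving min_x 0.5*||y - D x||^2 + lam*||x||_1 with ISTA.
    D: (p, k) matrix whose columns are vectorized template patches.
    y: (p,) vectorized candidate patch."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)              # gradient of the quadratic term
        z = x - grad / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x
```

With a well-conditioned dictionary, the recovered code concentrates its mass on the template patches most similar to the candidate, which is the property local sparse coding trackers exploit when pooling the codes into an appearance representation.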
DOI: http://dx.doi.org/10.1109/TIP.2018.2797482
Vis Neurosci
December 2024
Division of Psychology, University of Stirling, Stirling, UK.
Sparse coding theories suggest that the visual brain is optimized to encode natural visual stimuli so as to minimize metabolic cost. It is thought that images that do not share the statistical properties of natural images cannot be coded efficiently and therefore cause visual discomfort. Conversely, artworks are thought to be processed even more efficiently than natural images and so are esthetically pleasing.
J Eat Disord
December 2024
Department of Anesthesia, Critical Care and Pain Medicine, Massachusetts General Hospital, Harvard Medical School, Boston, MA, USA.
Objective: Night eating syndrome (NES) is an eating disorder characterized by evening hyperphagia. Despite having a prevalence comparable to some other eating disorders, NES remains sparsely investigated and poorly characterized. The present study examined the phenotypic and genetic associations for NES in the clinical Mass General Brigham Biobank.
Variational autoencoders (VAEs) employ Bayesian inference to interpret sensory inputs, mirroring processes that occur in primate vision across both ventral (Higgins et al., 2021) and dorsal (Vafaii et al., 2023) pathways.
Proc Natl Acad Sci U S A
December 2024
Committee on Computational Neuroscience, Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL 60637.
Everything that the brain sees must first be encoded by the retina, which maintains a reliable representation of the visual world across many different, complex natural scenes while also adapting to stimulus changes. This study quantifies whether and how the brain selectively encodes stimulus features about scene identity in complex naturalistic environments. While a wealth of previous work has examined the static and dynamic features of the population code in retinal ganglion cells (RGCs), less is known about how populations form both flexible and reliable encodings of natural moving scenes.
eLife
December 2024
Department of Brain Sciences, Weizmann Institute of Science, Rehovot, Israel.
Studying and understanding the code of large neural populations hinge on accurate statistical models of population activity. A novel class of models, based on learning to weigh sparse nonlinear Random Projections (RP) of the population, has demonstrated high accuracy, efficiency, and scalability. Importantly, these RP models have a clear and biologically plausible implementation as shallow neural networks.
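The feature-construction step of such RP models can be sketched as below. This is a toy illustration under stated assumptions: the function name, the binary step nonlinearity, and the fixed fan-in connectivity are choices made here for concreteness, not the paper's exact architecture. The model then learns weights over these features (e.g., a maximum-entropy form P(x) proportional to exp of the weighted feature sum), which is omitted.

```python
import numpy as np

def rp_features(X, n_proj=50, conn=3, theta=1, seed=0):
    """Sparse nonlinear random projections of binary population activity.
    Each projection sums the spikes of a small random subset ('conn' neurons)
    of the population and applies a step nonlinearity at threshold 'theta' --
    directly implementable as one layer of a shallow neural network.
    X: (n_samples, n_neurons) binary spike patterns."""
    rng = np.random.default_rng(seed)
    n_neurons = X.shape[1]
    A = np.zeros((n_proj, n_neurons))
    for i in range(n_proj):
        idx = rng.choice(n_neurons, size=conn, replace=False)
        A[i, idx] = 1.0  # sparse, biologically plausible connectivity
    return (X @ A.T > theta).astype(float)  # (n_samples, n_proj) binary features
```

Because each projection touches only a few neurons and the nonlinearity is a simple threshold, the feature map scales to large populations while remaining interpretable as a shallow network.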