A method for online incremental mining of activity patterns from surveillance video streams is presented in this paper. The framework consists of a learning block in which a Dirichlet process mixture model is employed for incremental clustering of trajectories. Stochastic trajectory pattern models are formed using Gaussian process regression of the corresponding flow functions. Moreover, a sequential Monte Carlo method based on a Rao-Blackwellized particle filter is proposed for tracking and online classification, as well as abnormality detection during the observation of an object. Experimental results on real surveillance video data are provided to show the performance of the proposed algorithm on the tasks of trajectory clustering, classification, and abnormality detection.
DOI: http://dx.doi.org/10.1109/TIP.2016.2540813
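One ingredient of the approach described above can be sketched in a few lines: Gaussian process regression of a trajectory "flow" function (position to displacement), with an observation flagged as abnormal when it falls far outside the GP's predictive distribution. The kernel choice, noise level, threshold, and toy data below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def rbf_kernel(X, Y, length=1.0, var=1.0):
    """Squared-exponential kernel between the row vectors of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

class GPFlow:
    """GP regression from positions to displacements (one pattern model)."""
    def __init__(self, X, dX, noise=0.05):
        self.X, self.dX, self.noise = X, dX, noise
        K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
        self.K_inv = np.linalg.inv(K)

    def predict(self, x):
        k = rbf_kernel(x[None, :], self.X)          # (1, N)
        mean = (k @ self.K_inv @ self.dX)[0]        # predictive mean displacement
        var = (rbf_kernel(x[None, :], x[None, :])
               - k @ self.K_inv @ k.T)[0, 0] + self.noise**2
        return mean, var

    def is_abnormal(self, x, dx, n_sigma=3.0):
        """Flag if the observed displacement deviates > n_sigma std devs."""
        mean, var = self.predict(x)
        return np.linalg.norm(dx - mean) > n_sigma * np.sqrt(var)

# Toy learned pattern: objects moving right along the line y = 0.
X = np.linspace(0, 10, 20)[:, None]
X = np.hstack([X, np.zeros_like(X)])              # positions (x, 0)
dX = np.tile([1.0, 0.0], (20, 1))                 # unit rightward flow
gp = GPFlow(X, dX)

print(gp.is_abnormal(np.array([5.0, 0.0]), np.array([1.0, 0.0])))   # consistent motion
print(gp.is_abnormal(np.array([5.0, 0.0]), np.array([-1.0, 0.0])))  # reversed motion
```

In the full framework such a pattern model would be one mixture component, with the Dirichlet process prior deciding when an observed trajectory warrants a new component rather than assignment to an existing one.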
Sci Rep
January 2025
Chubu Institute for Advanced Studies, Chubu University, Kasugai, Aichi, Japan.
Event-based surveillance is crucial for the early detection of, and rapid response to, potential public health risks. In recent years, social networking services (SNS) have been recognized for their potential role in this domain. Previous studies have demonstrated the capacity of SNS posts to support early detection of health crises and identification of affected individuals, including cases related to infectious diseases.
Sci Rep
January 2025
Department of Electrical and Computer Engineering, Hawassa University, Hawassa 05, Ethiopia.
Understanding human behavior and recognizing human actions are both essential components of effective surveillance video analysis for guaranteeing public safety. However, existing approaches such as three-dimensional convolutional neural networks (3D CNNs) and two-stream neural networks (2SNNs) face computational hurdles due to the heavy parameterization they require. In this paper, we offer HARNet, a specialized lightweight residual 3D CNN built on directed acyclic graphs and created expressly to address these issues and achieve effective human action detection.
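The residual design mentioned above, in which a block's input is added back to its convolved output, can be sketched for 3D (spatiotemporal) feature maps. The naive single-channel convolution and shapes below are a minimal illustration of the residual idea, not HARNet's actual architecture.

```python
import numpy as np

def conv3d_same(x, w):
    """Naive 'same'-padded 3D convolution of one single-channel volume."""
    kd, kh, kw = w.shape
    xp = np.pad(x, [(k // 2, k // 2) for k in (kd, kh, kw)])
    out = np.zeros(x.shape)
    D, H, W = x.shape
    for d in range(D):
        for h in range(H):
            for c in range(W):
                out[d, h, c] = np.sum(xp[d:d + kd, h:h + kh, c:c + kw] * w)
    return out

def residual_block(x, w):
    """y = ReLU(conv(x) + x): the skip connection eases gradient flow."""
    return np.maximum(conv3d_same(x, w) + x, 0.0)

x = np.random.default_rng(0).standard_normal((4, 8, 8))  # (frames, H, W)
w = np.full((3, 3, 3), 1.0 / 27)                         # averaging kernel
y = residual_block(x, w)
print(y.shape)  # (4, 8, 8)
```

The skip connection means each block only has to learn a residual correction to its input, which is what makes deep stacks of such blocks trainable with comparatively few parameters.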
J Med Internet Res
January 2025
Cancer Rehabilitation and Survivorship, Department of Supportive Care, Princess Margaret Cancer Centre, Toronto, ON, Canada.
Background: Virtual follow-up (VFU) has the potential to enhance cancer survivorship care. However, a greater understanding is needed of how VFU can be optimized.
Objective: This study aims to examine how, for whom, and in what contexts VFU works for cancer survivorship care.
PLoS One
January 2025
Faculty of Biology, School of Health Sciences, Medicine & Health, University of Manchester, Manchester, United Kingdom.
Background: Despite the comparatively high prevalence of possible sarcopenia among young-old adults in the community, there is currently no effective social media-based intervention available to raise awareness and change behavior in this population to prevent sarcopenia. Using co-design methodology, we developed a multicomponent health-education and exercise intervention for sarcopenia prevention on the TikTok platform.
Objectives: The primary purpose of this study is to examine the feasibility and acceptability of the social media-based intervention to enhance muscle function in community-dwelling young-old adults with possible sarcopenia.
Alzheimers Dement
December 2024
Florida Alzheimer's Disease Research Center, Gainesville, FL, USA.
Background: Before the COVID-19 pandemic, the use of videoconference platforms for neuropsychological assessment was uncommon among mental health practitioners. However, due to lockdowns and quarantines worldwide, mental health professionals had to find a feasible alternative and shift to virtual evaluations. This increased the use of teleneuropsychology at both the clinical and research levels.