Real-time automated detection of older adults' hand gestures in home and clinical settings.

Neural Comput Appl

Wicking Dementia Research and Education Centre, University of Tasmania, Hobart, TAS 7000 Australia.

Published: December 2022

Unlabelled: There is an urgent need, accelerated by the COVID-19 pandemic, for methods that allow clinicians and neuroscientists to evaluate hand movements remotely. This would help detect and monitor degenerative brain disorders that are particularly prevalent in older adults. With the wide accessibility of computer cameras, a vision-based real-time hand gesture detection method would facilitate online assessments in home and clinical settings. However, motion blur is one of the most challenging problems when collecting data on fast-moving hands. The objective of this study was to develop a computer vision-based method that accurately detects older adults' hand gestures using video data collected in real-life settings. We invited adults over 50 years old to complete validated hand movement tests (fast finger tapping and hand opening-closing) at home or in clinic. Data were collected without researcher supervision via a website programme using standard laptop and desktop cameras. We processed and labelled the images, split the data into training, validation and testing sets, and then analysed how well different network structures detected hand gestures. We recruited 1,900 adults (age range 50-90 years) as part of the TAS Test project and developed UTAS7k, a new dataset of 7071 hand gesture images, split 4:1 into clear and motion-blurred images. Our new network, RGRNet, achieved 0.782 mean average precision (mAP) on clear images, outperforming the state-of-the-art network structure (YOLOv5-P6, mAP 0.776), and achieved mAP 0.771 on blurred images. RGRNet, a robust real-time automated network that detects static gestures from a single camera, and UTAS7k, a new database comprising the largest range of individual hands, both show strong potential for medical and research applications.
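The evaluation pipeline described above rests on two standard components: partitioning the labelled images into training, validation and test sets, and scoring detections by mean average precision (mAP), whose building block is the intersection-over-union (IoU) of predicted and ground-truth boxes. A minimal Python sketch of both follows; the split ratios, seed and box format are illustrative assumptions, not the paper's exact protocol:

```python
import random

def split_dataset(items, train_frac=0.7, val_frac=0.15, seed=42):
    """Deterministically shuffle and partition items into train/val/test.
    The 0.7/0.15/0.15 ratios are illustrative, not the paper's protocol."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n_train = int(len(items) * train_frac)
    n_val = int(len(items) * val_frac)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = ((box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
             + (box_b[2] - box_b[0]) * (box_b[3] - box_b[1]) - inter)
    return inter / union if union else 0.0

# 7071 image IDs, matching the UTAS7k dataset size reported above.
train_ids, val_ids, test_ids = split_dataset(range(7071))
```

In detection benchmarks, a prediction typically counts as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5; averaging precision over recall levels and then over classes yields the reported mAP.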

Supplementary Information: The online version contains supplementary material available at 10.1007/s00521-022-08090-8.


Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9741488 (PMC)
http://dx.doi.org/10.1007/s00521-022-08090-8 (DOI Listing)


Similar Publications

Hand gestures provide an alternate interaction modality for blind users and can be supported using commodity smartwatches without requiring specialized sensors. The enabling technology is an accurate gesture recognition algorithm, but almost all algorithms are designed for sighted users. Our study shows that blind users' gestures are considerably different from those of sighted users, rendering current recognition algorithms unsuitable.

View Article and Find Full Text PDF

The role of the left primary motor cortex in apraxia.

Neurol Res Pract

January 2025

Department of Neurology, Faculty of Medicine and University Hospital Cologne, University of Cologne, Kerpener Str. 62, 50937, Cologne, Germany.

Background: Apraxia is a motor-cognitive disorder that primary sensorimotor deficits cannot solely explain. Previous research in stroke patients has focused on damage to the fronto-parietal praxis networks in the left hemisphere (LH) as the cause of apraxic deficits. In contrast, the potential role of the (left) primary motor cortex (M1) has largely been neglected.


Gestural production, a crucial aspect of nonverbal communication, plays a key role in the development of verbal and socio-communicative skills. Delays in gestural development often impede verbal acquisition and social interaction in children with Autism Spectrum Disorder (ASD). Although various interventions for ASD focus on improving socio-communicative abilities, they consistently highlight the importance of integrating gestures to support overall communication development.


Liquid-Metal-Based Multichannel Strain Sensor for Sign Language Gesture Classification Using Machine Learning.

ACS Appl Mater Interfaces

January 2025

Centre for Robotics and Automation, Department of Biomedical Engineering, City University of Hong Kong, Hong Kong 999077, China.

Liquid metals are as highly conductive as metallic materials and have excellent deformability due to their liquid state, making them promising for flexible and stretchable wearable sensors. However, patterning liquid metals on soft substrates has been a challenge due to their high surface tension. In this paper, a new method is proposed to overcome the difficulties in fabricating liquid-state strain sensors.


Hand movements frequently occur with speech. The extent to which the memories that guide co-speech hand movements are tied to the speech they occur with is unclear. Here, we paired the acquisition of a new hand movement with speech.

