Hand gesture classification is central to any sign language recognition (SLR) system, which is expected to assist individuals with hearing and speech impairments. American Sign Language (ASL) comprises static and dynamic gestures representing alphabets, phrases, and words. An ASL recognition system allows communication to be digitized and used effectively within and outside the hearing-impaired community. Developing an ASL recognition system has been a challenge because some of the involved hand gestures closely resemble each other, so classifying them demands highly discriminative features. SLR through surface electromyography (sEMG) signals is computationally intensive to process, while using inertial measurement units (IMUs) or flex sensors for SLR occupies too much space on the user's hand. Video-based recognition systems restrict users by requiring them to make gestures or motions within the camera's field of view. A novel, precision-preserving static gesture classification system is proposed to fill this gap. The paper proposes a magnetometer-array-enabled static hand gesture classification system that achieves an average accuracy of 98.60% for classifying alphabets and 94.07% for digits using a k-nearest neighbors (KNN) classifier. The magnetometer-array-based wearable is designed to minimize the electronics coverage around the hand and yet deliver robust classification results useful for ASL recognition. The paper discusses the design of the proposed SLR system and also examines optimizations that can reduce its cost.
Clinical relevance - The proposed novel magnetometer-array-based wearable system is cost-effective and works well across different hand sizes. It occupies a negligible amount of space on the user's hand and thus does not interfere with the user's everyday tasks. It is reliable, robust, and error-free, enabling easy adoption toward building an ASL recognition system.
DOI: http://dx.doi.org/10.1109/EMBC40787.2023.10340708
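As a rough illustration of the classification stage described in the abstract above, the minimal sketch below trains a k-nearest neighbors model on flattened magnetometer-array readings. The sensor count, axis layout, value of k, and the synthetic placeholder data are assumptions for illustration only, not the paper's actual configuration.

```python
# Minimal sketch (not the authors' code): classifying static hand gestures
# from a magnetometer array with k-nearest neighbours. Sensor count, axis
# layout, k, and the placeholder data are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

N_SENSORS = 5    # assumed number of magnetometers on the wearable
N_AXES = 3       # x, y, z field components per sensor
N_CLASSES = 26   # ASL alphabet gestures

rng = np.random.default_rng(0)
# Placeholder data: each sample is the flattened field readings of the array.
X = rng.normal(size=(2600, N_SENSORS * N_AXES))
y = rng.integers(0, N_CLASSES, size=2600)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Scale each feature, then apply a small-k KNN, mirroring the abstract's classifier choice.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```

With real recordings, the synthetic arrays would be replaced by per-gesture magnetometer samples; the pipeline itself stays the same.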
Sci Rep
January 2025
University Institute of Computing, Chandigarh University, Punjab, India.
Automatic sign language recognition (ASLR) systems offer seamless communication between hearing-impaired and normal-hearing individuals, enhancing educational opportunities for the hearing impaired. However, such systems struggle with the "curse of dimensionality": an excessive number of features prolongs training time and imposes an exhaustive computational demand. This paper proposes a technique that integrates machine learning and swarm intelligence to address this issue effectively.
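To make the idea concrete, here is a hedged sketch of one common way to pair a classifier with swarm intelligence for feature selection: a small binary particle swarm optimizer wrapped around a KNN model. The swarm size, inertia and acceleration coefficients, penalty weight, and synthetic dataset are all illustrative assumptions, not the authors' method.

```python
# Hedged sketch: binary particle swarm optimization (BPSO) as a feature-selection
# wrapper around a classifier, one common ML + swarm-intelligence pairing.
# All hyperparameters and the synthetic data are assumptions for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=400, n_features=60, n_informative=10, random_state=1)

def fitness(mask):
    """Cross-validated accuracy of the selected features, lightly penalizing their count."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.01 * mask.mean()  # small pressure toward fewer features

n_particles, n_iters, n_feat = 15, 20, X.shape[1]
pos = rng.integers(0, 2, size=(n_particles, n_feat)).astype(float)   # feature masks
vel = rng.normal(scale=0.1, size=(n_particles, n_feat))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_feat))
    # Standard velocity update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    # Binary position update: sample each bit from a sigmoid of its velocity.
    pos = (rng.random((n_particles, n_feat)) < 1.0 / (1.0 + np.exp(-vel))).astype(float)
    fits = np.array([fitness(p) for p in pos])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print(f"selected {int(gbest.sum())}/{n_feat} features, best fitness {pbest_fit.max():.3f}")
```

Each particle encodes a binary feature mask; the fitness rewards cross-validated accuracy while lightly penalizing the number of retained features, which is how such wrappers shrink the feature space and ease the training burden.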
Radiol Case Rep
February 2025
Department of Radiology, L. Curto Hospital, ASL Salerno, Via Luigi Curto, 84035, Polla, Salerno, Italy.
Sinusitis is a common condition that can lead to various neurological complications due to the spread of infection to the intracranial and orbital regions. Fortunately, the availability of antibiotics has significantly improved the prognosis of sinusitis-associated intracranial complications. As a result, the overall incidence of neurological complications arising from sinusitis remains low.
Physiol Rep
December 2024
Department of Anesthesiology, Tianjin Medical University General Hospital, Tianjin, China.
Sleep fragmentation (SF) is increasingly recognized as a contributing factor to postoperative cognitive dysfunction (POCD). Given the critical roles of somatostatin (SST) interneurons, associated gamma-aminobutyric acid (GABA)ergic neurotransmitters, and hippocampal perfusion in sleep-related cognition, this study examined how preoperative SF alters these mechanisms in POCD induced by anesthesia/surgery in aged male mice. The Morris water maze (MWM), novel object recognition (NOR), and Y-maze tests were used to evaluate POCD.
Cureus
December 2024
Emergency Medicine, University of Arizona College of Medicine - Tucson, Tucson, USA.
Background The use of automatic external defibrillators (AEDs) by lay rescuers can reduce the time to defibrillation and improve survival in out-of-hospital cardiac arrest (OHCA). AEDs use voice prompts to guide users through the defibrillation process, creating a potential barrier for deaf and hard-of-hearing (HoH) individuals. The objective of this study is to assess familiarity with and concerns regarding AED use among members of these communities.
Brain Sci
November 2024
Association School of Cognitive Psychology (APC-SPC), Viale Castro Pretorio 116, 00185 Rome, Italy.