Advancements in sign language processing technology hinge on the availability of extensive, reliable datasets, comprehensive instructions, and adherence to ethical guidelines. To facilitate progress in gesture recognition and translation systems, and to support the Azerbaijani sign language community, we present the Azerbaijani Sign Language Dataset (AzSLD). The dataset was collected from a diverse group of sign language users and encompasses a range of linguistic parameters. Developed within the framework of a vision-based Azerbaijani Sign Language translation project, AzSLD includes recordings of the fingerspelling alphabet, individual words, and sentences. The data acquisition process involved recording signers across various age groups, genders, and proficiency levels to ensure broad representation. Sign language sentences were captured with two cameras from different angles, providing comprehensive visual coverage of each gesture and enabling robust training and evaluation of gesture recognition algorithms. The dataset comprises 30,000 meticulously annotated videos, each labeled with a precise gesture identifier and the corresponding linguistic translation. To facilitate efficient use of the dataset, we provide technical instructions and source code for a data loader. Researchers and developers working on sign language recognition, translation, and synthesis systems will find AzSLD invaluable, as it offers a rich repository of labeled data for training and evaluation.
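The abstract notes that AzSLD ships with source code for a data loader. As a rough illustration only, a minimal loader might pair each annotated clip with its gesture identifier and linguistic translation and serve the samples in batches. The annotation column names (`video_path`, `gesture_id`, `translation`) and the batching scheme below are assumptions for the sketch, not the released AzSLD format.

```python
import csv
import io
from pathlib import Path
from typing import Iterator

def load_annotations(annotation_csv: io.TextIOBase) -> list[dict]:
    """Parse an annotation table mapping each video clip to its
    gesture identifier and translation (column names are assumed)."""
    reader = csv.DictReader(annotation_csv)
    return [
        {
            "video": Path(row["video_path"]),
            "gesture_id": row["gesture_id"],
            "translation": row["translation"],
        }
        for row in reader
    ]

def batched(samples: list[dict], batch_size: int) -> Iterator[list[dict]]:
    """Yield fixed-size batches of annotated samples; the final
    batch may be smaller if the dataset size is not a multiple."""
    for start in range(0, len(samples), batch_size):
        yield samples[start : start + batch_size]
```

In practice the video frames themselves would be decoded per batch (e.g. with a vision library) and the two camera angles kept as paired streams; the sketch above covers only the annotation side.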
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11730573 | PMC |
| http://dx.doi.org/10.1016/j.dib.2024.111230 | DOI Listing |
Data Brief, February 2025. ADA University, Baku, Azerbaijan.
Data Brief, February 2025. Dynamic Systems, Instrumentation, and Control (SIDICO), Department of Physics, Universidad del Cauca, Colombia.
Sign language is a form of non-verbal communication used by people with hearing disabilities. It relies on signs, gestures, facial expressions, and more. Given that the population with hearing impairments in Colombia is around half a million, a database of dynamic alphanumeric signs and commonly used words was created to support basic conversation.
J Indian Soc Pedod Prev Dent, October 2024. Department of Public Health Dentistry, Narayana Dental College, Nellore, Andhra Pradesh, India.
Background: Literature on the effectiveness of theory-based oral health education on the oral hygiene status of hearing-impaired children is limited.
Aim: To determine the effectiveness of a school oral health education intervention on oral hygiene status and oral health-related knowledge among 5- to 18-year-old children in Andhra Pradesh, India.
Materials And Methods: A cluster randomized clinical trial was conducted among all institutionalized hearing-impaired children and young adults residing in various special care schools in Nellore district.
Clin Linguist Phon, January 2025. BKV, Linköping University, Linköping, Sweden.
Gestures are essential in early language development. We investigate the use of gestures in children with cochlear implants (CIs), with a particular focus on deictic, iconic, and conventional gestures. The aim is to understand how the use of gestures in everyday interactions relates to age, vocabulary testing results, and language development reported by parents.