Cued Speech (CS) is a communication system that uses manual gestures to facilitate lipreading. In this study, we investigated how CS information interacts with natural speech using Event-Related Potential (ERP) analyses in French-speaking, typically hearing adults (TH) who were either naïve or experienced CS producers. The audiovisual (AV) presentation of lipreading information elicited an amplitude attenuation of the entire N1 and P2 complex in both groups, accompanied by N1 latency facilitation in the group of CS producers. Adding CS gestures to lipread information increased the magnitude of effects observed at the N1 time window, but did not enhance P2 amplitude attenuation. Interestingly, presenting CS gestures without lipreading information yielded distinct response patterns depending on participants' experience with the system. In the group of CS producers, AV perception of CS gestures facilitated the early stage of speech processing, while in the group of naïve participants, it elicited a latency delay at the P2 time window. These results suggest that, for experienced CS users, the perception of gestures facilitates early stages of speech processing, but when people are not familiar with the system, the perception of gestures impacts the efficiency of phonological decoding.
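The ERP measures behind these results (peak amplitude and peak latency of the N1 and P2 components) can be illustrated with a minimal sketch. The snippet below is not the authors' analysis pipeline: it uses a synthetic averaged waveform, and the electrode choice, sampling rate, and time windows are assumptions chosen only to show how such peaks are typically quantified.

```python
import numpy as np

def peak_in_window(times, waveform, tmin, tmax, polarity):
    """Return (latency_s, amplitude) of the most extreme sample of the requested
    polarity ('neg' for N1-like troughs, 'pos' for P2-like peaks) in [tmin, tmax]."""
    mask = (times >= tmin) & (times <= tmax)
    window = waveform[mask]
    idx = window.argmin() if polarity == "neg" else window.argmax()
    return times[mask][idx], window[idx]

# Synthetic averaged ERP (in microvolts) sampled at 500 Hz; a real analysis would
# use condition-averaged EEG from, e.g., a fronto-central electrode.
sfreq = 500.0
times = np.arange(-0.1, 0.5, 1.0 / sfreq)                  # -100 ms to +500 ms
erp = (-4.0 * np.exp(-((times - 0.10) / 0.02) ** 2)        # N1-like trough near 100 ms
       + 3.0 * np.exp(-((times - 0.20) / 0.03) ** 2))      # P2-like peak near 200 ms

n1_lat, n1_amp = peak_in_window(times, erp, 0.08, 0.15, "neg")
p2_lat, p2_amp = peak_in_window(times, erp, 0.15, 0.28, "pos")
print(f"N1: {n1_amp:.2f} uV at {n1_lat * 1e3:.0f} ms")
print(f"P2: {p2_amp:.2f} uV at {p2_lat * 1e3:.0f} ms")
```

Amplitude attenuation and latency facilitation of the kind reported above would then correspond to smaller peak amplitudes and earlier peak latencies in one condition or group relative to another.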


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10377728
DOI: http://dx.doi.org/10.3390/brainsci13071036

Publication Analysis

Top Keywords
perception gestures (12); typically hearing (8); naïve experienced (8); experienced producers (8); amplitude attenuation (8); group producers (8); time window (8); speech processing (8); gestures (6); cued-speech perception (4)

Similar Publications

Ever since de Saussure [Course in General Linguistics (Columbia University Press, 1916)], theorists of language have assumed that the relation between the form and meaning of words is arbitrary. Recently, however, a body of empirical research has established that language is embodied and contains iconicity. Sound symbolism, an intrinsic link that language users perceive between word sounds and properties of referents, is a representative example of iconicity in language and has offered profound insights into theories of language pertaining to language processing, acquisition, and evolution.


Research on Multimodal Control Method for Prosthetic Hands Based on Visuo-Tactile and Arm Motion Measurement.

Biomimetics (Basel)

December 2024

Institute of Instrument Science and Engineering, Southeast University, Nanjing 210096, China.

Article Synopsis
  • The research focuses on enhancing robotic hand function to assist disabled individuals, leveraging advanced multimodal perception and control methods.
  • Key techniques include using a pinhole camera and YOLOv8 for object recognition, along with multi-frame data and clustering algorithms to ensure accurate grasping by the robotic hand (a rough sketch of this detect-and-cluster step follows below).
  • The resulting system achieves a high grasping success rate of 91.63% while maintaining user comfort, demonstrating its effectiveness and potential for real-world application.
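As a rough illustration of the detect-and-cluster idea summarized in this synopsis (not the authors' implementation), the sketch below runs YOLOv8 over several frames via the ultralytics Python API and clusters the detected box centres with DBSCAN to obtain a stable grasp target. The weights file, frame paths, confidence threshold, and clustering parameters are all assumptions.

```python
# Illustrative sketch only: "yolov8n.pt", "frames/*.jpg", the confidence threshold,
# and the DBSCAN parameters are assumptions, not values from the paper.
import glob

import numpy as np
from sklearn.cluster import DBSCAN
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                              # pretrained YOLOv8 detector

centres = []
for path in sorted(glob.glob("frames/*.jpg")):          # multi-frame input
    for result in model(path, verbose=False):
        boxes = result.boxes.xyxy.cpu().numpy()         # (N, 4) boxes: x1, y1, x2, y2
        confs = result.boxes.conf.cpu().numpy()
        for (x1, y1, x2, y2), conf in zip(boxes, confs):
            if conf >= 0.5:                             # keep confident detections only
                centres.append([(x1 + x2) / 2.0, (y1 + y2) / 2.0])

if centres:
    # Cluster detections accumulated across frames; the densest cluster's mean
    # centre serves as a stable grasp target for the hand controller.
    pts = np.asarray(centres)
    labels = DBSCAN(eps=25.0, min_samples=3).fit_predict(pts)
    kept = labels[labels >= 0]
    if kept.size:
        best = np.bincount(kept).argmax()
        target = pts[labels == best].mean(axis=0)
        print("grasp target (pixel coordinates):", target)
```

Aggregating detections over multiple frames in this way is one simple means of suppressing spurious single-frame detections before issuing a grasp command.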

Evidence for the dependence of visual and kinesthetic motor imagery on isolated visual and motor practice.

Conscious Cogn

January 2025

School of Kinesiology, University of British Columbia, 210-6081 University Boulevard, Vancouver, BC V6T 1Z1, Canada. Electronic address:

Motor imagery (MI) is a cognitive process believed to rely on representations developed through experience. The equivalence between MI and execution has been questioned, and the relationship between types of experience and MI is unclear. We tested how observational and physical practice of hand gesture sequences affected visual and kinesthetic MI and transfer to the unpracticed effector.


Background: Virtual reality (VR) is typically used for entertainment or gaming, but many studies have shown that the applications of VR can also extend to medical and clinical education. This is because VR can help health professionals learn complex subjects, improve memory, and increase interest in abstract concepts. In the context of medical education, the immersive nature of a VR setting allows students and clinicians in training to interact with virtual patients and anatomical structures in a three-dimensional environment or from a clinician's point of view.


Mechano-Graded Contact-Electrification Interfaces Based Artificial Mechanoreceptors for Robotic Adaptive Reception.

ACS Nano

December 2024

Institute of Functional Nano and Soft Materials (FUNSOM), Joint International Research Laboratory of Carbon-Based Functional Materials and Devices, Soochow University, Suzhou 215123, P. R. China.

Triboelectrification-based artificial mechanoreceptors (TBAMs) are able to convert mechanical stimuli directly into electrical signals, enabling self-adaptive protection and human-machine interaction for robots. However, traditional contact-electrification interfaces are prone to reaching their deformation limits under large pressures, resulting in a relatively narrow linear range. In this work, we fabricated mechano-graded microstructures to modulate the strain behavior of contact-electrification interfaces, simultaneously endowing the TBAMs with high sensitivity and a wide linear detection range.

