Background: In electroencephalographic (EEG) or electrocorticographic (ECoG) experiments, visual cues are commonly used for timing synchronization but may inadvertently induce neural activity and cognitive processing, posing challenges when decoding self-initiated tasks.
New Method: To address this concern, we introduced four new visual cues (Fade, Rotation, Reference, and Star) and investigated their impact on brain signals. Our objective was to identify a cue that minimally influences brain activity, enabling cue-effect-free classifier training for asynchronous applications, particularly to aid individuals with severe paralysis.
Results: Twenty-two able-bodied, right-handed participants aged 18-30 performed hand movements upon presentation of the visual cues. Analyses of the temporal variability between movement-onset-aligned and cue-aligned data, of the grand-average movement-related cortical potential (MRCP), and of classification outcomes revealed significant differences among the cues. The Rotation and Reference cues yielded the most favorable results, minimizing temporal variability, preserving MRCP morphology, and achieving classification accuracy comparable to that of self-paced signals.
Comparison With Existing Methods: Our study contrasts with traditional cue-based paradigms by introducing novel visual cues designed to mitigate unintended neural activity. We demonstrate that the Rotation and Reference cues elicit consistent and accurate MRCPs during motor tasks, surpassing previous methods in timing precision and in discriminability for classifier training.
Conclusions: Precise cue timing is crucial for training classifiers; both the Rotation and Reference cues demonstrate minimal variability and high discriminability, highlighting their potential for accurate classification in online scenarios. These findings offer promising avenues for refining brain-computer interface systems, particularly for individuals with motor impairments, by enabling more reliable and intuitive control mechanisms.
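The pipeline summarized above (epoching EEG around cue onset, computing a grand-average MRCP, and training a classifier) can be illustrated with a short sketch. The code below is a minimal illustration using MNE-Python and scikit-learn, not the authors' actual pipeline: the file name, trigger channel, event code, filter band, epoch windows, and class labels are all assumptions for demonstration.

```python
# Minimal sketch (not the authors' pipeline): epoch EEG around cue onset,
# average epochs into an MRCP estimate, and cross-validate a simple
# movement-vs-rest classifier on the epoched data.
import numpy as np
import mne
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # hypothetical file
raw.filter(l_freq=0.1, h_freq=3.0)  # MRCPs are low-frequency potentials

events = mne.find_events(raw, stim_channel="STI 014")  # assumed trigger channel
epochs = mne.Epochs(raw, events, event_id={"cue": 1},  # assumed event code
                    tmin=-2.0, tmax=2.0, baseline=(-2.0, -1.5), preload=True)

# Per-subject average MRCP; a grand average across subjects would be
# mne.grand_average([evoked_s1, evoked_s2, ...]).
evoked = epochs.average()
evoked.plot(picks="Cz")  # MRCPs are typically largest over central electrodes

# Classifier sketch: flatten each epoch into a feature vector and
# cross-validate LDA. The labels here are random placeholders; real labels
# would come from the experimental conditions (movement vs. rest).
X = epochs.get_data().reshape(len(epochs), -1)
y = np.random.default_rng(0).integers(0, 2, len(epochs))  # placeholder labels
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

The alignment choice is the crux of the abstract's argument: training on cue-aligned epochs (tmin/tmax relative to the cue) is only valid if the cue itself contributes little to the recorded signal, which is what the Rotation and Reference cues are argued to achieve.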
DOI: http://dx.doi.org/10.1016/j.jneumeth.2024.110241
J Behav Addict, January 2025. Department of Psychology, Sun Yat-sen University, Guangzhou, China.
Background and Aims: Uncontrollable gaming behavior is a core symptom of Internet Gaming Disorder (IGD). Attentional bias towards game-related cues may contribute to the difficulty of regulating online gaming behavior. However, the context-specific attentional bias and its cognitive mechanisms in individuals with IGD have not been systematically investigated.
J Neurosci, January 2025. Department of Neuroscience and Biomedical Engineering, Aalto University, Espoo FI-00076, Finland.
Our visual system enables us to effortlessly navigate and recognize real-world visual environments. Functional magnetic resonance imaging (fMRI) studies suggest a network of scene-responsive cortical visual areas, but much less is known about the temporal order in which different scene properties are analysed by the human visual system. In this study, we selected a set of 36 full-colour natural scenes varying in spatial structure and semantic content, which our male and female human participants viewed in both 2D and 3D while we recorded magnetoencephalography (MEG) data.
Ear Hear, December 2024. Center for Hearing Research, Boys Town National Research Hospital, Omaha, Nebraska, USA.
Objectives: To investigate the influence of frequency-specific audibility on audiovisual benefit in children, this study examined the impact of high- and low-pass acoustic filtering on auditory-only and audiovisual word and sentence recognition in children with typical hearing. Previous studies show that visual speech provides greater access to consonant place of articulation than to other consonant features, and that low-pass filtering has a strong impact on the perception of acoustic consonant place of articulation. This suggests that visual speech may be particularly useful when acoustic speech is low-pass filtered, because it provides complementary information about consonant place of articulation.
PLoS One, January 2025. Department of Psychology, Tokyo Woman's Christian University, Tokyo, Japan.
We perceive and understand others' emotional states from multisensory information such as facial expressions and vocal cues. However, such cues are not always available or clear. Can partial loss of visual cues affect multisensory emotion perception? In addition, the COVID-19 pandemic has led to the widespread use of face masks, which can reduce some facial cues used in emotion perception.
PLoS One, January 2025. Department of Ophthalmology, Keck School of Medicine, USC Roski Eye Institute, University of Southern California, Los Angeles, California, United States of America.
Failure of central nervous system (CNS) axons to regenerate after injury results in permanent disability. Several molecular neuroprotective and neuroregenerative strategies have been proposed as potential treatments, but these do not provide the directional cues needed for target-specific axon regeneration. Here, we demonstrate that applying an external guidance cue in the form of electric field stimulation to adult rats after optic nerve crush injury was effective at directing long-distance, target-specific retinal ganglion cell (RGC) axon regeneration to native targets in the diencephalon.