Purpose: Object naming requires visual decoding, conceptualization, semantic categorization, and phonological encoding, all within 400 to 600 ms of stimulus presentation and before a word is spoken. In this study, we sought to predict the semantic categories of naming responses from prearticulatory brain activity recorded with scalp EEG in healthy individuals.

Methods: We assessed 19 healthy individuals who completed a naming task during EEG recording. The task comprised 120 drawings: animate objects, inanimate objects, and abstract drawings. We applied a one-dimensional, two-layer neural network to predict the semantic category of each naming response from prearticulatory brain activity.
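
As a concrete illustration of such an architecture, the following is a minimal sketch in Python (PyTorch). The layer types (a 1-D convolution over time followed by a dense read-out), the electrode count, epoch length, and all hyperparameters are assumptions for demonstration only; the paper states just that the network was one-dimensional with two layers.

    # Hypothetical sketch of a one-dimensional, two-layer classifier for
    # prearticulatory EEG epochs; layer types and sizes are assumptions,
    # not taken from the paper.
    import torch
    import torch.nn as nn

    N_CHANNELS = 64   # assumed electrode count
    N_SAMPLES = 300   # assumed samples per epoch (e.g., 0-600 ms at 500 Hz)
    N_CLASSES = 3     # animate, inanimate, abstract

    class NamingClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            # Layer 1: 1-D convolution over time, mixing all electrodes.
            self.conv = nn.Conv1d(N_CHANNELS, 16, kernel_size=25, stride=5)
            # Layer 2: dense read-out onto the three semantic categories.
            self.fc = nn.Linear(16 * ((N_SAMPLES - 25) // 5 + 1), N_CLASSES)

        def forward(self, x):              # x: (batch, channels, samples)
            h = torch.relu(self.conv(x))
            return self.fc(h.flatten(1))   # class logits

    model = NamingClassifier()
    logits = model(torch.randn(8, N_CHANNELS, N_SAMPLES))  # 8 example epochs
    print(logits.shape)                    # torch.Size([8, 3])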

Results: Classification of animate, inanimate, and abstract responses achieved an average accuracy of 80%, sensitivity of 72%, and specificity of 87% across participants. Time points with the highest average weights fell between 470 and 490 ms after stimulus presentation, and the electrodes with the highest weights were located over the left and right frontal brain areas.
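
For clarity on the reported metrics: in a three-class setting, sensitivity and specificity are typically computed one-vs-rest from a confusion matrix and then averaged. The Python/NumPy sketch below illustrates that computation with made-up counts; whether the authors averaged one-vs-rest values in exactly this way is an assumption, as the abstract reports only the final percentages.

    # Illustrative one-vs-rest sensitivity/specificity from a 3-class
    # confusion matrix (rows = true class, columns = predicted class).
    # The counts below are invented for demonstration only.
    import numpy as np

    cm = np.array([[30,  5,  5],   # animate
                   [ 4, 32,  4],   # inanimate
                   [ 6,  3, 31]])  # abstract

    total = cm.sum()
    for k, label in enumerate(["animate", "inanimate", "abstract"]):
        tp = cm[k, k]                  # true positives for class k
        fn = cm[k].sum() - tp          # missed members of class k
        fp = cm[:, k].sum() - tp       # other classes predicted as k
        tn = total - tp - fn - fp
        print(f"{label}: sensitivity={tp / (tp + fn):.2f}, "
              f"specificity={tn / (tn + fp):.2f}")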

Conclusions: Scalp EEG can be used successfully to predict naming responses from prearticulatory brain activity. Interparticipant variability in feature weights suggests that individualized models are necessary for the highest accuracy. Our findings may inform future applications of EEG in reconstructing speech for individuals with and without speech impairments.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10628367
DOI: http://dx.doi.org/10.1097/WNP.0000000000000933
