When reading a sentence, individual words can be combined to create more complex meaning. In this study, we sought to uncover brain regions that represent the meaning of sentences at the topic level, as opposed to the meaning of their individual constituent words considered irrespective of context. Using fMRI, we recorded the neural activity of participants while they read sentences. We constructed topic-level sentence representations using the final layer of a convolutional neural network (CNN) trained to classify Wikipedia sentences into broad semantic categories. This model was contrasted with word-level sentence representations constructed by averaging the embeddings of the words constituting each sentence. Using representational similarity analysis, we found that the medial prefrontal cortex, lateral anterior temporal lobe, precuneus, and angular gyrus more strongly represent topic-level than word-level sentence meaning, uncovering the important role of these semantic system regions in the representation of topic-level meaning. Results were comparable when sentence meaning was modelled with a multilayer perceptron that was not sensitive to word order within a sentence, suggesting that the learning objective, in terms of the topic being modelled, is the critical factor in capturing these neural representational spaces.
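The analysis pipeline described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes NumPy/SciPy and illustrative variable names, and shows the two ingredients the abstract names: a word-level sentence representation built by averaging word embeddings, and a representational similarity analysis (RSA) score obtained by correlating a model representational dissimilarity matrix (RDM) with a neural RDM.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def word_level_rep(word_vectors):
    """Word-level sentence representation: the mean of the
    embeddings of the sentence's constituent words.
    word_vectors: array of shape (n_words, embedding_dim)."""
    return np.mean(word_vectors, axis=0)


def sentence_rdm(reps):
    """Representational dissimilarity matrix in condensed form:
    pairwise correlation distances between sentence representations.
    reps: array of shape (n_sentences, n_features)."""
    return pdist(reps, metric="correlation")


def rsa_score(model_reps, neural_patterns):
    """RSA: Spearman correlation between the model RDM and the
    neural RDM (both in condensed upper-triangle form)."""
    rho, _ = spearmanr(sentence_rdm(model_reps),
                       sentence_rdm(neural_patterns))
    return rho


# Illustrative usage with simulated data (not real embeddings or fMRI).
rng = np.random.default_rng(0)
word_vecs = rng.standard_normal((5, 300))     # 5 words, 300-d embeddings
sentence = word_level_rep(word_vecs)          # one 300-d sentence vector

model_reps = rng.standard_normal((10, 300))   # 10 sentences, model space
neural = rng.standard_normal((10, 50))        # 10 sentences, 50 voxels
score = rsa_score(model_reps, neural)
```

In the study's logic, a higher `rsa_score` for the topic-level (CNN) representations than for the averaged word embeddings in a given region indicates that the region's activity patterns track topic-level rather than word-level meaning.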

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10184870
DOI: http://dx.doi.org/10.1016/j.neuroimage.2022.119005
