Reading comprehension, a cognitive ability fundamental to knowledge acquisition, is a complex skill, and a substantial number of learners lack proficiency in it. This study introduces a new Brain-Computer Interface (BCI) task: predicting the relevance of the words or tokens a person reads to target inference words. We use state-of-the-art Large Language Models (LLMs) to guide the training of a new reading embedding representation. This representation, which integrates EEG and eye-tracking biomarkers through an attention-based transformer encoder, achieved a mean 5-fold cross-validation accuracy of 68.7% across nine subjects on a balanced sample, with the highest single-subject accuracy reaching 71.2%. This study is the first to integrate LLMs, EEG, and eye-tracking for predicting human reading comprehension at the word level. We also fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model for word embedding, without any information about the reading tasks; despite this absence of task-specific detail, the model attains an accuracy of 92.7%, thereby validating our findings from the LLMs. This work is a preliminary step toward developing tools that assist reading. The code and data are available on GitHub.
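
To make the fusion architecture described in the abstract concrete, the minimal sketch below (not the authors' released code) shows how word-aligned EEG and eye-tracking features could be combined through an attention-based transformer encoder and classified for relevance to a target inference word. The feature dimensions, layer sizes, and binary relevance target here are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch (assumptions, not the authors' implementation): an
# attention-based transformer encoder that fuses per-word EEG and
# eye-tracking features and predicts each word's relevance to a target
# inference word. Feature dimensions and layer sizes are placeholders.
import torch
import torch.nn as nn


class ReadingRelevanceModel(nn.Module):
    def __init__(self, d_eeg=104, d_gaze=12, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Project each modality into half of a shared embedding space.
        self.eeg_proj = nn.Linear(d_eeg, d_model // 2)
        self.gaze_proj = nn.Linear(d_gaze, d_model // 2)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=256, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Per-word binary head: relevant vs. not relevant to the inference word.
        self.classifier = nn.Linear(d_model, 2)

    def forward(self, eeg, gaze):
        # eeg:  (batch, n_words, d_eeg)   word-aligned EEG features
        # gaze: (batch, n_words, d_gaze)  word-aligned eye-tracking features
        x = torch.cat([self.eeg_proj(eeg), self.gaze_proj(gaze)], dim=-1)
        x = self.encoder(x)        # self-attention across the words in a sentence
        return self.classifier(x)  # (batch, n_words, 2) relevance logits


# Example forward pass with random tensors standing in for real recordings.
model = ReadingRelevanceModel()
logits = model(torch.randn(8, 20, 104), torch.randn(8, 20, 12))
print(logits.shape)  # torch.Size([8, 20, 2])
```

In the paper's pipeline the relevance labels supervising such a classifier are guided by LLMs, and a fine-tuned BERT word embedding serves as a text-only comparison; this sketch omits both and only illustrates the biomarker-fusion encoder.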


Source: http://dx.doi.org/10.1109/EMBC53108.2024.10781627

Publication Analysis

Top Keywords: eeg eye-tracking (12); word embedding (8); reading embedding (8); large language (8); reading comprehension (8); reading (6); embedding reading (4); embedding large (4); language model (4); model eeg (4)

Similar Publications

Face masks became a part of everyday life during the SARS-CoV-2 pandemic. Previous studies showed that face cognition involves holistic face processing and that the absence of face features can lower recognition ability. This is at odds with experience during the pandemic, when people could correctly recognize faces even though masks covered part of the face.


Anterior-posterior interactions in the alpha band (8-12 Hz) have been implicated in a variety of functions including perception, attention, and working memory. The underlying neural communication can be flexibly controlled by adjusting phase relations when activities across anterior-posterior regions oscillate at a matched frequency. We thus investigated how alpha oscillation frequencies spontaneously converged along anterior-posterior regions by tracking oscillatory EEG activity while participants rested.


While fully autonomous vehicles are expected to radically change the way we live our daily lives, they are not yet available in most parts of the world, so we have only sporadic results on passenger reactions. Furthermore, we have very limited insight into how passengers react to an unexpected event during the ride. Previous physiological research has shown that passengers experience lower levels of anxiety in a human-driven condition than in a self-driving condition.


This EEG and eye-tracking study investigated affective influences on cognitive preparation using a precued pro-/antisaccade task with emotional faces as cues. Negative information interfered with preparatory processes under high, but not low, executive function load.


Effective emotion recognition is vital for human interaction and has an impact on several fields such as psychology, social sciences, human-computer interaction, and emotional artificial intelligence. This study centers on the innovative contribution of a novel Myanmar emotion dataset to enhance emotion recognition technology in diverse cultural contexts. Our unique dataset is derived from a carefully designed emotion elicitation paradigm, using 15 video clips per session for three emotions (positive, neutral, and negative), with five clips per emotion.

