Objective: We introduce a novel, phase-based, functional connectivity descriptor that encapsulates not only the synchronization strength between distinct brain regions, but also the time-lag between the involved neural oscillations. The new estimator employs complex-valued measurements and results in a brain network sketch that lives on the smooth manifold of Hermitian Positive Definite (HPD) matrices.
Approach: Leveraging the HPD property of the proposed descriptor, and to overcome the high dimensionality that typically characterizes connectivity patterns derived from multisite encephalographic recordings, we adapt a recently introduced dimensionality reduction methodology based on Riemannian geometry, which discriminatively detects the recording sites that best reflect the differences in network organization between contrasting recording conditions.
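To make the construction concrete, the following is a minimal sketch of how a complex-valued, phase-based coupling matrix of this kind can be formed from instantaneous phases obtained via the Hilbert transform. The function name phase_coupling_hpd, the ridge regularization, and the particular estimator are illustrative assumptions, not the exact descriptor proposed in the article; the point is that the modulus of each off-diagonal entry reflects synchronization strength (phase-locking value), while its argument carries the mean phase difference, i.e., the time-lag information.

import numpy as np
from scipy.signal import hilbert

def phase_coupling_hpd(X, ridge=1e-6):
    # X: (n_channels, n_samples) array of band-limited signals.
    # Instantaneous phases from the analytic signal (Hilbert transform).
    Z = np.exp(1j * np.angle(hilbert(X, axis=1)))
    # Gram matrix of the unit-modulus phase signals: Hermitian and positive
    # semi-definite; |M[j, k]| is the phase-locking value and angle(M[j, k])
    # the mean phase difference (time-lag) between channels j and k.
    M = (Z @ Z.conj().T) / X.shape[1]
    # A small ridge guarantees strict positive definiteness.
    return M + ridge * np.eye(X.shape[0])

# Example: 8-channel surrogate recording.
rng = np.random.default_rng(0)
C = phase_coupling_hpd(rng.standard_normal((8, 1000)))
print(np.allclose(C, C.conj().T), np.linalg.eigvalsh(C).min() > 0)

Because the matrix is a Gram matrix of unit-modulus signals, it lands on the HPD manifold by construction, which is what allows Riemannian-geometric tools to be applied downstream.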
Objective: Spatial covariance matrices are extensively employed as brain activity descriptors in brain-computer interface (BCI) research and typically involve the whole sensor array. Here, we introduce a methodological framework for delineating the subset of sensors whose covariance structure offers a reduced, yet more powerful, representation of the brain's coordination patterns, ultimately leading to reliable mind reading.
Methods: Adopting a Riemannian geometry approach, we cast the problem of sensor selection as the maximization of a functional that is computed over the manifold of symmetric positive definite (SPD) matrices and encapsulates class separability in a way that facilitates the search among subsets of different sizes.
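As an illustration of how such a functional might be evaluated for a candidate sensor subset, the sketch below scores a subset by the affine-invariant Riemannian distance between the class-mean covariance submatrices of two contrasting conditions. The function names and the exact form of the score are assumptions made for illustration; the actual criterion used in the study may differ (for instance, it may also account for within-class scatter).

import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def airm_distance(A, B):
    # Affine-invariant Riemannian distance between SPD matrices A and B.
    A_isqrt = fractional_matrix_power(A, -0.5)
    return np.linalg.norm(logm(A_isqrt @ B @ A_isqrt), 'fro')

def subset_score(trials_a, trials_b, subset, ridge=1e-6):
    # trials_a, trials_b: lists of (n_channels, n_samples) trials for the
    # two contrasting conditions; subset: indices of the candidate sensors.
    def mean_cov(trials):
        covs = [np.cov(t[subset]) + ridge * np.eye(len(subset)) for t in trials]
        return np.mean(covs, axis=0)
    # Larger distance between class-mean covariances -> better separability.
    return airm_distance(mean_cov(trials_a), mean_cov(trials_b))

# Example: rank random 4-sensor subsets of a 16-sensor montage by the score.
rng = np.random.default_rng(1)
A = [rng.standard_normal((16, 200)) for _ in range(20)]
B = [rng.standard_normal((16, 200)) * 1.5 for _ in range(20)]
candidates = [rng.choice(16, size=4, replace=False) for _ in range(10)]
best = max(candidates, key=lambda s: subset_score(A, B, s))

Computing the score directly on covariance submatrices is what makes it straightforward to compare candidate subsets of different sizes during the search.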
Gaze-based keyboards offer a flexible means of human-computer interaction for both disabled and able-bodied people. Despite their convenience, they remain error-prone: eye-tracking devices may misinterpret the user's gaze, resulting in typing errors, especially when operated in fast mode.
We present a dataset that combines multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. It includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired).