MEG and EEG source analysis is frequently used for the presurgical evaluation of pharmacoresistant epilepsy patients. The source localization of the epileptogenic zone depends, among other aspects, on the selected inverse and forward approaches and their respective parameter choices. In this validation study, we compare the standard dipole scanning method with two beamformer approaches for the inverse problem, and we investigate the influence of the covariance estimation method and the strength of regularization on the localization performance for EEG, MEG, and combined EEG and MEG. For forward modelling, we investigate the difference between calibrated six-compartment and standard three-compartment head modelling. In a retrospective study of two patients with focal epilepsy due to focal cortical dysplasia type IIb who achieved seizure freedom following lesionectomy or radiofrequency-guided thermocoagulation (RFTC), we used the distance from the localization of interictal epileptic spikes to the resection cavity or RFTC lesion, respectively, as the reference for localization accuracy. We found that beamformer localization can be sensitive to the choice of the regularization parameter, which therefore has to be optimized individually. Estimating the covariance matrix from averaged spike data yielded more robust results across the modalities. MEG was the dominant modality and provided good localization in one case, while EEG was dominant in the other. When the modalities were combined, the good results of the dominant modality were mostly preserved rather than degraded by the weaker one. For appropriate choices of the regularization parameter, the beamformer localized better than the standard dipole scan. Compared with the importance of appropriate regularization, the localization was less sensitive to the head model, owing to the similar skull conductivity modelling and the fixed source space without orientation constraint.
Full text: PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8796031 | DOI: http://dx.doi.org/10.3390/brainsci12010114
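The interplay of covariance estimation and regularization described in the abstract above can be made concrete with a short sketch. The following is a minimal, hypothetical example using MNE-Python's LCMV beamformer; it assumes spike-locked `epochs` and a forward model `fwd` already exist, and the time windows and regularization values are illustrative, not those optimized in the study.

```python
# Hedged sketch: LCMV beamformer localization of averaged interictal spikes
# with MNE-Python. `epochs` (spike-locked Epochs) and `fwd` (e.g., a calibrated
# six-compartment forward model) are assumed to exist; all parameter values
# below are illustrative placeholders.
import mne
from mne.beamformer import make_lcmv, apply_lcmv

noise_cov = mne.compute_covariance(epochs, tmax=-0.05)            # pre-spike baseline
data_cov = mne.compute_covariance(epochs, tmin=-0.02, tmax=0.05)  # around the spike peak

evoked = epochs.average()  # averaging spikes stabilizes the covariance estimate

# Scan a range of regularization strengths; the study found the optimum
# to be patient- and modality-specific.
for reg in (0.01, 0.05, 0.1, 0.5):
    filters = make_lcmv(evoked.info, fwd, data_cov, reg=reg,
                        noise_cov=noise_cov, pick_ori='max-power',
                        weight_norm='unit-noise-gain')
    stc = apply_lcmv(evoked, filters)
    peak_vertex, peak_time = stc.get_peak()
    print(f"reg={reg}: peak source at vertex {peak_vertex}, t={peak_time:.3f} s")
```

In a validation setting like the study's, the printed peak location for each `reg` would then be compared against the resection cavity or RFTC lesion.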
Hum Brain Mapp
January 2025
Montreal Neurological Institute, McGill University, Montréal, Quebec, Canada.
Perception and production of music and speech rely on auditory-motor coupling, a mechanism which has been linked to temporally precise oscillatory coupling between auditory and motor regions of the human brain, particularly in the beta frequency band. Recently, brain imaging studies using magnetoencephalography (MEG) have also shown that accurate auditory temporal predictions specifically depend on phase coherence between auditory and motor cortical regions. However, it is not yet clear whether this tight oscillatory phase coupling is an intrinsic feature of the auditory-motor loop, or whether it is only elicited by task demands.
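As a concrete illustration of the phase coupling mentioned above, the following is a minimal sketch of one common measure, the phase-locking value (PLV), computed between two beta-band-filtered signals. The synthetic signals, sampling rate, and band edges are illustrative assumptions, not the analysis pipeline of the cited work.

```python
# Hedged sketch: phase-locking value (PLV) between two band-limited signals,
# one common way to quantify auditory-motor phase coupling in the beta band.
# `aud` and `mot` stand in for auditory- and motor-region time courses.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(aud, mot, fs, band=(15.0, 30.0)):
    b, a = butter(4, band, btype='bandpass', fs=fs)   # beta-band filter
    phase_a = np.angle(hilbert(filtfilt(b, a, aud)))  # instantaneous phases
    phase_m = np.angle(hilbert(filtfilt(b, a, mot)))
    # PLV: magnitude of the mean phase-difference vector (1 = perfect locking)
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_m))))

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
aud = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)      # toy signal
mot = np.sin(2 * np.pi * 20 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"beta-band PLV: {plv(aud, mot, fs):.2f}")
```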
Neuroimage Clin
January 2025
Clinical Neurophysiology Research Laboratory, Western Psychiatric Hospital, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA.
Predicting symptom progression in first-episode psychosis (FEP) is crucial for tailoring treatment and improving outcomes. Temporal lobe function, indexed by neurophysiological biomarkers such as the N100, predicts symptom progression and correlates with untreated psychosis. Our recent report showed that source-localized magnetoencephalography (MEG) M100 responses to tones in an oddball paradigm predicted recovery of positive symptoms in FEP.
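To make the prediction idea concrete, here is a deliberately toy sketch of relating an M100-type amplitude to a binary recovery label with scikit-learn; the data, feature, and model are hypothetical placeholders and do not reproduce the cited study's analysis.

```python
# Hedged sketch: a toy biomarker-to-outcome prediction. The M100 amplitudes
# and recovery labels below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
m100_amp = rng.normal(20, 5, size=60)                            # per-patient amplitude (a.u.)
recovered = (m100_amp + rng.normal(0, 4, 60) > 20).astype(int)   # toy outcome labels

X = m100_amp.reshape(-1, 1)                  # single-feature design matrix
clf = LogisticRegression()
acc = cross_val_score(clf, X, recovered, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")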
Neurosci Biobehav Rev
January 2025
Experimental Therapeutics and Pathophysiology Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA; Department of Psychiatry and Psychotherapy, Jena University Hospital, Jena, Germany.
Understanding how the brain distinguishes emotional from neutral scenes is crucial for advancing brain-computer interfaces, enabling real-time emotion detection for faster, more effective responses, and improving treatments for emotional disorders like depression and anxiety. However, inconsistent research findings have arisen from differences in study settings, such as variations in the time windows, brain regions, and emotion categories examined across studies. This review sought to compile the existing literature on the timing at which the adult brain differentiates basic affective from neutral scenes in less than one second, as previous studies have consistently shown that the brain can begin recognizing emotions within just a few milliseconds.
Neuroimage
January 2025
Dept. of Electrical and Computer Engineering, Worcester Polytechnic Institute, Worcester, MA, USA.
A fast BEM (boundary element method) based approach is developed to solve an EEG/MEG forward problem for a modern high-resolution head model. The method utilizes a charge-based BEM accelerated by the fast multipole method (BEM-FMM) with an adaptive mesh pre-refinement method (called b-refinement) close to the singular dipole source(s). No costly matrix-filling or direct solution steps typical for the standard BEM are required; the method generates on-skin voltages as well as MEG magnetic fields for high-resolution head models within 90 s after initial model assembly using a regular workstation.
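The core operation behind a charge-based BEM can be sketched compactly: on-skin voltages arise as a superposition of potentials from surface charges, and it is exactly this dense sum that the fast multipole method accelerates. The sketch below is a direct (unaccelerated) evaluation with placeholder geometry and charges; it illustrates the principle only and is not the cited solver's implementation.

```python
# Hedged sketch of the core sum in a charge-based BEM: the electric potential
# at observation points is a superposition of contributions from charges on
# facet centroids. This double sum is O(N*M); BEM-FMM replaces exactly this
# evaluation with a fast multipole computation. All arrays are placeholders.
import numpy as np

eps0 = 8.854187817e-12  # vacuum permittivity (F/m)

def potential(obs_pts, src_pts, charges):
    # obs_pts: (M, 3) m, src_pts: (N, 3) m, charges: (N,) C
    r = obs_pts[:, None, :] - src_pts[None, :, :]   # (M, N, 3) separation vectors
    dist = np.linalg.norm(r, axis=-1)               # (M, N) distances
    return (charges / (4 * np.pi * eps0 * dist)).sum(axis=1)

rng = np.random.default_rng(1)
src = rng.standard_normal((2000, 3)) * 0.08    # "mesh" centroids (m)
q = rng.standard_normal(2000) * 1e-12          # facet charges (C)
skin = rng.standard_normal((500, 3)) * 0.09    # on-skin observation points (m)
print(potential(skin, src, q)[:3])             # voltages at first 3 points (V)
```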
Biol Psychol
December 2024
Institute for Biomagnetism and Biosignal Analysis, University of Münster, Germany; Otto Creutzfeldt Center for Cognitive and Behavioral Neuroscience, University of Münster, Germany.
The ventromedial prefrontal cortex is widely linked with emotional phenomena, including appraisal, modulation, and reward processing. Its perigenual part has been suggested to mediate the appetitive value of stimulation. In our previous study, besides changes in evoked MEG responses, excitatory tDCS over the perigenual ventromedial cortex (pgVM) induced an apparent behavioral bias toward more positive valence in the interpretation of ambiguous, morphed faces.