Objective: We used a combination of simulation and recordings from human subjects to characterize how saccadic eye movements affect the magnetoencephalogram (MEG).

Methods: We used simulated saccadic eye movements to generate simulated MEG signals. We also recorded MEG signals from three healthy adults during 5° saccades in four directions: vertical up and down, and horizontal left and right.

Results: The signal elicited by the rotating eye dipoles depends strongly on saccade direction, can cover a large sensor area, and can sometimes follow a non-intuitive trajectory, but it does not extend appreciably above approximately 30 Hz in the frequency domain. In contrast, the saccadic spikes (primarily monophasic pulses, though sometimes biphasic) are tightly localized to the lateral frontal regions for all saccade directions and extend past 60 Hz in the frequency domain.
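The spectral split described above can be illustrated with a minimal synthetic sketch (not the authors' actual pipeline): a smooth eye-rotation waveform with a ~20 ms transition versus a brief ~5 ms spike pulse, comparing the fraction of spectral power each carries above the reported cutoffs. The sampling rate, waveform shapes, and durations are illustrative assumptions.

```python
# Hedged illustration: compare spectra of two idealized saccade-related
# waveforms. All parameters are assumptions, not values from the study.
import numpy as np

fs = 1000                       # assumed sampling rate (Hz)
t = np.arange(0, 0.5, 1 / fs)   # 500 ms epoch

# Eye-rotation artifact: smooth sigmoidal field shift (~20 ms transition),
# standing in for the slowly rotating corneo-retinal dipole.
rotation = 1 / (1 + np.exp(-(t - 0.25) * 200))

# Saccadic spike: brief (~5 ms FWHM) monophasic pulse at saccade onset.
spike = np.exp(-0.5 * ((t - 0.25) / 0.002) ** 2)

def band_fraction(x, f_lo, f_hi):
    """Fraction of total spectral power between f_lo and f_hi (Hz)."""
    w = np.hanning(len(x))                    # taper to suppress edge leakage
    power = np.abs(np.fft.rfft((x - x.mean()) * w)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return power[band].sum() / power.sum()

# The smooth rotation waveform carries almost no power above 30 Hz,
# while the brief spike retains substantial power past 60 Hz.
print("rotation power above 30 Hz:", round(band_fraction(rotation, 30, fs / 2), 4))
print("spike power above 60 Hz:   ", round(band_fraction(spike, 60, fs / 2), 4))
```

The contrast falls out of basic time-frequency scaling: a waveform smooth on a ~20 ms scale is band-limited to a few tens of Hz, whereas a millisecond-scale pulse necessarily spreads into the gamma band.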

Conclusions: Gamma band saccadic artifact is spatially localized to small regions regardless of saccade direction, but beta band and lower frequency saccadic artifact have broader spatial extents that vary strongly as a function of saccade direction.

Significance: Here we have characterized the MEG saccadic artifact in both the spatial and frequency domains for saccades of different directions. This could be important in ruling in or ruling out artifact in MEG recordings.

Source: http://dx.doi.org/10.1016/j.clinph.2016.12.013
