Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks.

Front Hum Neurosci

Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Advanced Magnetic Imaging Centre, Aalto NeuroImaging, Aalto University, Espoo, Finland; Helsinki Collegium for Advanced Studies, University of Helsinki, Helsinki, Finland; Swedish Collegium for Advanced Study, Uppsala, Sweden.

Published: March 2015

AI Article Synopsis

  • The study utilized fMRI to explore brain activity while participants judged sentence congruence in visual, auditory, or both modalities.
  • Significant performance drops occurred when attention was split between modalities compared to focusing on one, indicating interference due to shared cortical processing.
  • Dual-tasking increased activation in medial and lateral frontal regions without recruiting additional brain areas, and no crossmodal inhibition was observed in the sensory cortices.

Article Abstract

Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dual-tasking) did not recruit additional cortical regions, but resulted in increased activity in medial and lateral frontal regions which were also activated by the component tasks when performed separately. Areas involved in semantic language processing were revealed predominantly in the left lateral prefrontal cortex by contrasting incongruent with congruent sentences. These areas also showed significant activity increases during divided attention in relation to selective attention. In the sensory cortices, no crossmodal inhibition was observed during divided attention when compared with selective attention to one modality. Our results suggest that the observed performance decrements during dual-tasking are due to interference of the two tasks because they utilize the same part of the cortex. Moreover, semantic dual-tasking did not appear to recruit additional brain areas in comparison with single tasking, and no crossmodal inhibition was observed during intermodal divided attention.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4333810
DOI: http://dx.doi.org/10.3389/fnhum.2015.00086

Publication Analysis

Top Keywords

selective attention: 16
divided attention: 16
attention: 9
brain activity: 8
performance decrements: 8
compared selective: 8
single tasking: 8
recruit additional: 8
crossmodal inhibition: 8
inhibition observed: 8
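The counts above appear to be term-frequency figures for words and word pairs drawn from the title and abstract. As an illustration only, here is a minimal Python sketch of that kind of unigram/bigram counting; the tokenization, stopword list, and the top_keywords helper are assumptions made for this example, not the site's actual method.

from collections import Counter
import re

# Illustrative only: tally unigrams and bigrams in an article's text,
# the kind of count that could underlie a "Top Keywords" table.
STOPWORDS = {"the", "a", "an", "of", "in", "to", "and", "or", "with",
             "was", "were", "we", "our", "when", "one", "two"}

def top_keywords(text, n=10):
    # Lowercase and split on anything that is not a letter.
    tokens = [t for t in re.split(r"[^a-z]+", text.lower()) if t]
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok not in STOPWORDS:
            counts[tok] += 1                      # unigram
        if i + 1 < len(tokens):
            nxt = tokens[i + 1]
            if tok not in STOPWORDS and nxt not in STOPWORDS:
                counts[tok + " " + nxt] += 1      # bigram
    return counts.most_common(n)

# Example usage: top_keywords(title + " " + abstract, 10)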
