Recently developed accounts of language comprehension propose that sentences are understood by constructing a perceptual simulation of the events being described. These simulations involve the re-activation of patterns of brain activation that were formed during the comprehender's interaction with the world. In two experiments we explored the specificity of the processing mechanisms required to construct simulations during language comprehension. Participants listened to (and made judgments on) sentences that described motion in a particular direction (e.g. "The car approached you"). They simultaneously viewed dynamic black-and-white stimuli that produced the perception of movement either in the same direction as the action described in the sentence (i.e. towards you) or in the opposite direction (i.e. away from you). Responses were faster to sentences presented concurrently with a visual stimulus depicting motion in the direction opposite to the action described in the sentence. This suggests that the processing mechanisms recruited to construct simulations during language comprehension are also used during visual perception, and that these mechanisms can be quite specific.
DOI: http://dx.doi.org/10.1016/j.cognition.2004.06.005
Tech Coloproctol
January 2025
Ellen Leifer Shulman and Steven Shulman Digestive Disease Center, Cleveland Clinic Florida, 2950 Cleveland Clinic Blvd, Weston, FL, USA.
Introduction: Chatbots have been increasingly used as a source of patient education. This study aimed to compare the answers of ChatGPT-4 and Google Gemini to common questions on benign anal conditions in terms of appropriateness, comprehensiveness, and language level.
Methods: Each chatbot was asked a set of 30 questions on hemorrhoidal disease, anal fissures, and anal fistulas.
BMC Med Educ
January 2025
Bangladesh Medical College Hospital, Dhaka, 1209, Bangladesh.
Background: The involvement of undergraduate medical students in research is pivotal for the advancement of evidence-based clinical practice. This study aimed to assess the extent of research involvement and the factors influencing it among undergraduate medical students in Bangladesh.
Methods: A multi-center cross-sectional study involving 2864 medical students from both public and private medical colleges was conducted between June and December 2023.
Eur Arch Otorhinolaryngol
January 2025
Faculty of Applied Sciences, Department of Accounting and Financial Management, Necmettin Erbakan University, Konya, Turkey.
Purpose: Vestibular neuritis (VN) is a common cause of vertigo with significant impact on patients' quality of life. This study aimed to analyze global research trends in VN using bibliometric methods to identify key themes, influential authors, institutions, and countries contributing to the field.
Methods: We conducted a comprehensive search of the Web of Science Core Collection database for publications related to VN from 1980 to 2024.
Objective: To explore the lived experiences and extent of cognitive symptoms in Long COVID (LC) in a UK-based sample.
Design: This study implemented a mixed-methods design. Eight focus groups were conducted to collect qualitative data, and Framework Analysis was used to reveal the experiences and impact of cognitive symptoms.
J Biomed Inform
January 2025
University of Manchester, United Kingdom.
Objective: Extracting named entities from clinical free-text presents unique challenges, particularly when dealing with discontinuous entities: mentions whose constituent words are separated by unrelated text. Traditional NER methods often struggle to identify these entities accurately, prompting the development of specialised computational solutions. This paper systematically reviews the methodologies developed for Discontinuous Named Entity Recognition in clinical texts, highlighting their effectiveness and the challenges they face.
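To illustrate the discontinuity problem the abstract describes, here is a minimal sketch of one common way to represent a discontinuous mention: as a labelled set of character spans rather than a single contiguous span. This is an illustration only, not the reviewed paper's method; the `DiscontinuousMention` class, the `SYMPTOM` label, and the example sentence are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DiscontinuousMention:
    """An entity mention made of one or more (start, end) character spans."""
    label: str
    spans: list[tuple[int, int]]  # sorted, non-overlapping character offsets

    def surface_form(self, text: str) -> str:
        # Join the span fragments, skipping the unrelated words in the gaps.
        return " ".join(text[start:end] for start, end in self.spans)

text = "The patient reports severe joint and muscle pain."
# "joint pain" is discontinuous: "joint" (27-32) and "pain" (44-48)
# are separated by the unrelated words "and muscle".
mention = DiscontinuousMention(label="SYMPTOM", spans=[(27, 32), (44, 48)])
print(mention.surface_form(text))  # -> "joint pain"
```

A contiguous-span tagger (e.g. plain BIO tagging) cannot express this mention without also swallowing "and muscle", which is why discontinuous NER calls for span- or graph-based representations like the one sketched above.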