Although structural priming seems to rely on the same mechanisms in production and comprehension, effects are not always consistent between the two modalities. Methodological differences often result in different data types, namely choice data in production and reaction time data in comprehension. In a structural priming experiment with English ditransitives, we collected both choice data and reaction time data in both modalities. The choice data showed priming of both the DO and the PO dative. The reaction times revealed priming of the PO dative: in production, PO targets were chosen faster after a PO prime than after a baseline prime, while in comprehension, DO targets were read more slowly after a PO prime than after a baseline prime. This result can be explained by competition between alternatives during structure selection: priming leads to facilitation of the primed structure or to inhibition of the competing structure, depending on the relative frequency of the structures, which may differ across modalities.
DOI: http://dx.doi.org/10.1080/02643294.2023.2279735
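The competition account in the final sentences of the abstract can be made concrete with a small toy simulation. The sketch below is purely illustrative and is not the authors' model: it assumes a Luce-style competition rule in which selection latency is inversely related to the target structure's share of total activation, and the baseline frequencies, prime boost, and time scale are invented numbers. Under these assumptions, a PO prime both speeds selection of a PO target (facilitation of the primed structure) and slows selection of a DO target (inhibition of the competitor), with the assumed baseline frequencies in each modality modulating the size of the two effects.

```python
# Toy illustration (not the authors' model) of a frequency-weighted
# competition account of structural priming.  All numbers are made up.

def selection_time(target, base, prime=None, boost=0.5, scale=1.0):
    """Hypothetical latency to settle on `target` ('DO' or 'PO').

    Activations start at each structure's assumed baseline frequency;
    a prime adds a fixed boost to the primed structure.  Latency is
    inversely related to the target's share of total activation
    (a Luce-style competition rule).
    """
    act = dict(base)                 # e.g. {'DO': 0.7, 'PO': 0.3}
    if prime is not None:
        act[prime] += boost          # priming raises the primed structure
    share = act[target] / sum(act.values())
    return scale / share             # stronger competition -> slower

# Production-like setting: assume PO is the less frequent alternative.
prod_base = {'DO': 0.7, 'PO': 0.3}
print('PO target, baseline prime:', round(selection_time('PO', prod_base), 2))
print('PO target, PO prime:      ', round(selection_time('PO', prod_base, 'PO'), 2))

# Comprehension-like setting: assume DO is relatively more expected.
comp_base = {'DO': 0.6, 'PO': 0.4}
print('DO target, baseline prime:', round(selection_time('DO', comp_base), 2))
print('DO target, PO prime:      ', round(selection_time('DO', comp_base, 'PO'), 2))
```

With these invented parameters, the PO prime shortens the latency for PO targets relative to baseline and lengthens it for DO targets, mirroring the facilitation-in-production and inhibition-in-comprehension pattern described above.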