| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2802702 | PMC |
| http://dx.doi.org/10.1258/jrsm.2009.090268 | DOI Listing |
Vet Sci
December 2024
Center for Animals and Public Policy, Cummings School of Veterinary Medicine, Tufts University, 200 Westboro Rd., North Grafton, MA 01536, USA.
Youth mental health interventions incorporating trained therapy animals are increasingly popular, but more research is needed to understand the specific interactive behaviors between participants and therapy dogs. Understanding the role of these interactive behaviors is important for supporting both intervention efficacy and animal welfare and well-being. The goal of this study was to develop ethograms to assess the interactive behaviors (both affiliative and stress-related) of participants and therapy dogs during a social stress task, to explore the relationship between human and dog behaviors, and to examine how these behaviors differ across experimental conditions with varying levels of physical contact with the therapy dog.
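The abstract does not describe the analysis itself, but one plausible way to explore the relationship between human and dog behaviors is to correlate per-session behavior frequencies coded from the two ethograms. A minimal Python sketch under that assumption (all behavior labels and counts below are hypothetical, not taken from the study):

```python
import pandas as pd

# Hypothetical per-session counts coded from the two ethograms;
# behavior labels are illustrative, not taken from the study.
sessions = pd.DataFrame({
    "participant_petting": [12, 5, 9, 14, 3],
    "participant_talking_to_dog": [7, 2, 6, 10, 1],
    "dog_lip_licking": [3, 8, 4, 2, 9],    # stress-related
    "dog_leaning_in": [10, 4, 8, 12, 2],   # affiliative
})

# Spearman correlation tolerates the skewed counts typical of
# behavioral coding; the result shows human-dog associations.
human_cols = ["participant_petting", "participant_talking_to_dog"]
dog_cols = ["dog_lip_licking", "dog_leaning_in"]
corr = sessions.corr(method="spearman").loc[human_cols, dog_cols]
print(corr.round(2))
```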
Palliat Support Care
April 2024
Faculty of Health, University of Technology Sydney, Improving Palliative, Aged and Chronic Care through Clinical Research and Translation (IMPACCT), Sydney, NSW, Australia.
J Prev Alzheimers Dis
January 2024
Jeffrey N. Motter, Department of Psychiatry, Division of Geriatric Psychiatry, 1051 Riverside Drive, New York, NY 10032, United States.
Background: Computerized cognitive training (CCT) has emerged as a potential treatment option for mild cognitive impairment (MCI). It remains unclear whether CCT's effect is driven in part by expectancy of improvement.
Objectives: This study aimed to identify factors associated with therapeutic expectancy and to determine whether expectancy influenced treatment effects in a randomized clinical trial of CCT versus crossword puzzle training (CPT) for older adults with MCI.
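A standard way to test whether expectancy influences treatment effects is a regression with a treatment-by-expectancy interaction. The sketch below illustrates that idea on simulated data; the variable names, effect sizes, and outcome measure are assumptions, not the trial's actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 100

# Simulated trial data: treatment arm (CCT=1, CPT=0), baseline
# therapeutic expectancy, and a cognitive change score. All
# values and the outcome measure are illustrative only.
df = pd.DataFrame({
    "cct": rng.integers(0, 2, n),
    "expectancy": rng.normal(0, 1, n),
})
df["cog_change"] = (0.3 * df["cct"] + 0.2 * df["expectancy"]
                    + 0.25 * df["cct"] * df["expectancy"]
                    + rng.normal(0, 1, n))

# A significant cct:expectancy interaction would indicate that
# the treatment effect depends on expectancy of improvement.
model = smf.ols("cog_change ~ cct * expectancy", data=df).fit()
print(model.summary().tables[1])
```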
Radiology
October 2023
From the Department of Radiology, Division of Neuroradiology, Alzheimer Disease Imaging Research Laboratory (C.O.L., J.R.P.), and Neurocognitive Disorders Program, Departments of Psychiatry and Medicine (P.M.D.), Duke University Medical Center, DUMC-Box 3808, Durham, NC 27710-3808; and Duke Institute for Brain Sciences (P.M.D.) and Department of Electrical and Computer Engineering, Department of Computer Science, Department of Biostatistics and Bioinformatics (L.Z., M.A.M.), Duke University, Durham, NC.
Background: PET can be used for amyloid-tau-neurodegeneration (ATN) classification in Alzheimer disease but incurs considerable cost and exposure to ionizing radiation. MRI currently has limited use in characterizing ATN status. Deep learning techniques can detect complex patterns in MRI data and have potential for noninvasive characterization of ATN status.
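The abstract does not specify the network, but the general idea (mapping a structural MRI volume to binary amyloid/tau/neurodegeneration labels) can be illustrated with a small 3D convolutional network. A toy PyTorch sketch; the architecture, input size, and three-logit multi-task head are assumptions, not the study's model:

```python
import torch
import torch.nn as nn

class ATNNet(nn.Module):
    """Toy 3D CNN mapping an MRI volume to three binary logits
    (amyloid, tau, neurodegeneration). Purely illustrative; the
    study's actual architecture is not given in the abstract."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(16, 3)  # one logit per ATN axis

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)

# One synthetic 64^3 volume stands in for a preprocessed MRI.
volume = torch.randn(1, 1, 64, 64, 64)
logits = ATNNet()(volume)
print(torch.sigmoid(logits))  # per-axis positivity probabilities
```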
J Speech Lang Hear Res
August 2023
ACTE, LaDisco and ULB Neuroscience Institute, Université Libre de Bruxelles, Brussels, Belgium.
Purpose: Our study addresses three main questions: (a) Do autistics and neurotypicals produce different patterns of disfluencies, depending on the experimenter's direct versus averted gaze? (b) Are these patterns correlated with gender, skin conductance responses, fixations on the experimenter's face, alexithymia, or social anxiety scores? Lastly, (c) can eye-tracking and electrodermal activity data be used to distinguish listener- versus speaker-oriented disfluencies?
Method: Within a live face-to-face paradigm combining a wearable eye-tracker with electrodermal activity sensors, 80 adults (40 autistics, 40 neurotypicals) defined words in front of an experimenter who was either staring at their eyes (direct gaze condition) or looking elsewhere (averted gaze condition).
Results: Autistics produce fewer listener-oriented and more speaker-oriented (prolongations, breath) disfluencies than neurotypicals. In both groups, men produce fewer disfluencies than women.
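For question (c), one plausible approach is to feed per-disfluency gaze and electrodermal features to a simple classifier. The scikit-learn sketch below runs on simulated data; the feature set, labeling rule, and model choice are assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200

# Simulated per-disfluency features: skin conductance response
# amplitude and proportion of fixation on the experimenter's
# face. Names and labeling rule are illustrative only.
X = np.column_stack([
    rng.normal(0.5, 0.2, n),  # SCR amplitude (microsiemens)
    rng.uniform(0, 1, n),     # face-fixation proportion
])
# 1 = listener-oriented, 0 = speaker-oriented (toy labels).
y = (X[:, 1] + rng.normal(0, 0.2, n) > 0.5).astype(int)

clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print("Mean CV accuracy:", scores.mean().round(2))
```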