Background: Intelligent robots are playing an increasingly important role in therapy for people with dementia. However, more extensive research is still needed to evaluate their impact on the behavioral and psychological symptoms of dementia, as well as on quality of life across different care settings.

Objective: This study aims to systematically evaluate the effectiveness of intelligent robot interventions for patients with dementia.

Methods: In accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) 2020 guidelines, a comprehensive search of PubMed, CINAHL, the Cochrane Library, Embase, and Web of Science was conducted from database inception to February 2024 to identify relevant randomized controlled trials on the use of intelligent robots in people with dementia. Two authors (WF and RZ) independently assessed the quality of the included studies using the Cochrane Collaboration risk-of-bias tool. The effects of intelligent robot interventions on patients with dementia were pooled using a fixed-effect or random-effects model in Stata software (version 16.0; StataCorp). Subgroup analyses were performed by intelligent robot type and intervention duration. Publication bias was assessed using funnel plots, Egger tests, and the trim-and-fill method.
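For readers unfamiliar with the pooling step described above, the following is a minimal illustrative sketch (not the authors' Stata code) of how standardized mean differences from several trials can be combined under fixed-effect (inverse-variance) and DerSimonian-Laird random-effects models; the per-study effect sizes and variances are hypothetical placeholders, not data from the included trials.

```python
import math

# Hypothetical per-study standardized mean differences (SMD) and variances.
# These numbers are placeholders, not data from the included trials.
smd = [-0.45, -0.20, -0.55, -0.10]
var = [0.04, 0.06, 0.05, 0.08]

# Fixed-effect (inverse-variance) weights and pooled estimate.
w_fixed = [1.0 / v for v in var]
pooled_fixed = sum(w * d for w, d in zip(w_fixed, smd)) / sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2.
q = sum(w * (d - pooled_fixed) ** 2 for w, d in zip(w_fixed, smd))
df = len(smd) - 1
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each study's sampling variance.
w_rand = [1.0 / (v + tau2) for v in var]
pooled_rand = sum(w * d for w, d in zip(w_rand, smd)) / sum(w_rand)
se_rand = math.sqrt(1.0 / sum(w_rand))

print(f"Pooled SMD (random effects): {pooled_rand:.2f}")
print(f"95% CI: {pooled_rand - 1.96 * se_rand:.2f} to {pooled_rand + 1.96 * se_rand:.2f}")
```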

Results: In total, 15 studies encompassing 705 participants were included in the systematic review, of which 12 were included in the meta-analysis. Compared with the control group, intelligent robot interventions significantly reduced agitation (standardized mean difference -0.36, 95% CI -0.56 to -0.17; P<.001) and anxiety (weighted mean difference -1.93, 95% CI -3.13 to -0.72; P=.002) in patients with dementia. However, intelligent robot interventions had no significant effect on cognitive function, neuropsychiatric symptoms, depression, quality of life, daytime step count, or hours spent lying down at night (all P>.05). Subgroup analysis suggested that the improvement in depression was related to the intervention duration (≤12 vs >12 weeks: 0.08, 95% CI -0.20 to 0.37 vs -0.68, 95% CI -1.00 to -0.37; P=.26) and was independent of the type of intelligent robot (animal robots vs humanoid robots: -0.30, 95% CI -0.75 to 0.15 vs 0.07, 95% CI -0.21 to 0.34; P=.26).
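As a rough consistency check, the significance of a pooled effect can be recovered from its point estimate and 95% CI: the standard error is the CI width divided by 2 × 1.96, and the z statistic is the estimate divided by that standard error. A short sketch using the agitation result reported above (illustrative only):

```python
from math import erf, sqrt

# Pooled agitation effect reported in the Results: SMD -0.36, 95% CI -0.56 to -0.17.
smd, ci_low, ci_high = -0.36, -0.56, -0.17

se = (ci_high - ci_low) / (2 * 1.96)              # back-calculated standard error
z = smd / se                                      # z statistic
p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided P value from the normal CDF

print(f"SE ≈ {se:.3f}, z ≈ {z:.2f}, P ≈ {p:.4f}")  # consistent with the reported P<.001
```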

Conclusions: This study shows that intelligent robot interventions can help reduce agitation and anxiety in people with dementia. Longer interventions may be more effective, whereas the robot's appearance does not appear to influence the intervention effect. Further research is needed to collect physiological data, such as physical activity, in people with dementia; to explore how other intelligent robot design features affect the intervention effect; and to provide a reference for improving intelligent robots and intervention programs.

Trial Registration: PROSPERO CRD42024523007; https://tinyurl.com/mwscn985.

Source: http://dx.doi.org/10.2196/59892
