Technology and Dementia Preconference.

Alzheimers Dement

Cumulus Neuroscience, Dublin, Ireland.

Published: December 2024

Background: Current tools for Alzheimer's disease screening and staging used in clinical research (e.g., ACE-3, ADAS-Cog) require substantial face-to-face time with trained professionals and may be affected by subjectivity, "white coat syndrome", and other biases. Alzheimer's symptomatology is multi-factorial and varies from day to day. To enable stratification for precision treatments, more effective composite endpoints are required, with accurate quantification of functional impairment in individual domains. Ideally, these would be measured objectively and frequently in real-world contexts to improve validity and to reduce clinical burden in trial and care settings.

Method: We present a machine-learning stratification analysis using data from the Cumulus real-world multi-domain neuroassessment platform (Figure 1) in CNS101, a year-long study designed with a consortium of 10 pharma companies. Participants completed a range of tablet-based functional tasks with wake EEG, and separately recorded EEG during sleep. This cross-sectional analysis uses 1111 sessions from repeated sampling of 101 participants (47 patients with mild Alzheimer's-type dementia and 54 corresponding controls, recruited across 7 sites in the UK) during an early two-week burst period of autonomous at-home use. Bagging of decision trees was used to compare the power of different data sources in discriminating dementia from neurotypical states; this algorithm is widely used for its performance across learning tasks (linear and non-linear) and across heterogeneous datasets of differing size, noise level, and collinearity. For each set of input data, 100 classifiers were trained and evaluated (10-fold cross-validation in each of 10 random partitions), and the mean Area Under the Curve (AUC) of the sensitivity/specificity trade-off was calculated. A sketch of this protocol follows.
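To make the evaluation protocol concrete, the following is a minimal Python sketch (assuming scikit-learn 1.2+ and synthetic stand-in data; the feature subsets, sizes, and bagging hyperparameters are illustrative assumptions, not the study's actual pipeline). It trains bagged decision trees under 10-fold cross-validation in each of 10 random partitions and averages the ROC AUC per feature set:

# A minimal sketch of the evaluation protocol, on synthetic stand-in data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Stand-in for 1111 sessions x 691 multimodal features
# (label 1 = mild Alzheimer's-type dementia, 0 = control).
X, y = make_classification(n_samples=1111, n_features=691,
                           n_informative=50, random_state=0)

# Hypothetical column subsets standing in for the feature sets compared
# in the abstract (full multimodal set vs. a smaller single-domain set).
feature_sets = {
    "all 691 multimodal features": slice(None),
    "first 50 features (single domain, illustrative)": slice(0, 50),
}

# 10-fold cross-validation in each of 10 random partitions
# = 100 train/evaluate runs per feature set, as described above.
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)

for name, cols in feature_sets.items():
    # Bagging of decision trees, as named in the Method.
    clf = BaggingClassifier(estimator=DecisionTreeClassifier(),
                            n_estimators=100, random_state=0)
    aucs = cross_val_score(clf, X[:, cols], y, scoring="roc_auc", cv=cv)
    print(f"{name}: mean AUC = {aucs.mean():.3f} over {len(aucs)} runs")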

Result: The full set of 691 multimodal features yielded a classifier with an AUC of 0.953 and an error rate of 4.7% (Figures 2, 3), compared with an AUC of 0.921 and an error rate of 7.9% for the set of 13 ADAS-Cog subscores. While individual modalities/domains all performed above chance, the 50 features from the Cumulus cognitive tasks alone achieved an AUC of 0.955.

Conclusion: Data collected from patients using digital technology autonomously at home can match or exceed the discriminative power of a traditional composite scale. This can enable objective precision measurement of disease severity at scale.


Source: http://dx.doi.org/10.1002/alz.094323

Publication Analysis

Top Keywords

Keyword                    Frequency
auc performance            8
error rate                 8
technology dementia        4
dementia preconference     4
preconference background   4
background current         4
current tools              4
tools alzheimer's          4
alzheimer's disease        4
disease screening          4
