In recent years, the analysis of movement patterns has increasingly focused on the individuality of movements. After it was long assumed that movements show only weak individuality, strong individuality is now widely accepted, and the first situation-dependent fine structures within it have already been identified. Methodologically, however, only signals of the same movement have been compared so far. The goal of this work is to detect cross-movement commonalities of individual walking, running, and handwriting patterns using data augmentation. A total of 17 healthy adults (35.8 ± 11.1 years, eight women and nine men) each performed 627.9 ± 129.0 walking strides, 962.9 ± 182.0 running strides, and 59.25 ± 1.8 handwriting samples. In the first step, a conditional cycle-consistent generative adversarial network (CycleGAN), conditioned on the participant's class, was used to learn pairwise transformations between the vertical ground reaction force during walking and running and the vertical pen pressure during handwriting. In the second step, the original data of each movement were used to artificially generate data of the other movements. In the third step, it was tested whether the artificially generated data could be correctly assigned to a person by a support vector machine classifier trained on original data of the respective movement. The classification F1-score ranged from 46.8% for handwriting data generated from walking data to 98.9% for walking data generated from running data. Thus, cross-movement individual patterns could be identified. The methodology presented in this study may therefore help to enable cross-movement analyses and the artificial generation of larger amounts of data.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10436554
DOI: http://dx.doi.org/10.3389/fbioe.2023.1204115
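As a rough illustration of the third step described in the abstract above, the following scikit-learn sketch trains a person classifier on original data of one movement and scores artificially generated data of the same movement with the F1 metric. All array shapes, the random placeholder data, and the macro averaging are assumptions made for this sketch and are not taken from the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the real signals: each row is one time-normalized
# cycle (here 100 samples), each label a participant (17 participants, as in the study).
n_participants, n_cycles, n_samples = 17, 50, 100
X_original = rng.normal(size=(n_participants * n_cycles, n_samples))   # e.g. original walking vGRF cycles
y_original = np.repeat(np.arange(n_participants), n_cycles)

# In the real pipeline these would be the conditional CycleGAN outputs,
# e.g. walking cycles generated from the same participants' running data.
X_generated = rng.normal(size=(n_participants * n_cycles, n_samples))
y_generated = np.repeat(np.arange(n_participants), n_cycles)

# Train the person classifier on original data of the target movement only ...
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_original, y_original)

# ... then test whether the artificially generated data is assigned to the right person.
y_pred = clf.predict(X_generated)
print("F1-score:", f1_score(y_generated, y_pred, average="macro"))
```

A high F1-score on such generated data indicates that person-specific structure survives the cross-movement transformation, which is how the reported 46.8% to 98.9% range is to be read.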
JASA Express Lett
January 2025
Department of Imaging Sciences, University of Rochester, Rochester, New York 14642, USA.
Ultrasound tomography fundamentally relies on low-frequency data to avoid cycle skipping in full-waveform inversion (FWI). In the absence of sufficiently low-frequency data, we can extrapolate low-frequency content from existing high-frequency signals using the same approach as frequency-difference beamforming. This low-frequency content is then used to kickstart FWI and avoid cycle skipping at higher frequencies.
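As a loose, self-contained illustration of the frequency-difference idea referred to above, and not the authors' FWI pipeline, the NumPy sketch below shows how the product of two in-band spectral components of a purely high-frequency, delayed arrival carries the phase that a component at their difference frequency would have accumulated over the same delay; every parameter in it is hypothetical.

```python
import numpy as np

# Hypothetical recording: a single arrival delayed by tau, with energy only
# around 300 kHz (none near the 40 kHz we want to synthesize).
fs = 1.0e6                      # sample rate [Hz]
tau = 230e-6                    # travel-time delay of the arrival [s]
t = np.arange(0, 2e-3, 1 / fs)

fc, sigma_t = 300e3, 5e-6       # centre frequency and Gaussian pulse width
x = np.exp(-0.5 * ((t - tau) / sigma_t) ** 2) * np.cos(2 * np.pi * fc * (t - tau))

S = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Pick two frequencies inside the recorded band and form their product.
i1 = np.argmin(np.abs(freqs - 280e3))
i2 = np.argmin(np.abs(freqs - 320e3))
df = freqs[i2] - freqs[i1]                  # synthesized low frequency: 40 kHz

autoproduct = S[i2] * np.conj(S[i1])        # frequency-difference product

print(np.angle(autoproduct))                        # ~ -1.26 rad
print(np.angle(np.exp(-2j * np.pi * df * tau)))     # phase of a true 40 kHz arrival with the same delay: ~ -1.26 rad
```

Phase information synthesized this way at frequencies below the recorded band is what can be used to start FWI and reduce the risk of cycle skipping.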
JMIR Res Protoc
January 2025
National Radiotherapy, Oncology and Nuclear Medicine Centre, Korle-bu Teaching Hospital, Accra, Ghana.
Background: Cancer is a leading cause of global mortality, accounting for nearly 10 million deaths in 2020. This is projected to increase by more than 60% by 2040, particularly in low- and middle-income countries. Yet, palliative and psychosocial oncology care is very limited in these countries.
J Neurosurg
January 2025
Department of Neurosurgery, Inselspital, Bern University Hospital, University of Bern, Switzerland.
Objective: The effectiveness and optimal stimulation site of deep brain stimulation (DBS) for central poststroke pain (CPSP) remain elusive. The objective of this retrospective international multicenter study was to assess clinical as well as neuroimaging-based predictors of long-term outcomes after DBS for CPSP.
Methods: The authors analyzed patient-based clinical and neuroimaging data of previously published and unpublished cohorts from 6 international DBS centers.
J Neurosurg Pediatr
January 2025
Departments of Neurosurgery and…
Objective: Intraventricular baclofen (IVB) administration is used for the treatment of secondary dystonia associated with cerebral palsy (CP), but it has not been reported as a first-line infusion technique for spasticity. In this study, the authors report outcomes of patients with mixed or isolated spasticity treated with IVB administration.
Methods: A retrospective analysis was performed of consecutive patients treated with IVB between 2019 and 2023.