Background: Single camera markerless motion capture has the potential to facilitate at-home movement assessment due to its ease of setup, portability, and affordable cost. However, it is not clear what the current healthcare applications of single camera markerless motion capture are, or what information is being collected that may inform clinical decision making. This review aims to map the available literature to highlight potential use cases and identify the limitations of the technology for clinicians and researchers interested in the collection of movement data.
Survey Methodology: Studies published up to 14 January 2022 were collected through a systematic search of PubMed, CINAHL, and SPORTDiscus. Data recorded included the description of the markerless system, clinical outcome measures, and biomechanical data mapped to the International Classification of Functioning, Disability and Health (ICF) framework. Studies were grouped by patient population.
Results: A total of 50 studies were included for data collection. Use cases for single camera markerless motion capture technology were identified for Neurological Injury in Children and Adults; Hereditary/Genetic Neuromuscular Disorders; Frailty; and Orthopaedic or Musculoskeletal groups. Single camera markerless systems were found to perform well in studies involving single plane measurements, such as in the analysis of infant general movements or spatiotemporal parameters of gait, when evaluated against 3D marker-based systems and a variety of clinical outcome measures. However, they were less capable than marker-based systems in studies requiring the tracking of detailed 3D kinematics or fine movements such as finger tracking.
Conclusions: Single camera markerless motion capture offers great potential for extending the scope of movement analysis outside of laboratory settings in a practical way, but currently suffers from a lack of accuracy where detailed 3D kinematics are required for clinical decision making. Future work should therefore focus on improving tracking accuracy of movements that are out of plane relative to the camera orientation or affected by occlusion, such as supination and pronation of the forearm.
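To illustrate the kind of single-plane measurement the review finds these systems handle well, spatiotemporal gait parameters such as stride time and cadence can be derived from heel-strike events detected in single-camera keypoint data. This is a hypothetical sketch, not taken from any reviewed system; the function names and the 30 fps frame rate are illustrative assumptions.

```python
# Hypothetical sketch: spatiotemporal gait parameters from heel-strike
# events detected in single-camera keypoint data. The 30 fps frame rate
# and frame indices below are illustrative assumptions.

def stride_times(heel_strike_frames, fps=30.0):
    """Stride time (s) between successive heel strikes of the same foot."""
    return [(b - a) / fps
            for a, b in zip(heel_strike_frames, heel_strike_frames[1:])]

def cadence_steps_per_min(stride_times_s):
    """Cadence in steps/min, assuming two steps per stride."""
    mean_stride = sum(stride_times_s) / len(stride_times_s)
    return 120.0 / mean_stride

strikes = [10, 42, 73, 105]           # frame indices of right-heel strikes
st = stride_times(strikes)
cadence = cadence_steps_per_min(st)   # ≈ 113.7 steps/min for this example
```

Because these quantities depend only on event timing, not on out-of-plane 3D positions, they are comparatively robust to the single-camera limitations the review identifies.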
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9148557 (PMC)
DOI: http://dx.doi.org/10.7717/peerj.13517
Sensors (Basel)
December 2024
Institute of Computer and Communication Engineering, Department of Electrical Engineering, National Cheng Kung University, Tainan 701, Taiwan.
Precision depth estimation plays a key role in many applications, including 3D scene reconstruction, virtual reality, autonomous driving, and human-computer interaction. Through recent advancements in deep learning, monocular depth estimation, with its simplicity, has surpassed traditional stereo camera systems, bringing new possibilities in 3D sensing. In this paper, we propose an end-to-end supervised monocular depth estimation autoencoder that uses a single camera: its encoder combines a convolutional neural network with vision transformers, and an effective adaptive fusion decoder produces high-precision depth maps.
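The abstract does not give the paper's evaluation details, but monocular depth models are conventionally scored with standard accuracy metrics such as absolute relative error (AbsRel), RMSE, and the δ < 1.25 threshold accuracy. A minimal sketch of those standard metrics, with illustrative depth values:

```python
import math

def depth_metrics(pred, gt):
    """Standard monocular-depth metrics: AbsRel, RMSE, delta < 1.25.

    pred/gt are flat lists of positive depths (illustrative values below,
    not data from the paper)."""
    assert len(pred) == len(gt) and len(gt) > 0
    abs_rel = sum(abs(p - g) / g for p, g in zip(pred, gt)) / len(gt)
    rmse = math.sqrt(sum((p - g) ** 2 for p, g in zip(pred, gt)) / len(gt))
    # fraction of pixels whose ratio to ground truth is within 1.25x
    delta = sum(max(p / g, g / p) < 1.25 for p, g in zip(pred, gt)) / len(gt)
    return abs_rel, rmse, delta

abs_rel, rmse, delta = depth_metrics([1.0, 2.2, 3.1, 8.0],
                                     [1.0, 2.0, 3.0, 4.0])
```

In this toy example the single badly wrong prediction (8.0 m vs 4.0 m) dominates RMSE while only reducing δ < 1.25 from 1.0 to 0.75, which is why papers usually report all three.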
J Esthet Restor Dent
January 2025
Department of Oral Surgery and Stomatology, School of Dental Medicine, University of Bern, Bern, Switzerland.
Objective: To assess the reproducibility and reliability of the pink (PES) and white esthetic scores (WES) using digital images and the intra- and inter-examiner agreement among different clinical backgrounds and assessment methods.
Material And Methods: Standardized intraoral images were obtained from adult subjects with an implant-supported single-tooth fixed dental prosthesis in the maxillary esthetic zone using a digital camera and a true-color intraoral scanner. According to the PES and WES criteria, the images were evaluated by 20 calibrated evaluators: 5 prosthodontists, 5 periodontists, 5 undergraduates, and 5 oral surgeons.
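Inter-examiner agreement on categorical scores such as PES/WES ratings is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. The abstract does not state which agreement statistic was used, so the sketch below is illustrative, with made-up rating data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same cases.

    kappa = (observed agreement - chance agreement) / (1 - chance agreement).
    Rating lists below are illustrative, not study data."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # chance agreement from each rater's marginal category frequencies
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

kappa = cohens_kappa([1, 1, 2, 2, 3], [1, 1, 2, 3, 3])  # ≈ 0.706
```

Reporting kappa per rater group (prosthodontists, periodontists, etc.) is one way to compare agreement across the clinical backgrounds the study examines.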
Lasers Surg Med
January 2025
Wellman Center for Photomedicine, Massachusetts General Hospital, Boston, Massachusetts, USA.
Objectives: This work highlights the methods used to develop a multi-pulse 1726 nm laser system combined with bulk air-cooling for selective sebaceous gland (SG) photothermolysis using thermal imaging and software algorithms. This approach enables treating to a desired tissue temperature and depth to provide a safe, effective, reproducible, and durable treatment of acne.
Methods: We designed and built a 1726 nm laser system with a 40 W maximum power output, a highly controlled air-cooling device, and a thermal camera in the handpiece, which permits real-time temperature monitoring of the epidermis.
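The treat-to-temperature approach described here, a thermal camera feeding a software loop that modulates laser output toward a target epidermal temperature, can be sketched as a simple closed-loop controller. Everything below (the first-order skin model, proportional gain, time constants) is an illustrative assumption, not the authors' algorithm; only the 40 W power cap comes from the abstract.

```python
# Hypothetical sketch of treat-to-temperature control: a proportional loop
# drives laser power from the thermal-camera temperature error. The skin
# model and all gains are illustrative assumptions; only the 40 W maximum
# power is taken from the abstract.

def simulate_treatment(target_c=45.0, ambient_c=32.0, steps=200, dt=0.05):
    temp = ambient_c
    history = []
    for _ in range(steps):
        error = target_c - temp
        power = max(0.0, min(40.0, 5.0 * error))  # proportional drive, 40 W cap
        # first-order thermal model: laser heating vs cooling toward ambient
        temp += dt * (0.5 * power - 0.8 * (temp - ambient_c))
        history.append(temp)
    return history

history = simulate_treatment()  # settles a little below the 45 °C target
```

A pure proportional loop settles below the setpoint (steady-state offset); a real treat-to-temperature system would typically add integral action or pulse-energy scheduling, and would pair heating with the bulk air-cooling the paper describes to protect the epidermis.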
Sci Rep
January 2025
School of Physics and Astronomy, University of Glasgow, Glasgow, United Kingdom.
SPDC photon pairs exhibit spatial correlations which can be measured using detector arrays sensitive to single photons. However, these detector arrays have multiple readout modes, and to optimise detection it is important to select the mode that best reveals the correlations against a background of optical and electronic noise. These quantum correlations enable applications in imaging, sensing, communication, and optical processing.
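Separating genuine pair correlations from noise is conventionally done by coincidence counting: pair simultaneous detections in the two channels, then subtract the "accidental" rate estimated by deliberately mispairing frames. The sketch below is a generic illustration of that technique, not the paper's analysis; the detection-flag data are made up.

```python
# Hypothetical sketch: genuine coincidences above accidental background.
# frames_a/frames_b are per-frame detection flags (1 = click) for two
# detector regions; the data below are illustrative, not from the paper.

def coincidence_excess(frames_a, frames_b):
    """Mean coincidence rate minus accidentals from shifted pairing."""
    n = len(frames_a)
    coinc = sum(a & b for a, b in zip(frames_a, frames_b)) / n
    # accidentals: pair each frame of A with the *next* frame of B, which
    # destroys true (same-frame) correlations but keeps the singles rates
    shifted_b = frames_b[1:] + frames_b[:1]
    acc = sum(a & b for a, b in zip(frames_a, shifted_b)) / n
    return coinc - acc

excess = coincidence_excess([1, 0, 1, 1, 0, 1],
                            [1, 0, 1, 1, 0, 0])  # ≈ 0.17 per frame
```

Comparing this excess across readout modes is one way to pick the mode that maximises correlation signal over optical and electronic noise.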
Cornea
November 2024
Department of Ophthalmology and Visual Sciences, University of Louisville, Louisville, KY.
Purpose: To visualize the behavior of perfluorohexyloctane (PFHO), an eye drop to treat dry eye disease (DED), on the surface of saline in vitro and on the human ocular surface using infrared emissivity.
Methods: Emissivity videos were used to measure the spreading and disappearance rates of PFHO on saline (with and without mucin for spreading rate) and layered over a 125 nm film of meibum on the surface of saline using a TearView camera. Ocular surface emissivity was videoed in a volunteer without DED before and after instillation of 1 drop of PFHO.