Tracking human operators working in the vicinity of collaborative robots can improve the design of safety architectures, ergonomics, and the execution of assembly tasks in human-robot collaboration scenarios. Three commercial spatial computing kits were evaluated together with their Software Development Kits, which provide real-time human pose tracking. The paper explored whether combining the capabilities of different hardware systems and software frameworks can yield better performance and accuracy in detecting the human pose in collaborative robotic applications. The study assessed their performance for two human poses at six depth levels, comparing raw data against noise-reduced filtered data. A laser measurement device served as the ground-truth reference, with the average Root Mean Square Error (RMSE) as the error metric. The results were analysed and compared in terms of positional accuracy and repeatability, showing that sensor performance depends on tracking distance. A Kalman-based filter was then applied to fuse the human skeleton data and reconstruct the operator's poses, weighting each sensor according to its performance in the relevant distance zone. The results indicated that at distances below 3 m, the Microsoft Azure Kinect demonstrated the best tracking performance, followed by the Intel RealSense D455 and the Stereolabs ZED2, while at ranges beyond 3 m, the ZED2 tracked best.
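The zone-dependent fusion described above can be sketched as an inverse-variance (Kalman-style) measurement update, where each sensor's skeleton-joint estimate is weighted by its per-zone positional RMSE. This is a minimal illustration, not the authors' implementation; the RMSE values below are placeholders chosen only to reflect the reported trend (Azure Kinect best under 3 m, ZED2 best beyond), and the average-RMSE helper mirrors the error metric used against the laser ground truth.

```python
import numpy as np

# Placeholder per-sensor positional RMSE (metres) by distance zone.
# These are illustrative values, not measurements from the paper.
RMSE = {
    "azure_kinect":   {"near": 0.02, "far": 0.09},
    "realsense_d455": {"near": 0.03, "far": 0.08},
    "zed2":           {"near": 0.04, "far": 0.05},
}

def zone(depth_m: float) -> str:
    """Select the distance zone; 3 m is the boundary reported in the study."""
    return "near" if depth_m < 3.0 else "far"

def fuse_joint(measurements: dict, depth_m: float) -> np.ndarray:
    """Inverse-variance fusion of one joint's 3D position.

    measurements: sensor name -> (3,) array of the joint position.
    Sensors with lower RMSE in the current zone receive higher weight,
    as in a Kalman measurement update with per-sensor noise covariance.
    """
    z = zone(depth_m)
    weights = {s: 1.0 / RMSE[s][z] ** 2 for s in measurements}
    total = sum(weights.values())
    return sum(weights[s] * measurements[s] for s in measurements) / total

def avg_rmse(estimates, ground_truth) -> float:
    """Average RMSE over joints versus (e.g. laser-derived) ground truth."""
    err = np.linalg.norm(np.asarray(estimates, dtype=float)
                         - np.asarray(ground_truth, dtype=float), axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```

With these weights, the Azure Kinect dominates the fused estimate in the near zone and the ZED2 in the far zone, reproducing the zone-dependent behaviour the study reports.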
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10818797
DOI: http://dx.doi.org/10.3390/s24020578