Evaluation of Hands-On Clinical Exam Performance Using Marker-less Video Tracking.

Proc Hum Factors Ergon Soc Annu Meet

College of Engineering, The University of Wisconsin Madison, 1513 University Ave., Madison, WI 53706.

Published: September 2014

This study investigates the potential of marker-less video tracking of the hands for evaluating hands-on clinical skills. Experienced family practitioners attending a national conference were recruited and asked to conduct a breast examination on a simulator that reproduces different clinical presentations. Videos were recorded of each clinician's hands during the exam, and video processing software tracked hand motion to quantify its kinematics. Practitioner motion patterns indicated consistent behavior of participants across multiple pathologies. Different pathologies exhibited characteristic aggregate motion patterns at specific parts of an exam, indicating consistent inter-participant behavior. Marker-less video kinematic tracking therefore shows promise for discriminating between different examination procedures, clinicians, and pathologies.
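The abstract describes quantifying hand-motion kinematics from tracked video. As a minimal illustrative sketch (not the study's actual software), assuming a marker-less tracker that outputs per-frame (x, y) hand-centroid coordinates, simple kinematic summaries such as path length and speed could be derived like this:

```python
import numpy as np

def hand_kinematics(positions, fps=30.0):
    """Hypothetical sketch: summarize hand kinematics from tracked positions.

    positions: (N, 2) sequence of pixel coordinates, one row per video frame.
    fps: video frame rate (illustrative value, not from the study).
    """
    pos = np.asarray(positions, dtype=float)
    # Per-frame displacement magnitudes (pixels/frame)
    displacements = np.linalg.norm(np.diff(pos, axis=0), axis=1)
    # Convert to instantaneous speed (pixels/second)
    speeds = displacements * fps
    return {
        "path_length_px": float(displacements.sum()),
        "mean_speed_px_s": float(speeds.mean()),
        "peak_speed_px_s": float(speeds.max()),
    }

# Example: a hand moving 3 px per frame along x for 4 frames
track = [(0, 0), (3, 0), (6, 0), (9, 0)]
stats = hand_kinematics(track, fps=30.0)
```

In practice, pixel coordinates would be calibrated to physical units and the speed signal smoothed before comparing motion patterns across clinicians or pathologies.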


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4576738
DOI: http://dx.doi.org/10.1177/1541931214581145

Similar Publications

Article Synopsis
  • Body kinematics is crucial for various fields but current methods using infrared wearables are costly and impractical for natural movements.
  • To address this, algorithms like OpenPose have been created to estimate body movements from regular video, although they are less accurate compared to traditional methods like Vicon.
  • A study found that OpenPose's accuracy varies by participant and movement size, performing well for larger movements but struggling with smaller ones, providing insights into the limitations of video-based motion capture technology.

Assessing the effects of 5-HT and 5-HT receptor antagonists on DOI-induced head-twitch response in male rats using marker-less deep learning algorithms.

Pharmacol Rep

November 2024

Behavioral Neuroscience and Drug Development, Maj Institute of Pharmacology, Polish Academy of Sciences, Smętna 12, Kraków, 31-343, Poland.

Article Synopsis
  • The study investigates a new, marker-less method using deep learning to track head-twitch responses in rodents, traditionally observed by humans.
  • High-speed videos were analyzed with DeepLabCut and SimBA, showing strong agreement with human counts while evaluating the effects of the psychedelic DOI.
  • Results confirmed that certain 5-HT receptor antagonists can reduce head-twitch responses, supporting the idea that this behavior is specific to serotonergic activity, and demonstrated the effectiveness of the automated tracking tools for research purposes.

The widespread implementation of lung cancer screening and thin-slice computed tomography (CT) has led to the more frequent detection of small nodules, which are commonly referred to thoracic surgeons. Surgical resection is the final diagnostic and treatment option for such nodules; however, surgeons must perform preoperative or intraoperative marking to identify these nodules and resect them precisely. Historically, hook-wire marking has been the most frequently performed technique worldwide; however, lethal complications, such as air embolism, have been reported.


Objective: This systematic review investigates Augmented Reality (AR) systems used in minimally invasive surgery of deformable organs, focusing on initial registration, dynamic tracking, and visualization. The objective is to acquire a comprehensive understanding of the current knowledge, applications, and challenges associated with AR techniques, aiming to leverage these insights for developing a dedicated AR pulmonary Video- or Robotic-Assisted Thoracic Surgery (VATS/RATS) workflow.

Methods: A systematic search was conducted within Embase, Medline (Ovid), and Web of Science on April 16, 2024, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.


Accurate and fast extraction of step parameters from video recordings of gait allows richer information to be obtained from clinical tests such as the Timed Up and Go. Current deep-learning methods are promising but lack the accuracy required for many clinical use cases. Extracting step parameters will often depend on landmarks (keypoints) detected on the feet.
