Aims & Objectives: The primary aim of this paper is to determine whether smart glasses or head-mounted displays improve efficiency in a procedural or theatre setting without compromising the quality of the procedure performed. Additionally, this paper aims to qualitatively explore applications in surgical education, on-call work, consultations, and patient observation.
Design: This paper is a systematic review of the literature available on the topic of smart glasses or head-mounted displays in surgical or procedural settings.
Methods: A search of PubMed, Cochrane and the Wiley Online Library was performed in accordance with the PRISMA guidelines. Procedural times and adverse outcomes were compared between the smart glass and non-smart glass groups in each of the quantitative studies. A literature review of studies, including those not satisfying the primary aim, was conducted and is included in this paper.
Results: Thirty-two studies were identified that complied with the inclusion criteria of this paper. Eight of these studies focused on procedural times and adverse outcomes, with and without smart glass usage. Procedural time was reduced when smart glass technology was used, without an increase in adverse patient outcomes.
Conclusions: Surgeons should consider whether the relatively modest reduction in procedural time is worth the high cost, privacy concerns, battery-life limitations and user discomfort associated with these devices. There are promising applications of this technology in the areas of surgical education and consultation. However, more trials are necessary to assess the value of using smart glasses in these settings.
DOI: 10.1177/15533506241265274
Transl Vis Sci Technol
January 2025
Amsterdam UMC, Vrije Universiteit Amsterdam, Ophthalmology, Amsterdam, The Netherlands.
Purpose: This study assessed objective performance, usability, and acceptance of artificial intelligence (AI) by people with vision impairment. The goal was to provide evidence-based data to enhance technology selection for people with vision loss (PVL) based on their specific vision loss and needs.
Methods: Using a cross-sectional, counterbalanced, cross-over study involving 25 PVL, we compared performance using two smart glasses (OrCam and Envision Glasses) and two AI apps (Seeing AI and Google Lookout).
Sensors (Basel)
December 2024
The State Key Laboratory of Fluid Power and Mechatronic Systems, School of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China.
In physical spaces, pointing interactions cannot rely on cursors, rays, or virtual hands for feedback as in virtual environments; users must rely solely on their perception and experience to capture targets. Currently, research on modeling target distribution for pointing interactions in physical space is relatively sparse. Area division is typically simplistic, and theoretical models are lacking.
Imaging Sci Dent
December 2024
OMFS IMPATH Research Group, Department of Imaging and Pathology, Faculty of Medicine, KU Leuven, Leuven, Belgium.
BMC Anesthesiol
November 2024
Department of Anesthesiology, West China Hospital of Sichuan University, Chengdu, 610041, Sichuan, P. R. China.
J Med Internet Res
November 2024
College of Nursing, University of Colorado, Aurora, CO, United States.
Background: Smart glasses have emerged as a promising solution for enhancing communication and care coordination among distributed medical teams. While prior research has explored the feasibility of using smart glasses to improve prehospital communication between emergency medical service (EMS) providers and remote physicians, a research gap remains in understanding the specific requirements and needs of EMS providers for smart glass implementation.
Objective: This study aims to iteratively design and evaluate a smart glass application tailored for prehospital communication by actively involving prospective users in the system design process.