A proof-of-concept augmented reality system in oral and maxillofacial surgery.

J Stomatol Oral Maxillofac Surg

EnCoV, Institut Pascal, UMR 6602, CNRS/UBP/SIGMA, 63000 Clermont-Ferrand, France.

Published: September 2021

Background: The advent of digital medical imaging, medical image analysis and computer vision has broadened surgeons' horizons by making it possible to add virtual information to the real operative field. For oral and maxillofacial surgeons, overlaying anatomical structures to protect (such as teeth, sinus floors, and inferior and superior alveolar nerves) or to remove (such as cysts, tumours, and impacted teeth) is of real clinical interest.

Material And Methods: In this work, we propose a proof-of-concept markerless augmented reality system for oral and maxillofacial surgery, in which a virtual scene is generated preoperatively and mixed with reality to reveal the location of hidden anatomical structures intraoperatively. We devised computer software to process still video frames of the operative field and display them on the operating room screens.
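As a minimal sketch of the kind of frame mixing described above, the snippet below blends a pre-rendered image of the hidden structures into a still operative frame using OpenCV. The file names, blending weights and masking rule are illustrative assumptions, not the authors' implementation; the two images are assumed to have the same resolution.

import cv2
import numpy as np

# Hypothetical inputs: a still frame from the operating room camera and a
# pre-rendered overlay whose non-black pixels mark the virtual anatomy.
frame = cv2.imread("operative_frame.png")
overlay = cv2.imread("virtual_structures.png")

# Blend only where the overlay actually carries virtual content.
mask = overlay.sum(axis=2) > 0
blended = frame.copy()
blended[mask] = cv2.addWeighted(frame, 0.5, overlay, 0.5, 0.0)[mask]

# The augmented frame could then be sent to the operating room screen.
cv2.imwrite("augmented_frame.png", blended)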

Results: Firstly, we describe the proposed system, in which virtuality is aligned with reality without artificial markers; analysis of the dental occlusion plane and detection of the tooth cusps allow us to initialise the alignment process. Secondly, we validate feasibility experimentally on a 3D-printed jaw phantom and an ex-vivo pig jaw. Thirdly, we evaluate the potential clinical benefit on a patient.
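To illustrate how detected cusp points could initialise such an alignment, the sketch below estimates a rigid transform between corresponding cusp points from the preoperative model and the intraoperative view using the Kabsch (SVD) method. The function name, input format and the assumption of known point correspondences are ours, not the paper's algorithm.

import numpy as np

def rigid_transform_from_cusps(cusps_model, cusps_camera):
    """Estimate a rigid transform (R, t) mapping preoperative cusp points
    onto cusp points detected intraoperatively (Kabsch / SVD method)."""
    src = np.asarray(cusps_model, dtype=float)    # N x 3 points from the CT model
    dst = np.asarray(cusps_camera, dtype=float)   # N x 3 points detected in the view
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)           # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

The resulting (R, t) would only serve as an initial guess; a refinement step (e.g. iterative closest point) would typically follow.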

Conclusion: This proof of concept highlights the feasibility and value of augmented reality for visualising hidden anatomical structures without artificial markers.


Source: http://dx.doi.org/10.1016/j.jormas.2021.05.012

Publication Analysis

Top Keywords

augmented reality (12), oral maxillofacial (12), anatomical structures (12), reality system (8), system oral (8), maxillofacial surgery (8), hidden anatomical (8), artificial markers (8), reality (5), proof-of-concept augmented (4)

Similar Publications

Scalable InGaN nanowire µ-LEDs: paving the way for next-generation display technology.

Natl Sci Rev

January 2025

Division of Advanced Materials Engineering, College of Engineering, Research Center for Advanced Materials Development (RCAMD), Jeonbuk National University (JBNU), Jeonju 54896, South Korea.

Ever-increasing demand for efficient optoelectronic devices with small-footprint on-chip light-emitting diodes has driven their expansion in self-emissive displays, from micro-electronic displays to large video walls. InGaN nanowires, with features like high electron mobility, tunable emission wavelengths, durability under high current densities, compact size, self-emission, long lifespan, low power consumption, fast response, and impressive brightness, are emerging as the choice for micro-light-emitting diodes (µLEDs). However, challenges persist in achieving high crystal quality and lattice-matched heterostructures due to composition tuning and bandgap issues on substrates with differing crystal structures and high lattice mismatches.


Multidimensional 3D-rendered objects are an important component of vision research and video-gaming applications, but it has remained challenging to parametrically control and efficiently generate those objects. Here, we describe a toolbox for controlling and efficiently generating 3D-rendered objects composed of ten separate visual feature dimensions that can be fine-adjusted using Python scripts. The toolbox defines objects as multi-dimensional feature vectors with primary dimensions (object body related features), secondary dimensions (head related features) and accessory dimensions (including arms, ears, or beaks).
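As a hypothetical sketch of the kind of multi-dimensional object description this abstract refers to, the snippet below groups features into primary, secondary and accessory dimensions; the field names and flattening rule are assumptions, not the toolbox's actual interface.

from dataclasses import dataclass, field

@dataclass
class RenderedObject:
    # primary dimensions: object body related features (assumed names)
    body_shape: float = 0.0
    body_curvature: float = 0.0
    # secondary dimensions: head related features (assumed names)
    head_size: float = 0.0
    head_tilt: float = 0.0
    # accessory dimensions: arms, ears, or beaks
    accessories: dict = field(default_factory=dict)

    def as_vector(self):
        """Flatten the object into a feature vector for parametric control."""
        return [self.body_shape, self.body_curvature,
                self.head_size, self.head_tilt, *self.accessories.values()]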


Background: Reminiscence therapy through music is a psychosocial intervention with benefits for older patients with neurocognitive disorders. Therapies using virtual or augmented reality are effective for ecologically assessing, and eventually training, episodic memory in older populations. We designed a semi-immersive musical game called "A Life in Songs," which invites patients to immerse themselves in a past era through visuals and songs from that time period.


Extended reality as a modality to train non-technical skills in healthcare: A scoping review.

Appl Ergon

January 2025

Department of Industrial Engineering, Clemson University, Clemson, SC, USA. Electronic address:

The need to train non-technical skills (NTS) has seen a growing emphasis in recent literature, as they have been associated with improved patient outcomes. NTS training often utilizes live simulations where healthcare workers can practice these skills, but simulations like this can be expensive and resource intensive to run. Training technical skills using extended reality tools (e.


Toward structured abdominal examination training using augmented reality.

Int J Comput Assist Radiol Surg

January 2025

Faculty of Computer Science and Research Campus STIMULATE, Otto-von-Guericke University of Magdeburg, Magdeburg, Germany.

Purpose: Structured abdominal examination is an essential part of the medical curriculum and surgical training, requiring a blend of theory and practice from trainees. Current training methods, however, often do not provide adequate engagement, fail to address individual learning needs or do not cover rare diseases.

Methods: In this work, an application for structured Abdominal Examination Training using Augmented Reality (AETAR) is presented.

