A novel computer-assisted surgery (CAS) technique that merges dynamic and static CAS approaches to treat completely edentulous patients with dental implants is described. Radiographic and surgical stents are designed with specific fiducial markers that are recognized by the static and dynamic CAS software program. During the surgical procedure, implants are placed following the static surgical guide and the indications from the dynamic navigation system. This technique combines the advantages of static and dynamic CAS approaches to allow accurate and predictable minimally invasive implant placement.
DOI: http://dx.doi.org/10.1016/j.prosdent.2021.02.022
Sensors (Basel)
December 2024
Guangdong Provincial Key Laboratory of Optical Fiber Sensing and Communications, Institute of Photonics Technology, Jinan University, Guangzhou 510630, China.
Real-time online monitoring of track deformation during railway construction is crucial for ensuring the safe operation of trains. However, existing monitoring technologies struggle to effectively monitor both static and dynamic events, often resulting in high false alarm rates. This paper presents a monitoring technology for track deformation during railway construction based on dynamic Brillouin optical time-domain reflectometry (Dy-BOTDR), which effectively meets the requirements for monitoring both static and dynamic track-deformation events.
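To make the static/dynamic distinction concrete, the sketch below shows one generic way to separate a slow baseline shift (static deformation) from high-frequency fluctuation (a dynamic event) in distributed strain traces. It is not the paper's Dy-BOTDR processing chain; the array shape, sampling rate, and thresholds are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): separating slow "static" deformation from
# fast "dynamic" events in distributed strain traces, e.g. from a Dy-BOTDR interrogator.
# Array shape, thresholds, and sampling rate are illustrative assumptions.
import numpy as np

def classify_deformation(strain, static_thresh=50.0, dynamic_thresh=20.0):
    """strain: (time, position) microstrain traces over one monitoring window."""
    # Static component: slow baseline shift, estimated as the mean over the window.
    static_level = strain.mean(axis=0)                   # per-position baseline shift
    # Dynamic component: fluctuation about that baseline.
    dynamic_level = (strain - static_level).std(axis=0)  # per-position RMS variation
    return {
        "static_alarm_positions": np.where(np.abs(static_level) > static_thresh)[0],
        "dynamic_alarm_positions": np.where(dynamic_level > dynamic_thresh)[0],
    }

# Synthetic example: 10 s of traces (100 Hz) over 500 fiber positions.
rng = np.random.default_rng(0)
traces = rng.normal(0.0, 2.0, size=(1000, 500))
traces[:, 120] += 80.0                                   # persistent offset -> "static" event
traces[:, 300] += 40.0 * np.sin(2 * np.pi * 15 * np.arange(1000) / 100.0)  # vibration -> "dynamic"
print(classify_deformation(traces))
```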
Sensors (Basel)
December 2024
Department of Mechanical Engineering, Brigham Young University, Provo, UT 84602, USA.
Flexible high-deflection strain gauges have been demonstrated to be cost-effective and accessible sensors for capturing human biomechanical deformations. However, the interpretation of these sensors is notably more complex compared to conventional strain gauges, particularly during dynamic motion. Beyond the non-linear viscoelastic behavior of the strain gauge material itself, the sensors' dynamic response is further complicated by spikes in resistance during strain path changes.
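As a rough illustration of the kind of pre-processing such spikes call for, the snippet below flags resistance jumps that coincide with strain-path reversals. The spike factor, proximity window, and synthetic signals are assumptions for illustration, not values or methods from the article.

```python
# Illustrative sketch only: flag resistance spikes that coincide with strain-path
# reversals, one way to pre-process high-deflection strain-gauge data before modeling.
import numpy as np

def flag_reversal_spikes(strain, resistance, spike_factor=3.0, window=3):
    """Return indices where resistance jumps abnormally near a change in strain direction."""
    d_strain = np.diff(strain)
    # A reversal is where the sign of the strain increment flips.
    reversals = np.where(np.sign(d_strain[:-1]) != np.sign(d_strain[1:]))[0] + 1
    d_res = np.abs(np.diff(resistance))
    typical = np.median(d_res) + 1e-12                   # robust scale of "normal" changes
    spikes = np.where(d_res > spike_factor * typical)[0]
    # Keep only spikes within a few samples of a reversal.
    return np.array([s for s in spikes if np.any(np.abs(reversals - s) <= window)])

# Synthetic triangle-wave strain with a resistance spike injected at the first reversal.
t = np.linspace(0, 4, 401)
strain = np.abs(((t + 1) % 2) - 1)        # triangle wave; direction reverses at peaks/valleys
resistance = 100.0 + 5.0 * strain
resistance[100] += 3.0                    # artificial spike at a strain-path change
print(flag_reversal_spikes(strain, resistance))  # flags the injected spike
```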
Sensors (Basel)
December 2024
Federal Highway Research Institute, 51427 Bergisch Gladbach, Germany.
Weigh-in-motion (WIM) systems aim to estimate a vehicle's weight by measuring static wheel loads as it passes at highway speed over roadway-embedded sensors. Vehicle oscillations and the resulting dynamic load components are critical factors affecting measurements and limiting accuracy. As of now, a satisfactory solution has yet to be found.
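For background, a simplified WIM idea is that averaging the instantaneous loads sampled by several longitudinally spaced sensors lets the oscillating (dynamic) component partially cancel, leaving an estimate of the static wheel load. The sketch below demonstrates this with made-up sensor readings and does not represent the article's method.

```python
# Hedged sketch: average instantaneous wheel-load samples from successive in-road sensors
# so the vehicle's oscillating (dynamic) load component partially cancels.
# Sensor count, static load, and oscillation amplitude are illustrative assumptions.
import numpy as np

def estimate_static_load(samples):
    """samples: instantaneous load readings (kN) from successive in-road sensors."""
    return float(np.mean(samples))  # dynamic component averages toward zero over many sensors

# Synthetic example: 30 kN static load plus a 15% sinusoidal dynamic component,
# sampled at different oscillation phases by 8 sensors.
static = 30.0
phases = np.linspace(0, 2 * np.pi, 8, endpoint=False)
readings = static * (1.0 + 0.15 * np.sin(phases))
print(round(estimate_static_load(readings), 2))  # ~30.0 kN
```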
Sensors (Basel)
December 2024
Department of Electronic & Computer Engineering, University of Limerick, V94 T9PX Limerick, Ireland.
Current deep learning-based phase unwrapping techniques for iToF Lidar sensors focus mainly on static indoor scenarios, ignoring motion blur in dynamic outdoor scenarios. Our paper proposes a two-stage semi-supervised method to unwrap ambiguous depth maps affected by motion blur in dynamic outdoor scenes. The method trains on static datasets to learn unwrapped depth map prediction and then adapts to dynamic datasets using continuous learning methods.
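For context, indirect ToF phase wrapping works as sketched below: the sensor reports a phase that repeats every ambiguity range R = c / (2 f_mod), and unwrapping amounts to choosing the integer wrap count n. The modulation frequency and the idea of a separately predicted wrap index are assumptions used only to illustrate the problem the learned model addresses, not the paper's network.

```python
# Background sketch (not the paper's method): iToF sensors measure a phase that wraps
# every ambiguity range R = c / (2 * f_mod), so the true depth is
#   d = (phase / 2*pi) * R + n * R   for an integer wrap count n.
# Phase unwrapping means choosing n; f_mod below is an assumed value.
import numpy as np

C = 299_792_458.0          # speed of light, m/s
F_MOD = 100e6              # assumed modulation frequency, Hz
R = C / (2 * F_MOD)        # ambiguity range, ~1.5 m at 100 MHz

def unwrap_depth(wrapped_phase, wrap_index):
    """wrapped_phase in [0, 2*pi); wrap_index n is what a learned model would predict."""
    return (wrapped_phase / (2 * np.pi)) * R + wrap_index * R

print(unwrap_depth(np.pi, 2))  # phase pi with n = 2 -> roughly 3.75 m
```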
Sensors (Basel)
December 2024
Department of Computer Engineering, Gachon University, Sujeong-gu, Seongnam-si 13120, Republic of Korea.
Generating accurate and contextually rich captions for images and videos is essential for various applications, from assistive technology to content recommendation. However, challenges such as maintaining temporal coherence in videos, reducing noise in large-scale datasets, and enabling real-time captioning remain significant. We introduce MIRA-CAP (Memory-Integrated Retrieval-Augmented Captioning), a novel framework designed to address these issues through three core innovations: a cross-modal memory bank, adaptive dataset pruning, and a streaming decoder.
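As a conceptual illustration of the retrieval step such a memory bank implies, the sketch below scores stored entries against a query embedding by cosine similarity and returns the top matches. The embedding dimension, bank contents, and function names are hypothetical and not drawn from MIRA-CAP itself.

```python
# Conceptual sketch only: a generic retrieval step over a memory bank, of the kind a
# retrieval-augmented captioner might use. Embedding size and entries are made up.
import numpy as np

def retrieve(query, memory_keys, memory_values, k=2):
    """Return the k memory values whose keys are most similar (cosine) to the query."""
    q = query / np.linalg.norm(query)
    keys = memory_keys / np.linalg.norm(memory_keys, axis=1, keepdims=True)
    scores = keys @ q
    top = np.argsort(scores)[::-1][:k]
    return [memory_values[i] for i in top], scores[top]

rng = np.random.default_rng(1)
bank_keys = rng.normal(size=(5, 16))                       # stored embeddings
bank_vals = [f"memory caption {i}" for i in range(5)]      # associated text entries
print(retrieve(rng.normal(size=16), bank_keys, bank_vals))
```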