Publications by authors named "Andrei State"

In this paper, we present our novel design for switchable AR/VR near-eye displays that can help resolve the vergence-accommodation conflict. The principal idea is to time-multiplex virtual imagery and real-world imagery and to use a tunable lens to adjust focus separately for the virtual display and the see-through scene. With this design, prescription eyeglasses for near- and far-sighted users become unnecessary.
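
The sketch below is only an illustration of the time-multiplexing idea described above, not the paper's implementation: each frame is split into a virtual phase and a see-through phase, and a tunable lens is re-driven between phases so the virtual image and the real scene are focused independently. The `TunableLens`-style and `Shutter`-style driver objects, the refresh rate, and the diopter values are all assumptions made for the example.

```python
# Illustrative sketch of time-multiplexed AR/VR focus control (assumed interfaces).
import time

FRAME_HZ = 120                      # assumed refresh rate; each frame is split into two phases
VIRTUAL_DIOPTERS = 2.0              # focus the virtual image at 0.5 m (example value)
SEE_THROUGH_DIOPTERS = 0.0          # leave the real world at optical infinity
USER_PRESCRIPTION_DIOPTERS = -1.5   # example refractive error folded into the lens drive

def run_display(lens, shutter, renderer):
    """lens, shutter, renderer are hypothetical driver objects for the tunable
    lens, the occlusion shutter, and the virtual-image renderer."""
    phase_s = 1.0 / (2 * FRAME_HZ)
    while True:
        # Phase 1: block the real world and show the virtual image at its own focus.
        shutter.block_real_world()
        lens.set_power(VIRTUAL_DIOPTERS + USER_PRESCRIPTION_DIOPTERS)
        renderer.present_virtual_frame()
        time.sleep(phase_s)

        # Phase 2: pass the real scene through, corrected only for the user's eyesight.
        shutter.pass_real_world()
        lens.set_power(SEE_THROUGH_DIOPTERS + USER_PRESCRIPTION_DIOPTERS)
        renderer.present_blank_frame()
        time.sleep(phase_s)
```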


This paper presents the implementation and evaluation of a 50,000-pose-sample-per-second, 6-degree-of-freedom optical head-tracking instrument with a motion-to-pose latency of 28 μs and a dynamic precision of 1-2 arcminutes. The instrument uses high-intensity infrared emitters and two duo-lateral photodiode-based optical sensors to triangulate pose. It serves two purposes: it is the first step toward the head-tracking component required for sub-100 μs motion-to-photon latency in optical see-through augmented reality (OST AR) head-mounted display (HMD) systems, and it enables new avenues of research into human visual perception, including measuring the thresholds for perceptible real-virtual displacement during head rotation and other human-subject research requiring high-sample-rate motion tracking.
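
As a loose analogue of the triangulation step such an instrument performs, the sketch below intersects two bearing rays from sensors at known positions in a least-squares sense; it is a generic geometric routine, not the instrument's firmware, and the sensor layout is made up for illustration.

```python
# Generic least-squares intersection of two bearing rays (illustration only).
import numpy as np

def triangulate(p1, d1, p2, d2):
    """p1, p2: sensor positions; d1, d2: unit bearing vectors toward the emitter."""
    # Minimize the summed squared distance to both rays; distance to a ray is
    # captured by the projector (I - d d^T), giving linear normal equations.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in ((p1, d1), (p2, d2)):
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Example: emitter at (0.1, 0.2, 1.0), two sensors 20 cm apart on a baseline.
emitter = np.array([0.1, 0.2, 1.0])
s1, s2 = np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])
d1 = (emitter - s1) / np.linalg.norm(emitter - s1)
d2 = (emitter - s2) / np.linalg.norm(emitter - s2)
print(triangulate(s1, d1, s2, d2))   # ~ [0.1, 0.2, 1.0]
```

At 50,000 pose samples per second the inter-sample interval is 20 μs, so a 28 μs motion-to-pose latency corresponds to roughly one and a half sample periods.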


We propose a new approach for 3D reconstruction of dynamic indoor and outdoor scenes in everyday environments, leveraging only cameras worn by a user. This approach allows 3D reconstruction of experiences at any location and virtual tours from anywhere. The key innovation of the proposed egocentric reconstruction system is to capture the wearer's body pose and facial expression from near-body views.


This paper introduces a computer-based system designed to record a surgical procedure with multiple depth cameras and reconstruct in three dimensions the dynamic geometry of the actions and events that occur during the procedure. The resulting 3D-plus-time data take the form of dynamic, textured geometry and can be examined immersively at a later time: equipped with a Virtual Reality headset such as the Oculus Rift DK2, a user can walk around the reconstruction of the procedure room while controlling playback of the recorded surgical procedure with simple VCR-like controls (play, pause, rewind, fast forward). The reconstruction can be annotated in space and time to provide users with additional information about the scene.
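
The following sketch illustrates, under assumptions, the kind of VCR-style playback logic the description implies: time-ordered reconstructed frames stepped through at a signed, controllable rate, with annotations looked up per frame. Class and method names are hypothetical and do not reflect the system's actual code.

```python
# Hypothetical VCR-style playback controller over time-stamped reconstructed frames.
from dataclasses import dataclass, field

@dataclass
class Recording:
    frames: list                                      # time-ordered reconstructed geometry frames
    annotations: dict = field(default_factory=dict)   # frame index -> annotation text

class Playback:
    def __init__(self, recording):
        self.rec = recording
        self.index = 0
        self.rate = 1          # frames advanced per tick; 0 = paused, negative = rewind

    def play(self):
        self.rate = 1

    def pause(self):
        self.rate = 0

    def fast_forward(self):
        self.rate = 4

    def rewind(self):
        self.rate = -4

    def tick(self):
        """Advance playback by one tick and return the frame (and any note) to render."""
        self.index = max(0, min(len(self.rec.frames) - 1, self.index + self.rate))
        return self.rec.frames[self.index], self.rec.annotations.get(self.index)
```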


We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head-tracked rig through which users view synthetic imagery superimposed on their real environment.
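
One generic way to turn a fast stream of binary frames into perceived gray levels is temporal sigma-delta (pulse-density) modulation; the sketch below shows that idea only as an analogue, since the paper's own modulation technique is not reproduced here.

```python
# Generic temporal sigma-delta modulation: binary subframes whose average approximates gray.
import numpy as np

def binary_subframes(gray, n_subframes):
    """gray: 2-D array in [0, 1]; returns n_subframes binary frames whose
    temporal average approximates the gray image."""
    err = np.zeros_like(gray)
    frames = []
    for _ in range(n_subframes):
        acc = gray + err
        frame = (acc >= 0.5).astype(np.uint8)   # fire the mirror if enough brightness is owed
        err = acc - frame                        # carry the remainder into the next subframe
        frames.append(frame)
    return frames

img = np.full((4, 4), 0.3)                       # 30% gray test patch
frames = binary_subframes(img, 16)
print(np.mean(frames, axis=0))                   # ~ 0.3125 everywhere
```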


Purpose: To develop a consistent and reproducible method in an animal model for studies of radiofrequency (RF) ablation of primary hepatocellular carcinoma (HCC).

Materials and Methods: Fifteen woodchucks were inoculated with woodchuck hepatitis virus (WHV) to establish chronic infections. When serum γ-glutamyl transpeptidase levels became elevated, the animals were evaluated with ultrasound and, in most cases, preoperative magnetic resonance (MR) imaging to confirm tumor development.


Two-dimensional (2D) videoconferencing has been explored widely over the past 15-20 years to support collaboration in healthcare. Two issues that arise in most evaluations of 2D videoconferencing in telemedicine are the difficulty of obtaining optimal camera views and poor depth perception. To address these problems, we are exploring the use of a small array of cameras to reconstruct dynamic three-dimensional (3D) views of a remote environment and of the events taking place within it.


Radiofrequency ablation is a minimally invasive intervention that introduces high-frequency electrical current into non-resectable hepatic tumors under 2D ultrasound guidance, via a needle-like probe. These tumors recur mostly on the periphery, indicating errors in probe placement. Hypothesizing that a contextually correct 3D display will aid targeting and decrease recurrence, we have developed a prototype guidance system based on a head-tracked 3D display and motion-tracked instruments.
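
A guidance display of this kind has to re-express tracked instrument poses in the viewer's head (display) coordinate frame before rendering; the sketch below shows that coordinate bookkeeping with 4x4 rigid transforms. The transform names and example poses are assumptions for illustration, not the prototype's actual API.

```python
# Illustrative coordinate-frame bookkeeping for a head-tracked guidance display.
import numpy as np

def make_pose(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_in_display_frame(T_tracker_head, T_tracker_instrument):
    """Both inputs map head/instrument coordinates into the tracker frame;
    the display needs instrument coordinates expressed relative to the head."""
    return np.linalg.inv(T_tracker_head) @ T_tracker_instrument

# Example: needle 30 cm in front of a head that is offset 1 m from the tracker origin.
T_tracker_head = make_pose(np.eye(3), np.array([0.0, 0.0, 1.0]))
T_tracker_needle = make_pose(np.eye(3), np.array([0.0, 0.0, 1.3]))
print(instrument_in_display_frame(T_tracker_head, T_tracker_needle)[:3, 3])  # ~ [0, 0, 0.3]
```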


This paper shows a number of stereoscopic images depicting the UNC augmented reality guidance system for medical visualization in operation.


We report the results of a randomized, controlled trial comparing the accuracy of standard ultrasound-guided needle biopsy with that of biopsies performed using a 3D Augmented Reality (AR) guidance system. A board-certified radiologist conducted 50 core biopsies of breast phantoms, with biopsies randomly assigned to one of the two methods in blocks of five biopsies each. The raw ultrasound data from each biopsy were recorded.
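
The snippet below sketches one plausible reading of the block-randomized assignment ("blocks of five biopsies each"): each block of five consecutive biopsies is performed with a single, randomly chosen guidance method, balanced across the trial. The study's exact randomization procedure is not given here, so treat this purely as an illustration.

```python
# Illustrative block-randomized assignment of 50 biopsies to two guidance methods.
import random

def block_assignments(n_biopsies=50, block_size=5, seed=None):
    rng = random.Random(seed)
    methods = ["ultrasound", "AR"]
    n_blocks = n_biopsies // block_size          # assumes an even number of blocks
    # Balance the two methods across blocks, then shuffle the block order.
    blocks = methods * (n_blocks // 2)
    rng.shuffle(blocks)
    return [method for method in blocks for _ in range(block_size)]

print(block_assignments(seed=1))   # 50 labels, 25 per method, in blocks of five
```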
