Publications by authors named "Bonkon Koo"

Background: Angiography-derived fractional flow reserve (virtual FFR) has shown excellent diagnostic performance compared with wire-based FFR. However, virtual FFR pullback curves have not been validated yet.

Objectives: To validate the accuracy of virtual FFR pullback curves compared to wire-based FFR pullbacks and to assess their clinical utility using patient-reported outcomes.


Eye trackers play a crucial role in the development of future display systems, such as head-mounted displays and augmented reality glasses. However, ensuring robustness and accuracy in gaze estimation poses challenges, particularly given the limited space available for the transmitter and receiver components within these devices. To address these issues, we propose what we believe is a novel eye tracker design mounted on foldable temples, which not only supports accurate gaze estimation but also provides a slim form factor and unobstructed vision.


Objective: Artificial manipulation of animal movement could offer interesting advantages and potential applications by exploiting the animal's inherent superior sensation and mobility. Although several behavior-control models have been introduced, they are generally virtual reward-based training models. In such a model, rats are trained repeatedly so that they can recall the relationship between cues and rewards.


Although several studies have been performed to detect cancer using canine olfaction, none have investigated whether dogs trained on the specific odor of one cancer can detect the odor of other, unfamiliar cancers. To resolve this issue, we employed breast and colorectal cancer cells in vitro and investigated whether dogs trained on the odor of metabolic waste from breast cancer could detect it from colorectal cancer, and vice versa. The culture fluid samples used to cultivate the cancer cell lines (4T1 and CT26) served as the experimental group.


A correction to this article has been published and is linked from the HTML version of this paper. The error has been fixed in the paper.


Here, we report the development of a brain-to-brain interface (BBI) system that enables a human user to manipulate rat movement without any prior training. In our model, the remotely guided rats (known as ratbots) successfully navigated a T-maze via contralateral turning behaviour induced by electrical stimulation of the nigrostriatal (NS) pathway, driven by a brain-computer interface (BCI) based on the human controller's steady-state visually evoked potentials (SSVEPs). The system allowed human participants to manipulate rat movement with an average success rate of 82.


In this paper we present an immersive brain-computer interface (BCI) in which we use a virtual reality head-mounted display (VRHMD) to evoke SSVEP responses. Compared to visual stimuli on a monitor display, we demonstrate that visual stimuli in the VRHMD indeed improve user engagement for BCI. To this end, we validate our method with experiments on a VR maze game, the goal of which is to guide a ball to a destination in a 2D grid map embedded in 3D space, successively choosing one of four neighboring cells using SSVEPs evoked by visual stimuli on those cells.
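The four-way cell selection described above can be sketched as an SSVEP frequency classifier: each candidate cell flickers at a distinct frequency, and the decoder picks the frequency with the strongest power in the EEG spectrum. The sampling rate, flicker frequencies, and band width below are illustrative assumptions, not values reported in the paper:

```python
import numpy as np

def classify_ssvep(signal, fs, stim_freqs):
    """Return the index of the stimulus frequency with the highest
    spectral power in a 1-D EEG trace (e.g., an occipital channel)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in stim_freqs:
        # Sum power in a narrow band around each candidate frequency
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))

# Synthetic check: a 10 Hz oscillation plus noise should match
# the 10 Hz target (index 1 among four hypothetical frequencies)
np.random.seed(0)
fs = 250
t = np.arange(0, 4, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(len(t))
print(classify_ssvep(eeg, fs, [7.5, 10.0, 12.0, 15.0]))  # → 1
```

Real SSVEP decoders typically use more robust methods such as canonical correlation analysis across multiple channels, but the band-power comparison above captures the core idea of frequency-tagged selection.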


Background: For a self-paced motor imagery based brain-computer interface (BCI), the system should be able to recognize both the occurrence of motor imagery and its type. However, because detecting the occurrence of motor imagery is difficult, most motor imagery based BCI studies have focused on the cued motor imagery paradigm.

New Method: In this paper, we present a novel hybrid BCI system that uses near-infrared spectroscopy (NIRS) and electroencephalography (EEG) together to achieve an online self-paced motor imagery based BCI.
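A hybrid design like the one described can be sketched as a two-stage decision: a slow hemodynamic (NIRS) signal gates whether any imagery is occurring, and EEG band-power asymmetry classifies its type. The threshold value, channel names, and the specific features below are hypothetical illustrations, not the paper's actual pipeline:

```python
def hybrid_decision(nirs_hbo, c3_power, c4_power, nirs_thresh=0.5):
    """Two-stage self-paced decision (illustrative).

    nirs_hbo:  oxy-hemoglobin change over motor cortex (occurrence gate)
    c3_power:  mu-band EEG power over the left hemisphere (C3)
    c4_power:  mu-band EEG power over the right hemisphere (C4)
    """
    if nirs_hbo < nirs_thresh:
        return "idle"  # no imagery detected -> emit no command
    # Mu-band event-related desynchronization is contralateral:
    # left-hand imagery suppresses power over the right hemisphere
    # (C4), and vice versa.
    return "left" if c4_power < c3_power else "right"

print(hybrid_decision(0.1, 4.0, 2.0))  # idle: weak hemodynamic response
print(hybrid_decision(0.9, 4.0, 2.0))  # left: C4 power suppressed
```

The gating stage is what makes the system self-paced: commands are only issued when the NIRS evidence says imagery is actually happening, rather than at experimenter-defined cue times.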


We present a novel human-machine interface, called GOM-Face, and its application to humanoid robot control. The GOM-Face bases its interfacing on three electric potentials measured on the face: 1) the glossokinetic potential (GKP), which reflects tongue movement; 2) the electrooculogram (EOG), which reflects eye movement; and 3) the electromyogram (EMG), which reflects teeth clenching. Each potential has been individually used in assistive interfaces to provide persons with limb motor disabilities, or even complete quadriplegia, an alternative communication channel.
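A multimodal interface of this kind can be sketched as a priority-based decoder over the three potentials. The signal encodings, thresholds, and command names below are assumptions made for illustration only, not the mapping used by GOM-Face:

```python
def decode_face_signals(gkp, eog, emg, th=0.5):
    """Hypothetical decoder over three facial potentials.

    gkp: glossokinetic potential (signed; tongue left/right)
    eog: electrooculogram (signed; horizontal gaze direction)
    emg: electromyogram envelope (non-negative; teeth clench)
    th:  common detection threshold (illustrative value)
    """
    if emg > th:
        return "select"  # a clench acts as a discrete trigger
    if abs(eog) > th:
        return "look_left" if eog < 0 else "look_right"
    if abs(gkp) > th:
        return "steer_left" if gkp < 0 else "steer_right"
    return "idle"

print(decode_face_signals(0.0, 0.0, 0.9))   # select
print(decode_face_signals(0.0, -0.8, 0.0))  # look_left
print(decode_face_signals(0.7, 0.0, 0.0))   # steer_right
```

Giving the EMG channel priority reflects a common design choice in assistive interfaces: a deliberate, high-amplitude action serves as an unambiguous trigger, while the continuous channels handle direction.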
