4 results match your criteria: "Multimedia and Film Chung-Ang University[Affiliation]"

For sustainable operation and maintenance of urban railway infrastructure, intelligent visual inspection has attracted increasing attention as an alternative to unreliable manual inspection, which must be performed by humans at night while trains are not operating. Although various automatic approaches using image processing and computer vision techniques have been proposed, most focus only on railway tracks. In this paper, we present a novel railway inspection system that combines facility detection based on a deep convolutional neural network with a computer vision-based image comparison approach.
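The abstract does not detail the image-comparison step, so the following is only a minimal sketch of one plausible form of it: comparing a reference crop of a detected facility against a newly captured crop by thresholding per-pixel differences. The function name `facility_changed` and the threshold values are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def facility_changed(reference: np.ndarray, current: np.ndarray,
                     diff_thresh: float = 25.0, area_ratio: float = 0.05) -> bool:
    """Flag a facility crop as changed if enough pixels differ strongly."""
    ref = reference.astype(np.float32)
    cur = current.astype(np.float32)
    diff = np.abs(ref - cur)            # per-pixel intensity difference
    changed = diff > diff_thresh        # mask of strongly differing pixels
    return bool(changed.mean() > area_ratio)  # changed area exceeds ratio

# Toy usage: an unchanged crop is not flagged; one with a blanked-out
# region (simulating a missing component) is.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
same = ref.copy()
damaged = ref.copy()
damaged[16:48, 16:48] = 0
print(facility_changed(ref, same), facility_changed(ref, damaged))
```

In a real pipeline the two crops would first be aligned (the detection stage localizes the facility), since raw pixel differencing is sensitive to misregistration and lighting changes.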

View Article and Find Full Text PDF

Analysis of Design and Fabrication Parameters for Lensed Optical Fibers as Pertinent Probes for Sensing and Imaging.

Sensors (Basel)

November 2018

School of Electrical Engineering and Computer Science, Gwangju Institute of Science and Technology, 123 Cheomdangwagi-ro, Buk-gu, Gwangju 61005, Korea.

A method is presented for adjusting the working distance and spot size of a fiber probe while suppressing or enhancing back-coupling to the lead-in fiber. The probe, a lensed optical fiber (LOF), was fabricated by splicing a short piece of coreless silica fiber (CSF) onto a single-mode fiber and forming a lens at the end of the CSF. The optical properties of the LOF were adjusted by controlling the CSF length and the radius of curvature of the lens.
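How the CSF length and lens curvature set the working distance and spot size can be illustrated with a standard Gaussian-beam (ABCD matrix) calculation. This is a generic sketch with assumed parameter values (wavelength, mode-field radius, refractive index, geometry), not the paper's actual design numbers or analysis:

```python
import numpy as np

# Assumed, illustrative parameters for a lensed optical fiber (LOF)
wl = 1.55e-6        # vacuum wavelength [m]
n_silica = 1.444    # refractive index of the coreless silica fiber (CSF)
w0 = 5.2e-6         # mode-field radius of the single-mode fiber [m]
L_csf = 300e-6      # CSF length [m]
R_lens = 60e-6      # magnitude of the lens radius of curvature [m]

# Complex beam parameter at the SMF/CSF splice (a beam waist): q = i * zR
zR = np.pi * n_silica * w0**2 / wl
q = 1j * zR

# 1) Free propagation through the CSF
q = q + L_csf

# 2) Refraction at the curved silica/air tip. In the standard sign
#    convention the center of curvature lies on the incoming (fiber) side,
#    so R is negative. ABCD: A=1, B=0, C=(n1-n2)/(n2*R), D=n1/n2.
n1, n2 = n_silica, 1.0
R = -R_lens
C, D = (n1 - n2) / (n2 * R), n1 / n2
q = q / (C * q + D)

# Working distance = distance from the tip to the external beam waist;
# spot size = waist radius there, from the imaginary part of q.
wd = -q.real
w_spot = np.sqrt(wl * q.imag / (np.pi * n2))
print(f"working distance ~ {wd*1e6:.0f} um, spot radius ~ {w_spot*1e6:.2f} um")
```

Sweeping `L_csf` and `R_lens` in this model reproduces the qualitative trade-off the abstract describes: both parameters jointly determine where the beam focuses and how tightly.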


Mapping the environment of a vehicle and localizing the vehicle within that unknown environment are complex problems. Although many approaches based on various types of sensory inputs and computational concepts have been successfully applied to ground-robot localization, localizing an unmanned aerial vehicle (UAV) remains difficult due to variations in altitude and motion dynamics. This paper proposes a robust and efficient indoor mapping and localization solution for a UAV equipped with low-cost Light Detection and Ranging (LiDAR) and Inertial Measurement Unit (IMU) sensors.
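As a toy illustration of the mapping side only, a 2D LiDAR scan taken at a known pose can be rasterized into an occupancy grid. The function below and its parameters are hypothetical and greatly simplified relative to any full mapping-and-localization pipeline, where the pose itself must be estimated from the LiDAR and IMU data:

```python
import numpy as np

def update_grid(grid, pose, ranges, angles, resolution=0.5):
    """Mark LiDAR scan endpoints as occupied cells, given a known 2D pose.

    pose = (x, y, yaw) in meters/radians; resolution = cell size in meters.
    """
    x, y, yaw = pose
    for r, a in zip(ranges, angles):
        if not np.isfinite(r):
            continue  # skip invalid (no-return) beams
        ex = x + r * np.cos(yaw + a)   # beam endpoint, world coordinates
        ey = y + r * np.sin(yaw + a)
        i, j = int(ey / resolution), int(ex / resolution)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] = 1             # mark the hit cell occupied
    return grid

# Usage: two 1 m returns, straight ahead and 90 degrees left, from (2.5, 2.5)
grid = np.zeros((10, 10), dtype=np.uint8)
update_grid(grid, pose=(2.5, 2.5, 0.0),
            ranges=[1.0, 1.0], angles=[0.0, np.pi / 2])
```

A practical system would also trace free space along each beam and fuse repeated scans probabilistically rather than writing hard 0/1 values.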


In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors.
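The abstract mentions a sensor-fusion algorithm over magnetic, angular-rate, and gravity (MARG) readings but does not specify it; as a stand-in illustration, the sketch below shows a textbook one-axis complementary filter, which blends short-term gyro integration with a long-term gravity-derived tilt. The function name and constants are assumptions for illustration only:

```python
def complementary_step(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step: trust the gyro short-term, the accelerometer long-term."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: the hand is held still at a true tilt of 0.1 rad. The gyro reads
# ~0 rad/s, the accelerometer (gravity) reads the true tilt, and the
# estimate converges toward 0.1 rad over repeated steps.
angle = 0.0
for _ in range(500):
    angle = complementary_step(angle, gyro_rate=0.0, accel_angle=0.1, dt=0.01)
```

The magnetometer plays the analogous long-term role for heading, where gravity provides no information.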
