Robust Stereo Visual-Inertial Odometry Using Nonlinear Optimization.

Sensors (Basel)

School of Mechanical Engineering and Automation, Northeastern University, Shenyang 110819, China.

Published: August 2019

The fusion of visual and inertial odometry has matured greatly owing to the complementarity of the two sensor types. However, size and cost constraints rule out high-quality sensors and powerful processors in many applications, and robustness and computational efficiency remain challenging. In this work, we present VIO-Stereo, a stereo visual-inertial odometry (VIO) system that tightly fuses measurements from stereo cameras and an inexpensive inertial measurement unit (IMU) through nonlinear optimization. To reduce computational cost, we detect features with the efficient FAST detector and track them with the KLT sparse optical flow algorithm. We also incorporate the accelerometer bias into the measurement model and optimize it together with the other state variables. Additionally, we perform circular matching between the previous and current stereo image pairs to remove outliers from the stereo matching and feature tracking steps, reducing feature mismatches and improving the robustness and accuracy of the system. Finally, this work contributes an experimental comparison of monocular and stereo visual-inertial odometry by evaluating our method on the public EuRoC dataset. Experimental results demonstrate that our method is competitive with state-of-the-art techniques.
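The circular-matching step described above can be sketched as a loop-closure consistency check: a feature is tracked previous-left → previous-right → current-right → current-left → previous-left, and kept only if it returns near its starting pixel. The sketch below is a minimal, hypothetical illustration of that idea; the match dictionaries stand in for the real stereo-matching and KLT-tracking results, and the names and tolerance are assumptions, not the paper's implementation.

```python
def circular_match_ok(start, matches, tol=1.0):
    """Follow a feature around the four-image loop; accept it only if the
    loop closes within `tol` pixels of its starting position."""
    p = start
    for step in matches:          # each step maps a point into the next image
        p = step.get(p)
        if p is None:             # track lost somewhere along the loop
            return False
    return abs(p[0] - start[0]) <= tol and abs(p[1] - start[1]) <= tol

# Toy correspondences for two features: one closes the loop, one drifts.
pl_to_pr = {(10, 20): (6, 20),  (40, 50): (35, 50)}   # prev-left -> prev-right
pr_to_cr = {(6, 20): (7, 21),   (35, 50): (36, 52)}   # prev-right -> cur-right
cr_to_cl = {(7, 21): (11, 21),  (36, 52): (42, 52)}   # cur-right -> cur-left
cl_to_pl = {(11, 21): (10, 20), (42, 52): (44, 55)}   # cur-left -> prev-left
loop = [pl_to_pr, pr_to_cr, cr_to_cl, cl_to_pl]

inliers = [p for p in pl_to_pr if circular_match_ok(p, loop)]
print(inliers)  # feature (10, 20) survives; (40, 50) fails the loop check
```

In a real pipeline the four dictionaries would come from stereo matching and KLT tracking, and failed features would simply be dropped before the optimization step.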


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6749198
DOI: http://dx.doi.org/10.3390/s19173747

Publication Analysis

Top Keywords

visual-inertial odometry: 16
stereo visual-inertial: 12
nonlinear optimization: 8
odometry: 5
stereo: 5
robust stereo: 4
visual-inertial: 4
odometry nonlinear: 4
optimization fusion: 4
fusion visual: 4

Similar Publications

Event-Based Visual/Inertial Odometry for UAV Indoor Navigation.

Sensors (Basel)

December 2024

SOTI Aerospace, SOTI Inc., Mississauga, ON L5N 8L9, Canada.

Indoor navigation is becoming increasingly essential for multiple applications. It is complex and challenging due to dynamic scenes, limited space, and, more importantly, the unavailability of global navigation satellite system (GNSS) signals. Recently, new sensors have emerged, namely event cameras, which show great potential for indoor navigation due to their high dynamic range and low latency.


Simultaneous localization and mapping (SLAM) techniques can be used to navigate the visually impaired, but the development of robust SLAM solutions for crowded spaces is limited by the lack of realistic datasets. To address this, we introduce InCrowd-VI, a novel visual-inertial dataset specifically designed for human navigation in indoor pedestrian-rich environments. Recorded using Meta Aria Project glasses, it captures realistic scenarios without environmental control.


GPS/VIO integrated navigation system based on factor graph and fuzzy logic.

Sci Rep

December 2024

Department of Electrical Engineering, Iran University of Science and Technology, Tehran, 16846-13114, Iran.

In today's technologically advanced landscape, precision in navigation and positioning is of paramount importance across applications ranging from robotics to autonomous vehicles. A common predicament in location-based systems is the reliance on Global Positioning System (GPS) signals, which may exhibit diminished accuracy and reliability under certain conditions. Moreover, even when integrated with an Inertial Navigation System (INS), a GPS/INS system cannot provide a long-term solution during GPS outages because of accumulated INS errors.


Event-based cameras offer unique advantages over traditional cameras, such as high dynamic range, absence of motion blur, and microsecond-level latency. This paper introduces an approach to visual odometry that is, to our knowledge, the first to integrate the newly proposed Rotated Binary DART (RBD) descriptor within a Visual-Inertial Navigation System (VINS)-based event visual odometry framework. Our method leverages event optical flow and RBD for precise feature selection and matching, ensuring robust performance in dynamic environments.


Composite robots often encounter difficulties due to changes in illumination, external disturbances, reflective surface effects, and cumulative errors. These challenges significantly hinder their capabilities in environmental perception and the accuracy and reliability of pose estimation. To overcome these issues, we propose a nonlinear optimization approach and develop an integrated localization and navigation framework, IIVL-LM (IMU, Infrared, Vision, and LiDAR Fusion for Localization and Mapping).

