Unified Robot and Inertial Sensor Self-Calibration.

Robotica

Department of Mechanical Engineering, Vanderbilt University, Nashville, TN, USA.

Published: May 2023

Article Abstract

Robots and inertial measurement units (IMUs) are typically calibrated independently. IMUs are placed in purpose-built, expensive automated test rigs. Robot poses are typically measured using highly accurate (and thus expensive) tracking systems. In this paper, we present a quick, easy, and inexpensive new approach to calibrate both simultaneously, simply by attaching the IMU anywhere on the robot's end effector and moving the robot continuously through space. Our approach provides a fast and inexpensive alternative to both robot and IMU calibration, without any external measurement systems. We accomplish this using continuous-time batch estimation, providing statistically optimal solutions. Under Gaussian assumptions, we show that this becomes a nonlinear least squares problem and analyze the structure of the associated Jacobian. Our methods are validated both numerically and experimentally and compared to standard individual robot and IMU calibration methods.
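
The abstract casts joint robot-IMU calibration as a nonlinear least squares problem under Gaussian assumptions. As a rough illustration of that framing only (the paper itself uses continuous-time batch estimation over full end-effector trajectories), the toy sketch below jointly estimates a hypothetical single-joint encoder scale, an IMU lever arm, and IMU biases from simulated measurements; all parameters and values are illustrative assumptions, not the paper's model.

```python
# Illustrative sketch only: a toy 1-DoF version of joint robot/IMU calibration
# posed as nonlinear least squares. The paper's actual method uses continuous-time
# batch estimation; the parameterization below (encoder scale k, lever arm r,
# gyro/accel biases) is a hypothetical simplification for clarity.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Simulated ground truth: encoder scale k, IMU lever arm r (m), gyro/accel biases.
k_true, r_true, bg_true, ba_true = 1.02, 0.15, 0.01, -0.05

# Commanded joint rates (rad/s) swept over the trajectory.
omega_cmd = np.linspace(0.5, 4.0, 200)
omega_true = k_true * omega_cmd

# Simulated IMU measurements on the spinning end effector:
# gyro senses the joint rate, accelerometer senses centripetal acceleration.
gyro_meas = omega_true + bg_true + rng.normal(0, 0.005, omega_cmd.size)
accel_meas = omega_true**2 * r_true + ba_true + rng.normal(0, 0.02, omega_cmd.size)

def residuals(theta):
    """Stack gyro and accelerometer residuals for the joint estimate."""
    k, r, bg, ba = theta
    omega = k * omega_cmd
    res_gyro = gyro_meas - (omega + bg)
    res_accel = accel_meas - (omega**2 * r + ba)
    return np.concatenate([res_gyro, res_accel])

# Under Gaussian noise assumptions, the batch estimate reduces to nonlinear least squares.
sol = least_squares(residuals, x0=[1.0, 0.1, 0.0, 0.0])
print("estimated [k, r, bg, ba]:", sol.x)
```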

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10508886
DOI: http://dx.doi.org/10.1017/s0263574723000012

Publication Analysis

Top Keywords

robot imu (8); imu calibration (8); unified robot (4); robot inertial (4); inertial sensor (4); sensor self-calibration (4); self-calibration robots (4); robots inertial (4); inertial measurement (4); measurement units (4)

Similar Publications

Wearable motion capture gloves enable the precise analysis of hand and finger movements for a variety of uses, including robotic surgery, rehabilitation, and most commonly, virtual augmentation. However, many motion capture gloves restrict natural hand movement with a closed-palm design, including fabric over the palm and fingers. In order to alleviate slippage, improve comfort, reduce sizing issues, and eliminate movement restrictions, this paper presents a new low-cost data glove with an innovative open-palm and finger-free design.

WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch.

Front Robot AI

January 2025

Interactive Robotics Laboratory, School of Computing and Augmented Intelligence (SCAI), Arizona State University (ASU), Tempe, AZ, United States.

We present WearMoCap, an open-source library for tracking human pose from smartwatch sensor data and leveraging the pose predictions for ubiquitous robot control. WearMoCap operates in three modes: 1) a Watch Only mode, which uses only a smartwatch; 2) a novel Upper Arm mode, which utilizes a smartphone strapped to the upper arm; and 3) a Pocket mode, which determines body orientation from a smartphone in any pocket. We evaluate all modes on large-scale datasets consisting of recordings from up to 8 human subjects using a range of consumer-grade devices.

To improve the efficiency of mobile robot movement, this paper investigates fusing the A* algorithm with the Dynamic Window Approach (DWA) algorithm (IA-DWA) to quickly search for globally optimal collision-free paths and avoid unknown obstacles in real time. First, data from the odometer and the inertial measurement unit (IMU) are fused using an extended Kalman filter (EKF) to reduce the positioning error caused by wheel slippage and improve the mobile robot's positioning accuracy. Second, the prediction function, weight coefficients, search neighborhood, and path-smoothing processing of the A* algorithm are optimized to incorporate the critical-point information from the global path into the DWA calculation framework.
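
As a rough, hypothetical illustration of the odometry/IMU fusion step mentioned above (not the paper's implementation), the sketch below runs one predict-update cycle of a planar EKF whose state is the pose [x, y, theta]: wheel odometry drives the prediction and an IMU yaw reading corrects the heading.

```python
# A minimal sketch (not the paper's implementation) of an EKF that fuses wheel
# odometry with an IMU heading measurement for a planar mobile robot.
# State: [x, y, theta]; prediction uses odometry (v, w), correction uses IMU yaw.
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate the pose with a unicycle motion model driven by odometry."""
    theta = x[2]
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update_yaw(x, P, yaw_meas, R):
    """Correct the heading with an IMU yaw observation (h(x) = theta)."""
    H = np.array([[0.0, 0.0, 1.0]])
    innov = np.array([yaw_meas - x[2]])
    # Wrap the heading innovation to [-pi, pi].
    innov = (innov + np.pi) % (2 * np.pi) - np.pi
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + (K @ innov).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example step: slipping wheels over-report motion; the IMU yaw measurement
# keeps the heading estimate from drifting along with the odometry error.
x, P = np.zeros(3), np.eye(3) * 0.01
Q, R = np.diag([0.02, 0.02, 0.01]), np.array([[0.005]])
x, P = ekf_predict(x, P, v=0.5, w=0.1, dt=0.1, Q=Q)
x, P = ekf_update_yaw(x, P, yaw_meas=0.012, R=R)
print("fused pose:", x)
```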

Human hands have over 20 degrees of freedom, enabled by a complex system of bones, muscles, and joints. Hand differences can significantly impair dexterity and independence in daily activities. Accurate assessment of hand function, particularly digit movement, is vital for effective intervention and rehabilitation.

Initial Pose Estimation Method for Robust LiDAR-Inertial Calibration and Mapping.

Sensors (Basel)

December 2024

Department of Intelligent Systems & Robotics, Chungbuk National University, Cheongju 28644, Republic of Korea.

Handheld LiDAR scanners, which typically consist of a LiDAR sensor, Inertial Measurement Unit, and processor, enable data capture while moving, offering flexibility for various applications, including indoor and outdoor 3D mapping in fields such as architecture and civil engineering. Unlike fixed LiDAR systems, handheld devices allow data collection from different angles, but this mobility introduces challenges in data quality, particularly when initial calibration between sensors is not precise. Accurate LiDAR-IMU calibration, essential for mapping accuracy in Simultaneous Localization and Mapping applications, involves precise alignment of the sensors' extrinsic parameters.
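
For orientation, the LiDAR-IMU extrinsic being calibrated is a fixed rigid-body transform that maps LiDAR points into the IMU/body frame. The short sketch below only shows how such a transform is constructed and applied to points; the mounting values are hypothetical, and estimating them accurately is the calibration problem the article addresses.

```python
# A minimal sketch of what the LiDAR-IMU extrinsic represents: a fixed rigid-body
# transform T_imu_lidar that maps LiDAR points into the IMU/body frame. The values
# below are hypothetical initial guesses, not a calibration result.
import numpy as np

def make_transform(roll, pitch, yaw, t):
    """Build a 4x4 homogeneous transform from ZYX Euler angles (rad) and a translation."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    R = np.array([[cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
                  [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
                  [-sp,   cp*sr,            cp*cr]])
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Hypothetical initial extrinsic guess (LiDAR mounted 5 cm above the IMU, small tilt).
T_imu_lidar = make_transform(0.0, 0.02, 0.0, np.array([0.0, 0.0, 0.05]))

# Transform a batch of LiDAR points (N x 3) into the IMU frame.
points_lidar = np.array([[1.0, 0.2, 0.1], [2.5, -0.4, 0.3]])
points_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
points_imu = (T_imu_lidar @ points_h.T).T[:, :3]
print(points_imu)
```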
